Re: NANO: Custom molecules (gulp!)

D.den Otter (neosapient@geocities.com)
Sun, 28 Nov 1999 17:58:01 +0100



> From: Eliezer S. Yudkowsky <sentience@pobox.com>
> "D.den Otter" wrote:
> >
> > No, no, NO!
>
> Join us, Spike... don't be afraid...

Siren song...

> > For god's sake man, trust your instincts
> > on this one. If an AI singularity comes first, and the
> > AI *is* fully rational, there is a *major* chance that it
> > will kill us all. I'm pretty sure it's 99.something %
>
> Let's apply the Bayesian Probability Theorem. If, speaking about an
> event on which there is great controversy and no access to immediate
> evidence, Human A says vis theory is 70% likely to be right and Human B
> says vis theory is 99% likely to be right, which one is more likely to
> be totally wrong?

My estimate is based on the very reasonable assumption that an SI wouldn't need anyone else (as the reader may recall, we've discussed this before and Eliezer was in full agreement back then), and wouldn't be bound by redundant evolutionary adaptations such as altruism. Add to that the fact that humans, if allowed to continue freely with their development after the SI has ascended, would most likely create and/or become superintelligences (i.e., competition for resources and a potential threat), and you have a pretty strong argument for extinction. Now, where does that 70% figure come from??

> Trust me: I don't think I'm infallible.

But nonetheless you are prepared to act as if you *were* infallible... The moment you activate your ASI Golem, you and you alone will have passed judgement on the world, using your finite wisdom.

> > If nanotech comes first, on the other hand, we *will*
> > have a fighting chance, certainly if we start planning
> > a space program (as mentioned in the "SPACE:
> > How hard IS it to get off Earth?" thread)
> > and related projects (see my Nov. 14 post in the
> > above thread for some suggestions) ASAP.
>
> Maybe. But if humanity survives nanotech, sooner or later it'll still
> come face-to-face with a greater-than-human intelligence. As you admit.

Of course. But so what? The primary aim of the ascension initiative isn't to save "humanity", but to save *oneself*. And don't even try to get sanctimonious on me, Eliezer, as saving humanity isn't your primary concern either. Let's forget about saving the world for a minute, ok? We're talking about naked survival here (which may not matter to you, but it certainly matters to me and a whole lot of others).

> > [Wacky rant on "why subjective morality is the objective truth" deleted.]

Rant, yes; wacky, my ass! Morality is always subjective, because it only exists in the minds of subjective creatures. As there is no objective creature, there can be no objective morality. Objective truth (reality), on the other hand, is quite real, but not very relevant unless placed in the context of some subjective creature's goal system.

Your obsession with "the Objective" is, IMHO, essentially religious in nature and has little to do with common sense. The very fact that you refuse to give survival its rightful (top) place indicates that there is a serious flaw in your logic department. Usually I don't bother to argue with the religious, but since you are a potentially ASI-spawning genius *and* apparently have a considerable influence on many list members, I don't really have a choice.

> > And yes, there is a "better way", and it's called
> > synchronized uploading.
>
> I wouldn't trust den Otter on this one, Spike.

Of course not. Trust no one, Spike, and certainly not Eliezer, who values the Singularity more than yours, mine, or anyone else's existence. Or maybe you *can* "trust" us, as we're both pretty honest about what could happen. Both our motivations (and those of everyone else) are essentially selfish, of course; the only real difference is that Eliezer is selfishly trying to accomplish something which probably isn't in his enlightened, rational ("objective") self-interest.

> Long before he started
> talking about "synchronized uploading" he believed that only one Power
> could survive, and he plans to be it.

A single Power is (obviously) the only stable, and therefore "ideal", situation from that one Power's perspective. Yes, of course I'd like to be the one, but... if the choice is (near-certain) death or shared godhood, I'm inclined to choose the latter.

> He'll take your money, and while
> you're waiting for your journey into the promised land,

Hey, you're the Pied Piper here, dude! The ASI-Singularity is the abyss towards which the lemmings are dancing, hypnotized by the merry tunes of World Peace, Universal Love and Unlimited Wisdom. They're in for one hell of a rude, albeit mercifully short, awakening...

> he'll jump into
> the "experimental prototype hardware" and leave you suckers to burn.

Oh, now I'm flattered. Apparently you think that I could trick everyone directly involved in the project (which could easily be a hundred people or more) and do a "perfect" ascension on the sly with some experimental prototype which, foolishly, was left completely unguarded. Somehow I don't think that the others would be SUCH morons (and if they were, they'd *deserve* to get cheated). *Of course* no one could trust anyone else *completely*. For example, would you leave anyone on this list alone in a room with a fully functional uploader for even 5 minutes? I sure wouldn't. People betray each other for the most petty reasons, and instant godhood would be a truly unprecedented temptation. Consequently, the security protocols would have to be "unprecedented" as well. Duh!

Now Eliezer seems to think, or should I say wants *you* to think (remember, this guy's got his own agenda), that synchronized uploading would only make sense if a bunch of noble, selfless saints did it. This is clearly not true. In the REAL world, it is perfectly feasible to have fruitful cooperation without 100% mutual trust. It is the very basis of our society, and indeed of nature itself.

The synchronized uploading scenario is a classical prisoner's dilemma. Of course we can expect some attempts at defection, and we should take the appropriate precautions. No other scenario gives you a better fighting chance than this one. Throwing yourself at the mercy of some artificial god is fit for simple religious sheep, not for proud, individualistic transhumanists. Your technophilia has blinded you all.
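For the sake of concreteness, here's a toy sketch of the point (in Python, with payoff numbers and a detection probability that are pure illustration, not anything anyone has actually worked out): two would-be uploaders each choose to cooperate (synchronized upload) or defect (sneak an ascension alone), and sufficiently harsh, sufficiently reliable security protocols are enough to make defection a losing move even without any trust at all.

    # Toy payoff matrix for two would-be uploaders. All numbers are
    # made-up stand-ins, purely for illustration.
    PAYOFFS = {
        ("cooperate", "cooperate"): (10, 10),     # shared godhood
        ("cooperate", "defect"):    (-100, 20),   # the cooperator is left behind
        ("defect",    "cooperate"): (20, -100),
        ("defect",    "defect"):    (-50, -50),   # a race, probably mutual ruin
    }

    DETECTION_PENALTY = 200  # assumed cost of being caught by the protocols
    P_CAUGHT = 0.9           # assumed chance an attempt gets caught

    def expected_payoff(my_move, their_move):
        """Expected payoff for 'me', given both moves and the protocols."""
        mine, _ = PAYOFFS[(my_move, their_move)]
        if my_move == "defect":
            mine -= P_CAUGHT * DETECTION_PENALTY
        return mine

    for theirs in ("cooperate", "defect"):
        for mine in ("cooperate", "defect"):
            print(f"they {theirs:9s} / I {mine:9s}: {expected_payoff(mine, theirs):7.1f}")

With these (arbitrary) numbers, cooperating beats defecting whatever the other party does; the precautions, not saintliness, do the work. That's the whole argument: you don't need to trust your fellow uploaders, you need to watch them.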