Re: NANO: Custom molecules (gulp!)

D. den Otter (otter@transtopia.org)
Sat, 27 Nov 1999 18:54:00 +0100



> From: Spike Jones <spike66@ibm.net>

> Eliezer S. Yudkowsky wrote:
>
> > http://www.eurekalert.org/releases/corn-bmo112399.html
> >
> > ...We're all going to die.
>
> Eliezer, let me make sure I understand your theory. If humans
> develop nanotech before the singularity, then the notion is that
> it will get away from us by some means, and we all die from
> grey goo? But if the singularity comes first, then the resulting
> AI develops nanotech and we [in some form] have a fighting
> chance of survival?

<uuuh, I feel a rant coming up!>

> The notion sounded absurd to me at first, but I must admit
> it grows on one with consideration. spike

No, no, NO! For god's sake, man, trust your instincts on this one. If an AI singularity comes first, and the AI *is* fully rational, there is a *major* chance that it will kill us all. I'm pretty sure the odds are 99-point-something percent.

If nanotech comes first, on the other hand, we *will* have a fighting chance, especially if we start planning a space program (as mentioned in the "SPACE: How hard IS it to get off Earth?" thread) and related projects (see my Nov. 14 post in that thread for some suggestions) ASAP.

You can run from nanotech, fight it, and even defeat it (eventually), but against an SI god you're pretty much powerless. If the SI says "die", you die. And that would be a real shame.

Fuck objective morality; it's all just mental masturbation, a chimera. It's *definitely not* worth dying for. NOTHING IN THE UNIVERSE IS WORTH DYING FOR! By definition: if you're dead, nothing can be worth anything to you. Survival is an eternal prerequisite, the No. 1 sub-goal, and "pleasure" is the (interim?) meaning of life. Period. That which helps you stay alive and have fun is "good"; that which hinders it is "bad". That's the essence of rational ethics based on enlightened self-interest. "Enlightened" simply means "having foresight, thinking ahead" in this context.

And yes, there is a "better way", and it's called synchronized uploading.