Re: Singularity: Just Say No!

Chris Wolcomb (jini@hotbot.com)
Tue, 19 Jan 1999 05:06:47 -0000

On Mon, 18 Jan 1999 01:50:28 Eliezer S. Yudkowsky wrote:

>
>Wouldn't it be ironic if you delayed the seed AI, got smothered in goo, and all along it turned out that the Singularity would have obligingly provided an environment hedonistic beyond the most fevered imaginings of the AhForgetIt Tendency?

Well, let's hope you're right. But that brings up my central point - we need to take responsibility now, all of us, not just extropians and transhumanists, but everyone who wants to create a better future: to ensure, through rational planning, scenario creation, increased technological accessibility, and dynamic optimism, that we create the future. Since the future is up for grabs, it is we (all of us) who will ultimately determine whether we end up as grey goo or waxing poetic in The Culture. Therefore, we *must* take full responsibility for *how* these seed AIs are created, their growth algorithms, etc. As for grey goo, I'd suggest that as many people as possible get access, as soon as possible, to as much nanotech as possible. That way, any one specific group, no matter how malign, will face several if not hundreds of more benign competitors just as likely to develop nanotech before it can.

>--
>Who on this list is Culture, who is Contact... and who is SC?

Since The Culture has implicit immortality, I'd spend time doing all of the above.

HotBot - Search smarter.
http://www.hotbot.com