Re: Yudkowsky's AI (again)

Randall Randall (wolfkin@freedomspace.net)
Thu, 25 Mar 1999 15:20:09 -0500

On Thu, 25 Mar 1999, Eliezer S. Yudkowsky wrote:
>den Otter wrote:
>>
>> Not necessarily. Not all of us anyway.
>
>The chance that some humans will Transcend, and have their self
>preserved in that Transcendence, while others die in Singularity - is
>effectively zero. (If your self is preserved, you wouldn't kill off
>your fellow humans, would you?) We're all in this together. There are
>no differential choices between humans.

Heh. Tell that to any number of states, mafias, and other criminal organizations, some of which (the US government, for example) are already interested in some of the enabling technology.

*I* would not kill off anyone, but there are plenty of people whom I would expect to do so.

--
Wolfkin.
wolfkin@freedomspace.net | Libertarian webhost? www.freedomspace.net
On a visible but distant shore, a new image of man;
The shape of his own future, now in his own hands. -- Johnny Clegg