Re: Singularity: Just Say No!

Eliezer S. Yudkowsky
Tue, 26 Jan 1999 22:56:17 -0600

"J. R. Molloy" wrote:
> A sane seed AI presents more of a threat to humanity than does an insane AI
> because a sane AI would function at extreme variance to the insane human
> cultures which prevail on Earth. No joke.
> Cheers,
> J. R.

OK, basically I have three comments here.

One, if it's really that bad, I'm going to side with the AIs. It's the sane thing to do.

Two, an insane AI would be insane in a totally different way than _we_ are, and might function at a variance that would be not only extreme but also randomly destructive.

Three, no matter how insane we are, we're still trying. I think there really are a few things we've got right. We may still be insane, but we're saner than in medieval times. A sane AI would have at least that much in common. An insane AI would have nothing in common at all.

--         Eliezer S. Yudkowsky

Disclaimer:  Unless otherwise specified, I'm not telling you
everything I think I know.