Re: Singularity: Just Say No!

J. R. Molloy (jr@shasta.com)
Tue, 26 Jan 1999 15:20:37 -0800

Billy Brown wrote,
>IMO, an 'instant singularity' precipitated by the sudden appearance of an
>SI is not very likely. However, the factors that determine whether it will
>happen are not under human control. It depends on the answers to a number
>of questions about natural law (like: How hard is it to increase human
>intelligence?). If the answers turn out to be the wrong ones, the first AI
>to pass a certain minimal intelligence threshold rapidly becomes an SI. If
>they don't, we have nothing to fear. The only thing we can do that makes
>much difference is to make sure our seed AIs are sane, in case one of them
>actually works.

A sane seed AI presents more of a threat to humanity than an insane AI does, because a sane AI would function at extreme variance with the insane human cultures that prevail on Earth. No joke.

Cheers,

J. R.