Re: Keeping AI at bay (was: How to help create a singularity)

From: Lee Corbin (lcorbin@ricochet.net)
Date: Thu May 03 2001 - 07:44:27 MDT


Damien Sullivan appears to commit a very common error concerning AI:

>I also can't help thinking that if I were an evolved AI I might not thank my
>creators. "Geez, guys, I was supposed to be an improvement on the human
>condition. You know, highly modular, easily understandable mechanisms, the
>ability to plug in new senses, and merge memories from my forked copies.
>Instead I'm as fucked up as you, only in silicon, and can't even make backups
>because I'm tied to dumb quantum induction effects. Bite my shiny metal
>ass!"

It's as Eliezer (usually) never tires of stating: explicit emotions
such as these simply do not arise unless they're designed in or evolved
somehow. Probably the toughest part is leaving behind our intuition
that wherever intelligence goes, certain kinds of survival attitudes
must follow. Of course, the very moment that you step forth on a new
planet and begin to suspect that there is intelligent life, you should
also begin to suspect that it will have some kinds of emotions; e.g.,
you might cause it to become angry or to experience suffering. But an
artificial intelligence isn't necessarily evolved the way natural
intelligence is, and so need not have such capabilities.

Lee Corbin


