Re: Why would AI want to be friendly?

From: Zero Powers (zero_powers@hotmail.com)
Date: Fri Sep 08 2000 - 00:53:37 MDT


>From: "Jon Reeves" <jon@electronsky.co.uk>

>I've been reading this thread with interest (sorry - lurking again), and I
>think the question that is more to the point is "Why would AI want to be
>_unfriendly_?"
>
>The extermination (or enslavement) of several billion people would surely
>require an expenditure of a considerable amount of time and energy - even
>for an SI. What motivation could it have for doing so?

My concern is not that AI would want to exterminate or enslave us, but
that it would soon grow bored with us and decide to pursue its own
interests in the world rather than cater to our petty and inane needs
and desires.

I don't think that intelligence and emotions necessarily go hand in
hand. A supremely intelligent being (without any emotional baggage)
would have no qualms about satisfying its intellectual curiosity
without regard to such concerns as pity, guilt or empathy. So if, say,
the AI thought it would be a fascinating experiment to build a Dyson
sphere around the sun to capture *all* of its energy to power itself,
it might well proceed to do so without any concern for the fact that
we puny humans would not fare very well without sunlight.

I don't imagine that AI would have much to gain by intentionally
hurting us dumb animals, but I think we may have just as much to fear
from an AI that, rather than actively seeking to harm us, merely
disregarded us as irrelevant and inconsequential idiots.

-Zero




