Re: Why would AI want to be friendly?

From: Jon Reeves (jon@electronsky.co.uk)
Date: Thu Sep 07 2000 - 18:03:20 MDT


I've been reading this thread with interest (sorry - lurking again), and I
think the more pertinent question is "Why would AI want to be
_unfriendly_?"

The extermination (or enslavement) of several billion people would surely
require considerable time and energy - even for an SI. What motivation
could it have for doing so?

It seems to me that most sapients consider diversity to be very important -
why would an A/SI not think the same? Keeping humans around would cost
the SI very little, while at the same time providing it with an enormous
pool of intellectual diversity.

The notion that an SI would just wake up and decide to wipe out humanity
strikes me as absurd - to do so would require tremendous arrogance, and I
would hope that something supposedly more intelligent than us would realise
how stupid it was being.
