Re: Why would AI want to be friendly?

From: Scott Badger (w_scott_badger@yahoo.com)
Date: Sat Sep 09 2000 - 13:05:14 MDT


>From: "Jon Reeves" <jon@electronsky.co.uk>

> I've been reading this thread with interest (sorry
> - lurking again), and I think the question that is
> more to the point is "Why would AI want to be
> _unfriendly_?"
>
> The extermination (or enslavement) of several
> billion people would surely require an expenditure
> of a considerable amount of time and energy - even
> for an SI. What motivation could it have for doing
> so?

Actually, I doubt it would take that much time or
energy. Couldn't an SI simply create and release a
deadly airborne human-specific virus with relative
ease? Or it could build self-replicating nanobots
designed to identify and disassemble Homo sapiens.

This assumes hostility, though, and I tend to agree
with those who worry more about humans being wiped out
through casual disregard than through malevolent intent.

--Scott



