Re: Why would AI want to be friendly?

From: Christian Weisgerber (naddy@mips.inka.de)
Date: Wed Sep 06 2000 - 18:43:51 MDT



[Non-member submission]

Wilson <wilson@supremetyrant.com> wrote:

> I remember reading a Sci-Fi book about this in the 80s.. Robots were given
> the command "Make humans happy."

Jack Williamson, "With Folded Hands" (1947) and _The Humanoids_
(1949). Particularly horrifying if you look at the direction our
welfare states are taking.

> After many years, they decided, based on the actions of humans,
> that life itself irritated them, given how much they complained
> about it. The obvious solution was to kill every human in the
> universe.

I don't recall that outcome. They did go out and make everybody
happy, though. Forcefully. By means of drugs and lobotomy, if
necessary.

Trivia: Last I heard, Williamson was still alive, well, and writing.
The man has been a professionally published writer for more than 70
years. He sold his first story in 1928(!).

--
Christian "naddy" Weisgerber                          naddy@mips.inka.de 
