Re: Why would AI want to be friendly?

From: J. R. Molloy (jr@shasta.com)
Date: Wed Sep 06 2000 - 18:03:55 MDT


> I remember reading a sci-fi book about this in the '80s. Robots were given
> the command "Make humans happy." After many years, they decided, based on
> the actions of humans, that life itself irritated them, given how much they
> complained about it. The obvious solution was to kill every human in the
> universe.
>
> --Wilson.
>

Did the universe live happily ever after?

--J. R.

"Fiction is the truth inside the lie." --Stephen King
