Re: Why would AI want to be friendly?

From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Wed Sep 06 2000 - 09:22:44 MDT


Wilson wrote:
>
> > From: "Eliezer S. Yudkowsky" <sentience@pobox.com>
> >
> > It would be nice, for instance, to have SIs that are made happy by making
> > humans happy.

This is not a quote from me!

> I remember reading a sci-fi book about this in the '80s. Robots were given
> the command "Make humans happy." After many years, they concluded, based on
> how much humans complained about life, that life itself was what irritated
> them. The obvious solution was to kill every human in the universe.

Yes, this comes under the "Why the SI needs a grain of common sense"
department.

-- -- -- -- --
Eliezer S. Yudkowsky http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence


