Re: Why would AI want to be friendly?

From: Brent Allsop (allsop@fc.hp.com)
Date: Tue Sep 05 2000 - 09:47:25 MDT


xgl <xli03@emory.edu> responded:

> i see no reason that an SI (the kind that eliezer envisions,
> anyway) would experience anything remotely as anthropomorphic as
> gratefulness. we are talking about an engineered transcendent mind,
> not a product of millions of years of evolution -- no parents, no
> siblings, no competition, no breast-feeding.

        There is the kicker - "no competition"! In the past we had to
compete to survive. That is the law when there is no other, more
intelligent way to progress. But once anyone or anything achieves the
intelligence required to progress more intentionally than via
"survival of the fittest," all the rules change drastically. No longer
are we competing; now we are communicating and sharing. If anyone
anywhere grows, learns, and so on, it is better for us
all.

        We must be concerned for lower life forms, for if we do
nothing but destroy and consume them because we are superior, then we
must expect the same fate as soon as we run into any ET that is
superior to ourselves. What a lonely, hideous, and non-diverse place
this universe would be, with only the single most advanced being left,
having destroyed and consumed everything else.

John <Pvthur@aol.com> replied:

> The road from human equivalence to Power is not instantaneous. Any
> initial post-human AI will be an information-hound. We are full of
> information. It'll upload us like so many glazed donuts. We will
> become it. It will be us.

        Yes, right on, John! Notice the difference here between
"consume and destroy" and "eat up via uploading."

                Brent Allsop



This archive was generated by hypermail 2b29 : Mon Oct 02 2000 - 17:37:12 MDT