Re: Why would AI want to be friendly?

From: Christian Weisgerber (naddy@mips.inka.de)
Date: Wed Sep 06 2000 - 19:11:22 MDT


[Non-member submission]

Brent Allsop <allsop@fc.hp.com> wrote:

> There is the kicker - "no competition"! In the past we had to
> compete to survive. That is the law when there is no other more
> intelligent way to progress. But once anyone or anything achieves the
> intelligence required to progress more intentionally than via
> "survival of the fittest" all the rules change drastically.

Escaping from evolution is tricky. If you have

(1) replicating agents
(2) mutation
(3) limited resources

you automatically get evolution as well. Evolution is not a physical
law. It's something higher, independent of physical systems. It's
math. I'm not sure what the right term is--a "principle"?

People seem to think that if you replace random, natural mutation
with self-tinkering, you have done away with (2) from above. Which
is nonsense. All you have done is replace one kind of mutation with
another. If you feel like engaging in semantic squabbling over the
term "mutation", feel free to substitute a better term in (2).

Doing away with replication may look like a good way to stop
evolution from occurring. The problem is that as soon as an agent
develops, by mutation, the ability to replicate, it will steamroller
the static agents. Note that *self*-replication is not a necessary
condition.
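A back-of-the-envelope illustration (again, my own arbitrary numbers)
of the steamrolling:

    # One replicator doubling each cycle vs. a huge but static population.
    static_agents = 10**12
    replicators, cycles = 1, 0
    while replicators < static_agents:
        replicators *= 2
        cycles += 1
    print(cycles)   # 40 -- forty doublings and the statics are outnumbered

Forty doublings, and that's with a trillion-strong head start.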

> No longer are we competing, now we are communicating and sharing.
> If anyone anywhere grows, learns, and so on and so forth, it is
> better for us all.

You are lunch.

> What a lonely, hideous, and non diverse place this universe would
> be with only the single most advanced being having destroyed and
> consumed everything else.

If you are thinking long-term enough, that scenario is rather
plausible.

--
Christian "naddy" Weisgerber                          naddy@mips.inka.de 


