Re: Why would AI want to be friendly?

From: CYMM (cymm@trinidad.net)
Date: Sun Oct 01 2000 - 19:27:47 MDT


BARBARA LAMAR SAID: ".... What difference would cooperative coevolution make
with respect to the relationship between humans and highly evolved AI?..."

CYMM SAYS: Firstly, when we say "cooperative" with respect to evolution - it's
only in retrospect. It's not like a game-theoretic scenario, where intent
and the perception of intent occur in addition to resource considerations.

The kind of cooperation that would interest humans would involve political &
game-theoretic analysis of the situation - and wouldn't be darwinian....
darwinian selection is a very simple form of adaptive computation.
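
To make that distinction concrete, here's a toy sketch (python; the
strategies and payoffs are just the textbook iterated prisoner's dilemma,
invented numbers, nothing to do with any real AI). The cooperation here
comes from each player modelling the other's intent - which blind selection
never does:

# Toy iterated prisoner's dilemma: cooperation emerges from each
# player modelling the other's moves, not from selection pressure.
# Payoffs and strategies are illustrative only.

PAYOFF = {  # (my move, their move) -> my score
    ("C", "C"): 3, ("C", "D"): 0,
    ("D", "C"): 5, ("D", "D"): 1,
}

def tit_for_tat(history):
    """Cooperate first, then mirror the opponent's last move."""
    return "C" if not history else history[-1]

def always_defect(history):
    return "D"

def play(strategy_a, strategy_b, rounds=10):
    score_a = score_b = 0
    hist_a, hist_b = [], []   # each side's record of the *other's* moves
    for _ in range(rounds):
        move_a = strategy_a(hist_a)
        move_b = strategy_b(hist_b)
        score_a += PAYOFF[(move_a, move_b)]
        score_b += PAYOFF[(move_b, move_a)]
        hist_a.append(move_b)
        hist_b.append(move_a)
    return score_a, score_b

print(play(tit_for_tat, tit_for_tat))     # mutual cooperation: (30, 30)
print(play(tit_for_tat, always_defect))   # exploitation capped: (9, 14)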

Secondly, as I said before, if you're talking real biological (...or
neobiological...) symbiosis - humans and the AI are not running on the same
clocks. Humans and bacteria or viruses are. The clock here is based on DNA.
Irrespective of the different generation times of humans and bacteria, there
is a degree of temporal coherence between the two species' genetic
adaptation to the environment - there is enough coupling to facilitate
organic coevolution.

But suppose the machine evolves ten billion times faster - suppose the
mutation rate is ten billion times that of a human - then the boundary
conditions might preclude the darwinian selection that allows for
concomitant coupling of the two species' evolution.
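
Here's a toy sketch of that timescale argument (python; the rates and the
random-walk model are invented, purely illustrative). Species B coevolves
against species A by tracking A's trait; crank up A's "mutation clock" a
thousandfold and B's tracking error grows by the same factor - B is
maladapted at every instant, and the coupling coevolution needs is gone:

# Toy model of temporal coherence: species B tracks species A's
# trait under selection. All rates are invented for illustration.
import random

def mean_tracking_error(rate_a, rate_b=0.01, steps=5000):
    trait_a, trait_b, err = 0.0, 0.0, 0.0
    for _ in range(steps):
        trait_a += random.gauss(0, rate_a)        # A's evolutionary random walk
        trait_b += rate_b * (trait_a - trait_b)   # selection pulls B toward A
        err += abs(trait_a - trait_b)
    return err / steps

random.seed(0)
print("comparable clocks:", mean_tracking_error(rate_a=0.01))
print("A 1000x faster   :", mean_tracking_error(rate_a=10.0))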

Look at a scenario from a 'sixties (...I think...) SF story.

You make the supermachine. You plug it in. The supermachine disappears
within milliseconds of being activated. God alone knows where or what it has
become (...really!).

What could have happened is that the supermachine discovered a more general
quantum mechanics where the Born rule doesn't exactly hold -
conservation of mass (...and strict causality...) is violated; the machine
reaches back to the quantum fluctuations at the beginning of time - and
subtly remakes the physical universe.

Look at another scenario... if the machine/universe system that has evolved
is sufficiently consistent with the human/universe system, then we could
transform from one to the other in a sort of expanded Principle of
Relativity.

If the transform is close to affine - I'm sure that we could recognize the
machine as some sort of object.
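
(Concretely - and this is just my gloss on "affine" - such a transform is
T(x) = Ax + b, with A an invertible matrix. It maps lines to lines and
preserves ratios of distances along them, so connectedness and adjacency
survive the mapping; whatever "features" bound the machine would land on
features our perception could still stitch together into an object.)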

If the transform is weird - not particularly well-behaved... then the
physical manifestations of such an object might not easily be perceived by
humans... in the sense that we may not be able to identify sufficient machine
"features" to mentally consolidate them into an object in our own human
minds. The implicit & explicit semiotics of our perception and our cognition
might not allow us to perceive such a "highly evolved" object.

In fact there may be tons of such entities all around us... indeed, we may
find little evidence of "intelligence" in the universe because WE ARE NOT
INTELLIGENT!

Lastly - in organic evolution, species only coevolve if they interact with
respect to common environmental resources. If Eliezer's machines evolve very
rapidly, then it is likely that within a short time - (...say...)
milliseconds to half a decade - they would not compete substantially with
the human species for resources.

This would be an adaptive radiation scenario... it is within the realm of
probability... because the machines might discover a new physical environment
into which they could radiate. At the very best we won't notice anything.
The manifestations of our hyperAI in our perceived physical world would be
benign - and human civilization would proceed along the "star trek" line
(i.e., linearly and conservatively...).
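
The textbook toy for that decoupling is Lotka-Volterra competition (a
sketch below; all parameters invented). The only thing tying the two
populations together is the competition coefficient on shared resources -
set it to zero, as in radiation into a disjoint niche, and each species'
dynamics proceed as if the other did not exist:

# Toy Lotka-Volterra competition. The coupling between the species
# is the competition coefficient on shared resources; zero it out
# (radiation into a new environment) and the populations decouple.
# Parameters are invented for illustration.

def step(n1, n2, a12, a21, r=0.1, k=1000.0, dt=1.0):
    """One Euler step of two-species logistic competition."""
    dn1 = r * n1 * (1 - (n1 + a12 * n2) / k)
    dn2 = r * n2 * (1 - (n2 + a21 * n1) / k)
    return n1 + dt * dn1, n2 + dt * dn2

def run(a12, a21, steps=500):
    n1, n2 = 10.0, 10.0
    for _ in range(steps):
        n1, n2 = step(n1, n2, a12, a21)
    return round(n1), round(n2)

print("competing for the same resources:", run(a12=0.8, a21=0.8))
print("radiated into disjoint niches   :", run(a12=0.0, a21=0.0))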

A slightly worse scenario is that we might start to notice subtle but
damning changes in our physical laws - and progress in physics and
engineering would seem to be directed away from producing more AIs. Guess why?

A little worse again: we'd have a very brief and extremely
destructive (...to us...) "Terminator" sort of conflict - one lasting
only long enough for the machines to access resources that we can't - say,
mass-energy conservation violation or similar weirdness.

Any way you cut it... the future is not good for the all-consuming human
ego... but this is predictable... primates tend to have big egos - and they
also tend to be control freaks. You walk around with a handful of big
balloons and someone will come along to burst 'em.


