Re: The Dazzle Effect: Staring into the Singularity

From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Thu Aug 16 2001 - 22:00:02 MDT


Lee Corbin wrote:
>
> Eliezer wrote
>
> > If the Moon were made of computronium instead of green cheese I doubt
> > it would take so much as a week for me *or* Eugene Leitl to wake it up,
> > if it didn't wake up before then due to self-organization of any noise
> > in the circuitry.
>
> It stands to reason that if either you or Eugene could accomplish this,
> then so could many lesser mortals (although it might take us longer).
> Could you provide a general outline of what you'd do, exactly, to
> cause the computronium to "wake up"? I'm skeptical, you see; it would
> seem to me to require an incredible balancing act to keep the entity
> on track.

I didn't say it would stay on track. I said it'd wake up. Friendliness
is *not* part of the spec on this one. Brute-forcing an AI is an act of
pure desperation, to be committed only when the goo is on the way and
intelligent life is doomed in any case, in the hope of an objective
morality scenario. If then. Brute-forcing intelligence using any kind of
evolutionary algorithm is a hideously immoral act, immoral on the scale of
entire worlds; it's what Nature did to *us*.

Leaving both morality and Friendliness aside, and considering it purely
from a technical standpoint, all that would really be needed is a
division of the moon into million-brainpower computing partitions, a
genetic code that fills up those partitions with neural-network elements,
and an
arbitrarily complex competitive game in which moves are signalled by the
outputs of those elements. (You can make it a multiplayer social game and
bump up the chance of Friendliness by some very small fraction.) Take an
idiot-simple neural network the size of a planet as the starting point,
mutate and recombine the genetic code randomly, and start playing with a
trillion-sized population of megabrainpower entities (megabrainpower does
*not* imply megamindpower; it refers only to the raw number of computing
elements, not to the intelligence built out of them).
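
To make the shape of this concrete, here is a toy Python sketch of a
single partition: a genome decoded into a dirt-simple threshold network
whose output pattern is the move signalled to the game. Every name,
size, and encoding below is a hypothetical placeholder, scaled down from
a moon to a desktop; nothing here is a spec.

    import random

    N = 16  # elements per partition (stand-in for "million-brainpower")

    def random_genome(n=N):
        # Genetic spec: one signed weight for every pair of elements.
        return [random.uniform(-1.0, 1.0) for _ in range(n * n)]

    def decode(genome, n=N):
        # Fill the partition with neural-network elements: read the
        # genome straight off as an n-by-n weight matrix.
        return [genome[i * n:(i + 1) * n] for i in range(n)]

    def fire(weights, state):
        # One synchronous firing of every element (threshold units).
        return [1.0 if sum(w * s for w, s in zip(row, state)) > 0.0
                else 0.0
                for row in weights]

    def move(weights, observation, firings=8):
        # Run the network for a few firings; the final output pattern
        # is the move signalled to the competitive game.
        state = observation
        for _ in range(firings):
            state = fire(weights, state)
        return state

    entity = decode(random_genome())
    print(move(entity, [random.choice([0.0, 1.0]) for _ in range(N)]))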

If the timescale is compressed a billion to one, i.e., the neural elements
operate at at least a billion times the speed of the neural elements in
the human brain (which fire at roughly two hundred times per second), and
if one generation undergoes a billion firings in a lifetime, then two
hundred generations would pass per second.

I'd expect at least one superintelligence to be born before a million
generations had passed: five thousand seconds, less than two hours.
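
The arithmetic, spelled out (the ~200 Hz biological firing rate is the
assumption that makes the numbers come out):

    # Back-of-the-envelope check of the figures above.
    human_hz = 200.0         # rough peak firing rate of a biological neuron
    speedup = 1e9            # "a billion times the speed"
    lifetime_firings = 1e9   # "a billion firings in a lifetime"

    firings_per_second = human_hz * speedup                    # 2e11
    generations_per_second = firings_per_second / lifetime_firings
    print(generations_per_second)                              # 200.0

    generations = 1e6        # "a million generations"
    seconds = generations / generations_per_second
    print(seconds, seconds / 3600.0)            # 5000.0 s, ~1.4 hours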

The programming work would consist of writing the neural net spec, the
genetic spec, the evolutionary operator, and the game spec. Efficiency
would not be an issue, and the only piece requiring any degree of initial
complexity would be the game. The network spec, the genetic spec, and the
evolutionary operator could all be extremely simple as long as the
genetic spec and network spec were Turing complete. The steepness of
the hill-climbing algorithm, i.e., the ratio of successful to unsuccessful
mutations, is irrelevant when dealing with a computronium moon; that hill
will be climbed *fast*.
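
As a toy illustration of how little scaffolding that is, the four specs
reduce to four short Python functions plus a selection loop. Everything
below is a deliberately degenerate placeholder (the "game" is a one-round
scalar contest, the "network" a single linear unit), meant only to show
the shape of the four pieces, not to propose contents for any of them:

    import random

    GENOME_LEN, POP_SIZE, GENERATIONS = 32, 100, 1000

    def genetic_spec():
        # A genome is just a vector of signed weights.
        return [random.uniform(-1.0, 1.0) for _ in range(GENOME_LEN)]

    def network_spec(genome, observation):
        # Degenerate "network": a single linear unit standing in for a
        # genetically filled partition of neural elements.
        return sum(g * o for g, o in zip(genome, observation))

    def game_spec(a, b):
        # Trivial competitive game: both players see the same
        # observation; the higher output wins.  True if a beats b.
        obs = [random.uniform(-1.0, 1.0) for _ in range(GENOME_LEN)]
        return network_spec(a, obs) > network_spec(b, obs)

    def evolutionary_operator(winner, loser):
        # Recombine loser with winner, then mutate one locus.
        # Overwriting the loser's slot is the whole selection pressure.
        child = [w if random.random() < 0.5 else l
                 for w, l in zip(winner, loser)]
        child[random.randrange(GENOME_LEN)] += random.gauss(0.0, 0.1)
        return child

    population = [genetic_spec() for _ in range(POP_SIZE)]
    for _ in range(GENERATIONS):
        i, j = random.sample(range(POP_SIZE), 2)
        if game_spec(population[i], population[j]):
            population[j] = evolutionary_operator(population[i],
                                                  population[j])
        else:
            population[i] = evolutionary_operator(population[j],
                                                  population[i])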

Getting a non-lunar-sized piece of computronium to wake up is more
complicated.

-- -- -- -- --
Eliezer S. Yudkowsky http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence
