Re: extropians-digest V1 #110

Eliezer Yudkowsky
Sat, 16 Nov 1996 22:37:39 -0600

Saith yours truly:
<<One hundred percent absolutely correct! The Powers will think faster
than we do, be smarter. How in the world do you imagine that they'll get
to an "intermediate level" and just *stop* there? The essence of the
Singularity is that a smarter being has an increased capacity to invent
still smarter beings; that's why everything goes to infinity.>>

Saith Robin Hanson:
<Whether smarter beings can create still smarter beings at a faster
rate than lesser beings can improve on themselves depends on how the
difficulty of the task of getting smarter scales with intelligence.
It could be that the task gets harder so fast that smarter beings take
*longer* to get still smarter. We know far too little to say that
this isn't the case.>

Computer power doubles every two years.
Computer power doubles every two *subjective* years.
Computer power goes to infinity four years after computer researchers
reach human equivalence: each successive doubling takes half the real
time of the one before, so the series 2 + 1 + 1/2 + 1/4 + ... sums to
four years.
Mathematically, I believe it's called a "singularity".

You don't need enhanced smartness to outrun *any* task no matter
*how* it scales up; all you need is recursive Moore's Law.
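The arithmetic above can be sketched in a few lines of code. This is a
minimal illustration, assuming (as the argument does) that each doubling
of computer power takes two *subjective* years, and that the researchers'
own speed doubles along with the hardware; the function name is
hypothetical, introduced here only for the example.

```python
def real_years_elapsed(doublings: int) -> float:
    """Real (wall-clock) years consumed by the first `doublings` doublings,
    starting from researchers running at human-equivalent speed (1x)."""
    total = 0.0
    speed = 1.0  # researcher speed, in multiples of human-equivalent
    for _ in range(doublings):
        total += 2.0 / speed  # two subjective years at the current speed
        speed *= 2.0          # power, and thus speed, has now doubled
    return total

# The series 2 + 1 + 1/2 + 1/4 + ... converges to 4 real years,
# no matter how many doublings you pile on:
for n in (1, 5, 50):
    print(n, real_years_elapsed(n))
```

Closed form: after n doublings the elapsed real time is 4(1 - 2^-n),
which approaches but never exceeds four years — infinitely many
doublings fit inside a finite interval, which is the singularity claim.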

            Eliezer S. Yudkowsky
                 "I'm sorry, my karma ran over your dogma."
Disclaimer:  My opinions do not necessarily represent those of the other
members of the Interstellar Institute for Software Development and World
Conquest.  The IISDWC is a licensed conspiracy in competition with the
Microsoft Corporation and Dogbert's New Ruling Class.  Unless otherwise
specified, I'm not telling you everything I know.