Re: Human AI to superhuman (Re: Max More)

Emmanuel Charpentier (
Thu, 10 Sep 1998 05:15:35 -0700 (PDT)

---"Eliezer S. Yudkowsky" <> wrote:
> Robin Hanson wrote:
> >
> > Eliezer S. Yudkowsky writes:
> > >You can't draw conclusions from one system to the other. The
> > >genes give rise to an algorithm that optimizes itself and then
> > >the brain according to genetically determined architectures ...
> >
> > But where *do* you draw your conclusions from, if not by analogy
> > to other intelligence growth processes? Saying that
> > "superintelligence is nothing like anything we've ever known, so
> > my superfast growth claims are as well founded as any other" would
> > be a very weak argument. Do you have any stronger argument?
> Basically, "I designed the thing and this is how I think it will
> work and this is why." There aren't any self-enhancing
> intelligences in Nature, and the behavior produced by
> self-enhancement is qualitatively distinct. In short, this is not a
> time for analogic reasoning.

Excuse me for carrying on (I hope I'm not being a pain, or pointless, or boring...), but I would say that, IMHO, the first Artificial Intelligence we create will mostly have the same characteristics as us: the same flaws, the same qualities, no magic wand. And self-enhancement guarantees nothing better; I don't see why it would produce anything other than geniuses with whatever add-ons can be imagined.

And we (as humans in the flesh) will probably keep an edge for a long time: evolution made us, we are very much a part of the world, and we have instincts (quite entangled ones, sometimes). Those instincts cover everything from pain/pleasure to the basic wiring of the brain, to algorithms for creating more wiring through what we call games, tests, and experiments. And we have a body.

This could mean that the Singularity doesn't have to happen: just the usual exponential knowledge growth. And if we ever stumble upon a new architecture for intelligence (really new, not just adding or improving parts), that change might or might not be of great importance. But how can we know?

Well, it is true that AIs still (seem to) have the speed/power advantage (and possibly Eli's other abilities), but does that mean it will change all the cards? Lead to that end of history that the Singularity seems to be?
