Re: Singularity: Generation gap

Eric Watt Forste (arkuat@pigdog.org)
Sat, 27 Sep 1997 14:30:45 -0700


Eliezer S. Yudkowsky writes:
> I think of the Singularity as protection against Failures Of The
> Imagination. Such failures can have *real* and *horrible*
> consequences.

Good point. Thank you for putting this perspective on things
for me.

> Omniscient FAPP means that you can visualize (and perfectly predict)
> all other "agents" in a game-theoretic scenario. While multiple
> competing OFAPP agents are questionable (halting problem?), a Power
> deciding whether to force-upload humanity could well be OFAPP,
> game-theoretically speaking.

Thank you for the very clear and precise definition. I know
that I tend to be more skeptical than you are about scalar
comparisons between the computational power of the kinds of
Turing machines we know how to make and the computational
power of the kinds of Turing machines we are. If I
swallowed Moravec's estimates of these things whole, I'd share
your concern. But I don't know whether or not we know how to
measure the computational power of chordate nervous systems in
a way that we can compare directly to the computational power
of our silicon abacuses.

There's research going on to implement sophisticated neural nets
in silicon hardware, and that research route might lead to an
omniscient FAPP entity someday even if we are fundamentally different
from abacuses. But a review of the taxonomy of neural net
architectures (I usually distinguish between feedforward and
recursive architectures, and make a second distinction depending
on whether or not the learning algorithm uses feedback from the
exterior environment) makes it clear that chordate nervous
systems are both recursive in architecture and, to some extent
or another, dependent on seeking out and using reinforcement
from the environment in their learning algorithms. It seems to
me that we have very little
technical experience building and training nets that use *all* the
architectural tricks, or in other words, building and training nets
that even remotely resemble ourselves or even other real animals.
While I agree with you in wanting to guard against failures of
imagination, I think venturing real predictions in a field as new
and inchoate as this one is folly. I consider Moravec's predictions
to be an enjoyable form of play, but I don't let them keep me up
at night.
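
To make that distinction concrete, here is a minimal sketch in
Python (illustrative weights, sizes, and update rule only; nothing
here is a real model of a nervous system) contrasting a feedforward
pass, a recursive (recurrent) pass that carries state between time
steps, and a crude reward-driven weight update:

# Minimal illustration of the two distinctions above: feedforward
# vs. recurrent wiring, and learning driven by a scalar reward
# signal from the environment. All numbers are placeholders.
import numpy as np

rng = np.random.default_rng(0)

# Feedforward: the output depends only on the current input.
W_ff = rng.normal(size=(4, 3))            # 3 inputs -> 4 outputs

def feedforward(x):
    return np.tanh(W_ff @ x)

# Recurrent: the output also depends on internal state carried
# forward from earlier time steps.
W_in = rng.normal(size=(4, 3))
W_rec = rng.normal(size=(4, 4))

def recurrent(x, h):
    h_new = np.tanh(W_in @ x + W_rec @ h)
    return h_new, h_new                    # output and next state

# Reward-driven update: the environment returns only a scalar
# reward, and the weights are nudged in whatever random direction
# raised it (a crude finite-difference hill climb, for illustration).
def reinforce_step(W, x, reward_fn, lr=0.01, eps=1e-3):
    noise = eps * rng.normal(size=W.shape)
    gain = reward_fn(np.tanh((W + noise) @ x)) - reward_fn(np.tanh(W @ x))
    return W + lr * (gain / eps) * noise

# Example use: reward the feedforward net for a large first output.
# W_ff = reinforce_step(W_ff, np.ones(3), lambda y: y[0])

Building and training something that uses both tricks at once is
the part we have so little experience with.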

But you may well know more neural-net theory than I do (because
I'm guessing that you may well have more math than I do), so
maybe I'll adjust my paranoia upward a notch or two. As the
Bears song goes, "fear is never boring." ;)

--
Eric Watt Forste ++ arkuat@idiom.com ++ expectation foils perception -pcd