> > [Zero Powers doubts]... that even those who regard the "slow singularity"
> > scenario as silly (whoever that faction may be) expect us to leap from
> > 20th Century tech to strong AI in the course of a day or two. Now
> > *that's* what I'd call silly.
> "Eliezer S. Yudkowsky" wrote: Do you mean a day at *our* speed or a day at
> *their* speed?
The term "slow" is not really what I was getting at. AI will be plenty
fast as far as that goes, but we need to deal with other axes besides
speed and friendliness, such as ambition. I can imagine a Singularity
that is fast enough to restructure existing computing hardware in a day
and has the theoretical ability to do strong drextech, yet is strangely
lacking in what we would call desire. Like a chess program, the
Singularity might be perfectly content to just compute whatever it is
asked to compute. It may have no ambition to upload humans or
expand itself in ways that seem so natural to species which prospered
by successfully competing for survival. An AI that arose in such a
manner might have no survival instinct, no urge to preserve itself,
traits that are universal in species that evolved by natural selection.
An AI might invent itself in such a way that is neither friendly nor
unfriendly to humans, but rather indifferent. An AI *might* invent
itself in such a way that it really doesn't *want* to rule the universe,
even if it could.

spike
This archive was generated by hypermail 2b30 : Mon May 28 2001 - 10:00:08 MDT