I alerted some friends to the existence of "The Singularity is Near",
A Book Précis, by Raymond Kurzweil
( http://www.kurzweilai.net/articles/art0012.html?printable=1 ),
and I've been a little surprised by a couple of the positive
responses I've gotten. I bought copies of _The Age of Spiritual
Machines_ (Kurzweil) and _Robot: Mere Machine to Transcendent
Mind_ (Moravec) back in early '99 to lend to people, with tepid
responses at best (though I should mention that I got a
very excited response from a guy at work to whom I lent my
copy of Damien Broderick's _The Spike_ -- back in '97, I think it
was). I've also tried getting people to sample Eliezer
Yudkowsky's Web essays, with no success at all (sorry, Eliezer :-< ) --
though **I** thought these were **fabulous** when I stumbled
across them in early '97, I think they may be a little
off-putting to "civilians".
However, Kurzweil's new Web article may be just the ticket for
getting acquaintances acquainted with the big S. It's less of
an imposition than trying to push a book on somebody (it's also
snappier and more to the point, and less long-winded and cutesy
than the earlier _Spiritual Machines_, which also had an
unfortunate title, IMO). I've taken the liberty of including the
following e-mail exchange with one friend (with the permission of
Michael S. Goldfarb). Oh, and I've sent S-packets past this
friend in the past, but this is the first one that he's ever
really grabbed (or that's grabbed him)!
Wow! I just finished reading it. (Well, skimming it anyway. [It's a good
thing that I will soon be able to extend my intelligence so that I don't
immediately glaze over when mathematical formulae are introduced!]) Quite
impressive, and possibly even largely probable. Somehow, I doubt that the
progression of events that he describes will go quite so smoothly. He
ignores a good deal of the complex social crises that the Singularity will
bring to a head (have vs. have-not, scientific vs. religious, the whole
moral/ethical/legal quagmire, etc.) The sort of thing that Asimov or
Heinlein would have described in their golden-age stories as "the nasty
conflict between the more-than-human Extenders vs. plain old humans".
Well, I guess it's going to be an interesting ride. Nice to hear so much
carefully thought-out optimism instead of the usual dystopia stuff. I
sincerely hope that all our economic and social models are indeed operating
on obsolete principles... (Of course, I remain essentially pessimistic!)
Gee, Ray's a REALLY smart guy, isn't he? (And yes, I had a prickle of pride
when he mentioned his '70s OCR technology!) [Michael had operated
early Kurzweil OCR equipment before we became acquainted,
and had a great deal of respect for Kurzweil before I
ever knew who he was.]
Thanks for sending this, it was a thought-provoking and inspiring read.
Glad to hear you enjoyed it!
> Somehow, I doubt that the progression of events that he describes
> will go quite so smoothly.
I don't expect it's going to be smooth, either. On the other hand,
when I saw Kurzweil in person giving a keynote speech at PC Expo
last summer (he talked about the same thing as in the Web
article, and showed the same graphs), he mentioned the fact
that, on one of his graphs, the Great Depression was barely
a glitch in the long-term trend (that graph is in the Web article,
it's the one labelled "U.S. GDP In Constant Dollars"; look at the
black data points for the glitch, not the smooth blue line).
When you think of all the suffering during those years, it's a
bit appalling that it fades so quickly into insignificance in the
big picture. I pointed out on the Extropians list not so long ago
that, if Kurzweil's (and similar) predictions about the mass of
non-biological intelligence eventually dwarfing the mass of human
intelligence on the planet are accurate, we could have a war that
wipes out 99% of the human race, and it still might only look
like a "glitch" in the big picture.
> ...the complex social crises that the Singularity will
> bring to a head (have vs. have-not, scientific vs. religious, the
> whole moral/ethical/legal quagmire, etc.)
Some of this has already started, in the debates about genetically-
engineered crops, use of fetal tissue in research, cloning, etc.
But would you believe that one very smart person -- Bill Joy
of Sun Microsystems, of all people (the creator of 'vi' and a
key figure behind Java!) -- has started to call for the government
to step in to put limits on uncontrolled technological growth,
and is calling for corporations to start thinking about ways to
adopt a voluntary policy of "relinquishment"? For his latest spiel, see
it's the full text of an article (don't bother with the link it gives --
that's stale) from the San Jose Mercury News about a panel in which Joy
participated in San Francisco a couple of weeks ago. This would have
to involve a **major** curtailment of civil liberties, which Joy thinks
is better than the alternatives.
> ...The sort of thing that Asimov or Heinlein would have described
> in their golden-age stories as "the nasty conflict between the
> more-than-human Extenders vs. plain old humans".
Yep, or what Hugo de Garis calls the conflict between the "Terrans"
and the "Cosmists". See
> Gee, Ray's a REALLY smart guy, isn't he?
Yes he is, but he's by no means the only smart person thinking
about these things (or even writing books about them). However, he
is emerging as the most publicly visible of the pro-technologists,
at the same time as Joy is emerging as the spokesperson of
the anti-technology technologists.
Okay, I just wanted to briefly mention that I read most of de Garis'
material and found it interesting, if hyperbolic. The Kurzweil piece seemed
better thought out and more believable, though any kind of prognosticating
beyond the next ten years or so seems pretty dubious, if not futile.
(Personal experience: Even after I listened to your first CD player...,
I figured it would take a decade or two for CDs to eclipse
the entrenched LP technology... and it took about five years.)
What I find interesting (and I'm sure is also one of your traditional
appreciations of this material) is how closely it hews to so many classic
(and little-known, but familiar to folks like us who've read thousands of
novels over the decades) SF tropes. Apart from some of the specifics of the
new technologies, there is actually very little about it that hasn't been
treated in some (fictitious) detail before. Even just limiting ourselves to
the narrow confines of SF in movies/TV, we're talking Krell machines, the
Terminator future, the Borg, Dr. Forbin's dilemma, V'ger, Replicants, the
Monolith, etc. Whoa...
Well, you've done it -- I find I am actually somewhat interested in this
Singularity/new intelligence/evolutionary leap stuff. Given my vast SF
background, how could I NOT be?!?
As to Bill Joy, you must know that anything resembling repression of the
creative/scientific impulse, no matter whose "good" it's for, just scares
the bejeesus out of me. Since, if I weren't a card-carrying liberal Dem,
I'd probably be a Libertarian, I'm all for a laissez-faire approach to
invention. But there are some awfully scary times a-comin', aren't there?
This archive was generated by hypermail 2b30 : Mon May 28 2001 - 09:56:48 MDT