Re: more on Brand and the Singularity idea

From: Jim Fehlinger
Date: Tue Jan 16 2001 - 05:38:39 MST

Charlie Stross wrote:
> I'm just saying that new technologies have side-effects, sometimes disastrous
> ones, and insisting that their deployment is *always* beneficial isn't going
> to fool anyone (and is going to make us look like idiots).

Also, effects that some folks on this list can contemplate with
equanimity are events that would horrify many people outside of extreme
circles. For example, Eliezer Yudkowsky has said on many occasions that
as long as the human race survives long enough to give birth to some
form of superintelligence, the ultimate fate of humanity is of no
consequence (to him or, presumably, in the ultimate scheme of things).
I suspect that this attitude is part of what gives folks like Bill Joy
the willies.

When I heard Ray Kurzweil speak at the PC Expo last summer, he showed
a number of graphs he had prepared of historical data, which he claimed
show that progress on various fronts is exponential. One of these graphs
was of economic data, in which (as Kurzweil pointed out) the Great
Depression appears as a visible glitch, but one which was more than made
up for by the surge in economic growth which took place after its end.
It crossed my mind then that if the evolution of post-human intelligence
involves the extermination of most or all of the human race (as in the
scenarios of Hugo de Garis or Kevin Warwick), a retrospective Kurzweil
graph of the event might still show it as a glitch in the exponential
curve -- if the Singularitarians are right, the sheer computational
capacity of the entities swarming through the solar system a few years
after the extinction of humanity might be such that to them, the loss of
several billion human brains' worth of processing capacity might be no
more than the ongoing quota of traffic fatalities that the human race
(or the industrialized part of it) is willing to bear as the price of
having cars. Or maybe even less -- no more than the unnoticed and
unmourned loss of a few cells from an individual human being.

From an individual's perspective, the Great Depression was a period of
almost unimaginable suffering, as would be de Garis' Cosmist-vs-Terran
war (either between pro- and anti-AI humans, or between humans and AIs).
Many mainstream people would say that anybody who can contemplate such
an event with detachment must be a bit of a monster. Be that as it may,
there may prove to be an unbridgeable gulf between Singularitarians
and the rest of humanity (even technologically sophisticated folks like
Joy), if the former are seen as taking a cold-blooded,
"che sera, sera" attitude toward the possibility of the extinction of
the human race. I think the motives of enthusiastic Singularitarians are
always going to be mistrusted by the mainstream, and Extropians and
Singularitarians are likely to continue to be portrayed by journalists
and authors as they are, for instance, in Ken Macleod's _The Cassini
Division_.

Jim F.

This archive was generated by hypermail 2b30 : Mon May 28 2001 - 09:56:19 MDT