Re: more on Brand and the Singularity idea

From: Jim Fehlinger (fehlinger@home.com)
Date: Tue Jan 16 2001 - 05:38:39 MST


Charlie Stross wrote:
>
> I'm just saying that new technologies have side-effects, sometimes disastrous
> ones, and insisting that their deployment is *always* beneficial isn't going
> to fool anyone (and is going to make us look like idiots).

Also, effects that some folks on this list can contemplate with equanimity are events that would horrify many people outside of extreme sci-fi/technophilic circles. For example, Eliezer Yudkowsky has said on many occasions that, as long as the human race survives long enough to give birth to some sort of superintelligence, the ultimate fate of humanity is of no consequence (to him or, presumably, in the ultimate scheme of things). I suspect that this attitude is part of what gives folks like Bill Joy the willies.

When I heard Ray Kurzweil speak at the PC Expo last summer, he showed transparencies of various graphs he had prepared from historical data, which he claimed showed that progress on various fronts is exponential. One of these graphs was of economic data, in which (as Kurzweil pointed out) the Great Depression appears as a visible glitch, but one that was more than made up for by the surge in economic growth after its end. It crossed my mind then that, if the evolution of post-human intelligence involves the extermination of most or all of the human race (as in the scenarios of Hugo de Garis or Kevin Warwick), a retrospective Kurzweil graph of the event might still show it as a barely-visible glitch in the exponential curve. If the Singularitarians are right, the sheer computational capacity of the entities swarming through the solar system a few years after the extinction of humanity might be such that, to them, the loss of several billion human brains' worth of processing capacity would be no more than the ongoing quota of traffic fatalities that the human race (or the industrialized part of it) is willing to bear as the price of having cars. Or maybe even less: no more than the unnoticed and unmourned loss of a few cells from an individual human being.

From an individual's perspective, the Great Depression was a period of almost unimaginable suffering, as would be de Garis' Cosmist-vs-Terran war (whether between pro- and anti-AI humans, or between humans and AIs). Many mainstream people would say that anybody who can contemplate such an event with detachment must be a bit of a monster. Be that as it may, this detachment may prove to be an unbridgeable gulf between Singularitarians and the rest of humanity (even technologically sophisticated folks like Bill Joy), if the former are seen as taking a cold-blooded, "que sera, sera" attitude toward the possibility of the extinction of the human race. I think the motives of enthusiastic Singularitarians are always going to be mistrusted by the mainstream, and Extropians and Singularitarians are likely to continue to be portrayed by journalists and authors as they are, for instance, in Ken MacLeod's _The Cassini Division_.

Jim F.


