Re: The Politics of Transhumanism

From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Sun Jan 06 2002 - 16:28:12 MST


"J. Hughes" wrote:
>
> http://www.changesurfer.com/Acad/TranshumPolitics.htm

In which he wrote:
>
> There are occasional discussions on the extropian list
> about the potential downsides or catastrophic
> consequences of emerging technologies, but these
> are generally waved off as being either easily
> remediable or acceptable risks given the tremendous
> rewards. This form of argumentation is more
> understandable since most extropians have adopted a
> form of millennial apocalyptic which they call “the
> Singularity.”

I deal with the unjustified use of the prejudicial term "millennial
apocalyptic" in greater detail below. However, the above contention (that
belief in the Singularity leads to "waving off" potential downsides) is
directly contradicted by the historical record of positions taken on the
Extropian list. For example, during the goo prophylaxis debate partially
archived at
 http://www.nickbostrom.com/old/nanotechnology.html
I think I may legitimately lay claim to having had the most worried
outlook.

> The extropians’ Singularity is a coming
> abrupt rupture in social life, brought about by some
> confluence of genetic, cybernetic and nano
> technologies.

My personal experience is that, while this definition of the Singularity
is quite popular among outsiders looking in on transhumanism, it is much
rarer among actual transhumanists. It is a vague and even eschatological
definition. The canonical definition for the hard-core Singularity
interest group represented by the Singularity Institute is that the
Singularity comprises those events following from the rise of
greater-than-human intelligence in any form. Nanotechnology is thus
relevant only insofar as it is an enabling technology of AI or
intelligence enhancement.

> The concept of the Singularity was
> first proposed by science fiction author Vernor
> Vinge in a 1993 essay, referring specifically to the
> apocalyptic consequences of the emergence of
> self-willed artificial intelligence.

Again, Vinge's original Singularity concept refers to the emergence of
greater-than-human intelligence in *any* form.

> But the roots of the
> Singularity idea are in the transcultural millenarian
> impulse; the Singularity is a vision of techno-Rapture
> for secular, alienated, relatively powerless,
> techno-enthusiasts (Bozeman, 1997).

Your comments as they stand are reminiscent of Smalley on nanotechnology
or Lewontin on evolutionary psychology.

An examination of Bozeman 1997 shows that he presents no evidence
whatsoever for the assertion that the Singularity bears any current or
historical relation to the millenarian impulse; he simply happens to cite
the Singularity, in passing and without supporting evidence, in the same
paper in which he describes other ideas that he believes bear the same
flaw.

Proponents of the Singularity cite specific, technical arguments as
reasons for their beliefs; you must engage with the claimed sufficiency of
those arguments before you can legitimately postulate that religious
fervor is a necessary ingredient of belief.
That people once believed certain things could be done by magic or
theology is no reason to suppose that they cannot be achieved
technologically; otherwise the belief in winged angels would have been
sufficient to prevent the Wright Brothers from leaving the ground. Humans
have always wanted to fly, and invented many emotionally satisfying
stories in which this wish was granted; eventually the ever-present
desire was coupled to new technological capabilities, and the impulse was
satisfied in reality as well as myth. The same may later prove to hold
true of the human wish to live forever. That Singularitarians wish to
bring about a massive worldwide change for the better is not evidence of
millenarian apocalypticism unless it can be demonstrated that we are
mistaken in asserting that this goal has now become technologically
feasible.

Given that the Singularity concept first rose to popularity among a social
group composed primarily of scientifically literate, aggressive
rationalists, your assertion above is (a) extreme fighting words in that
cultural milieu, and (b) not very likely to be correct. If you don't like
the Singularity, present substantive arguments against it. A dismissal in
passing, as if the issue were already settled, is not constructive.

And if I may speak to the purposes behind your paper, let me also remind
you that advocates of the Singularity are not drawn exclusively from the
Extropian mailing list, and that your potshots at the Singularity are not
satisfying punches delivered to the "evil reactionary" Extropians but
rather punches delivered to an entirely separate interest group in
transhumanism. That you have chosen to attack the Singularity in this
paper makes it clear that you believe the Singularity to be an ideological
support of "evil reactionary" Extropianism. It is not. The Singularity
interest group comprises an entirely separate subsector of transhumanism,
one which includes "good progressive" transhumanists such as
Damien Broderick and Ben Goertzel.

(The above paragraph should not be taken as tacitly condoning the style of
argument in which the universe revolves around the opponents and
supporters of liberalism. I find the paper's use of this style to be
appallingly parochial, but I'm just here to handle the Singularity end of
it.)

> The appeal of
> the Singularity for libertarians such as the extropians
> is that, like the Second Coming, it does not require
> any specific collective action;

I beg your pardon? What are we doing here at the Singularity Institute,
knitting sweaters?

> it is literally a deus ex
> machina. Ayn Rand envisioned society sinking into
> chaos once the techno-elite withdrew into their
> Valhalla. But the Singularity will elevate the
> techno-savvy elite while most likely wiping out
> everybody else.

An assertion contradicted by the Singularity Institute's literature, the
Singularitarian Principles, the moral content proposed in "Creating
Friendly AI", discussions on the SL4 mailing list, and the stated
humanistic reasons of most individuals explaining their personal
involvement in the Singularity.

> Working individually to avidly stay
> on the cutting edge of technology, transforming
> oneself into a post-human, is one’s best insurance of
> surviving and prospering through the Singularity.

As D. den Otter's example illustrates, such sentiments are expressed only
rarely and, when expressed, have found very little agreement (I can think
of only two cases). And of course D. den Otter's case also illustrates
another effect: those who profess such beliefs tend to be rapidly
marginalized out of Singularity communities because of their own professed
lack of interest in group action.

-- -- -- -- --
Eliezer S. Yudkowsky http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence


