From: Eliezer S. Yudkowsky (firstname.lastname@example.org)
Date: Mon Jan 07 2002 - 21:46:16 MST
"J. Hughes" wrote:
> Eliezer wrote:
> > The Singularity
> > interest group comprises an entirely separate subsector of transhumanism,
> > an interest group which includes "good progressive" transhumanists
> Indeed, I wish I had paid more attention to your good work before writing
> the essay, since you do seem to be addressing things in a very constructive
> way. However I didn't see in the work of the SIAI any explicit political
> viewpoint comparable to those expressed in the Extropian principles or the
> WTA founding documents, so I didn't invest in studying what you were doing.
Yes, that is correct. Our vision of a better future makes no appeal to
prescribing a supposedly ideal form of political organization for society;
as far as we're concerned, human politics are a temporary means to a
transhuman end, with any transhuman organizational forms being subject to
the final opinion of the transhumans. It doesn't get much more apolitical
than that.
> Re: the millennial impulse and the Singularity. Personally I confess to
> millennial inspiration and longing, and see no shame in it. It has a long
> history, and is present in every culture. Atheists express it in many ways,
> such as in Marxism. It is a sociological observation, not an epithet.
> However, among the varieties of millennial expression, the most common is the
> pre-millennialist, who sees the vast majority of people as the damned,
> condemned to perish in the tribulations, while the elect are taken up. The
> post-millennialist, among whom I would place you, Eliezer, believes that the
> millennium can only be established by working hard to make this world the
> Kingdom. This is, for instance, the Buddhist eschatological vision: the world
> becomes rich and fruitful, living in peace, ruled by a just government,
> which then withers and renounces authority when the next Buddha comes.
"Millennial apocalyptism" does not sound like a description of a
human-universal emotion. It sounds like a description of a cultural
tradition, with particular appeal to the tradition of a God that thinks in
base-ten round numbers. If what you mean is the human-universal hope for
a better world, then please pick a descriptive term other than "millennial
apocalyptism". Frankly, it is impossible for me to interpret your paper's
current comments on the Singularity as having any friendly meaning (see
below for the relevant quotes).
> Also, I don't think the material reality of the Singularity means that many
> people are not inspired and affected by the millennial traditions when they
> come to learn about Singularitarian possibilities.
The material reality of the Singularity means that no appeal to the
millennial traditions is necessary. I explicitly reject such appeal as
harmful on the rare occasions when someone on the SL4 email list attempts
it. Reading your paper (and Bozeman, whom you quote) clearly shows that
millennial apocalyptism is being considered as a memetic source of
nonfactual and counterfactual input into the Singularity meme, and not as
the source of humanist ends in conjunction with technological means.
I here re-quote your paper:
> There are occasional discussions on the extropian list
> about the potential downsides or catastrophic
> consequences of emerging technologies, but these
> are generally waved off as being either easily
> remediable or acceptable risks given the tremendous
> rewards. This form of argumentation is more
> understandable since most extropians have adopted a
> form of millennial apocalyptic which they call “the Singularity.”
In other words, you appeal to millennial apocalyptism as the putative
source of beliefs which you believe to be both false and harmful.
> But the roots of the
> Singularity idea are in the transcultural millenarian
> impulse; the Singularity is a vision of techno-Rapture
> for secular, alienated, relatively powerless,
> techno-enthusiasts (Bozeman, 1997).
Again, it's hard to see this as anything except an attack on the
Singularity meme via an attempt to explain it solely through reference to
impulses and emotions which have no basis in fact.
[Further quotes return to the replied-to message.]
> Re: Extrop List politics: I understand from some of the responses I received
> that I have missed a set of serious political discussions on the list that
> revealed that the vocal libertarians are in fact (at least now) a minority.
No, they're not. "Libertarian" simply doesn't have the connotations you
think it does. There are a few people who are both libertarian and
profoundly unenlightened; these people's views are shared neither by
the other libertarian transhumanists nor by the conservative, liberal, or
socialist transhumanists. We've had some vocal and profoundly
unenlightened liberal/socialist transhumanists on this mailing list too,
by the way, so please don't go blaming it on libertarianism.
> I'm happy to hear libertarian views, have learned a lot from them, and
> become more libertarian as a result. It's not that I don't want to hear from
> libertarians, I just don't want to be flamed off by them. For instance Mike
> Lorrey's response to my essay:
> > Don't you recognise who he is talking about? Anyone who thinks his
> > 'vision' of a "sexy, democratic future" run by his social democratic
> > (i.e. bolshevik) ideology is a dumb idea is 'obviously' part of the
> > alleged 'neo-nazi wing' of transhumanism. Before you can send
> > anyone off to reeducation camps, you have to
> > properly identify them as members of a 'well documented'
> > counter-revolutionary cabal.
> ...this is rather typical of what I have encountered on the list in the
Mike Lorrey is one of the three people whose enormously voluminous posts I
was trying to counteract with the "one humanity, all in the same boat"
thread. This is an aggressively open, uncensored, unmoderated list, and
even if it weren't, quoting one opinion would *still* prove nothing. I
think you are reasoning from rather subjective data here! Maybe these
opinions "jump out" at you because nobody would ever express them at all
in most fora you're used to, but this represents a wider *variance* among
transhumanists, a wider distribution, not a distribution skewed towards
one end of the curve; you're just noticing the messages from one end of
the curve because, frankly, you happen to find those messages morally
offensive and therefore highly salient.
> You, and Nick Bostrom BTW, assert that Nazi transhumanism is an oxymoron.
> But a central point of my essay, and a point expressed by many, is that
> transhumanism, i.e. the idea that human beings should be able to improve
> themselves radically through technology, does not have much intrinsic
> political content. It is probably incompatible with theocracy, although I
> bet a theocracy could prove us wrong (maybe the Scientologists). And
> racialism is bad science. But I don't think it makes sense to say that
> simply because most transhumanists are anti-Nazi that transhumanists can't
> be Nazis.
I think that perhaps you fail to appreciate the historical context of Nazi
ideology. See for example my post "MEME: History, Nazis, and >H" in Dec
1999, which I quote (slightly cleaned up) here:
Eliezer Yudkowsky wrote:
> After reading _The Rise and Fall of the Third Reich_, I've realized that
> the claimed association between "transhuman" and "Nazi Germany"
> is based on a historical misconception. As far as I can tell, the Nazis
> never tried to create a new superior race; never even claimed to be
> evolving into one. Rather, the Nazi philosophy was that there was an
> Aryan ideal, the "blond beast of prey", which had existed at some point
> in the past until it was corrupted by mixing with lesser races. They
> were trying to *breed back to that*, not breed forward to a new
> humanity. Transhumanism, which deals with shareable technological
> augmentation rather than eugenics, has essentially no point of contact.
> I've therefore done my best to create the following one-minute
> countermeme to the whole concept of transhumanism bearing even a vague
> resemblance to the Nazis:
> Historically, the Nazi philosophy was that they were protecting the
> existing Aryan ideal from racial corruption. Despite the common
> misconception, they never tried to create supermen. They just claimed
> to be supermen. And that's the key difference. That's *always* the key
> difference. That's what separates the Spanish Inquisition, which burned
> heretics in the defense of what they thought was the truth, from
> science, which tries to discover new truths and share them with
> everyone. It's the difference between claiming to be perfect, which is
> bad, and trying to be better, which is good.
> Transhumanists don't claim to be transhumans. We're just ordinary
> mortals, and we know it, but we want to create
> transhumanity and share it with everyone - or at least offer everyone
> the opportunity. History teaches us to beware of claims to have the
> absolute truth - not to be wary of truth itself, or of the quest for
> truth. History teaches us to be wary of those who claim to be a
> superior race. But that says nothing about the quest for transhumanity,
> nor about actual transhumans.
So the claim that Nazism and transhumanism are fundamentally opposed is
not just a reflexive defense; there is some nonobvious philosophical and
historical depth to it.
(This is also, BTW, why I'm such a pedant whenever someone uses the word
"transhuman" in place of "transhumanist".)
-- -- -- -- --
Eliezer S. Yudkowsky http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence