Re: One humanity, all in the same boat

From: Jacques Du Pasquier
Date: Sun Dec 23 2001 - 20:32:26 MST

As I don't share, at least for now, Eliezer's convictions about the
"Singularity" (which are in fact not part of the Extro Principles), I
cannot really say "me too" on this.

But I do share his sense of proximity to all human beings, in light of
the "cosmic perspective" and otherwise, and of course, if there are
people here mainly concerned with race issues they are in the wrong
place; go away! (for Newton's sake)

However, I do find it unfortunate that the post alludes to Mike's
"Indian" post in this context, as to me it was only Olga's
interpretation (and her further quiet insistence) that was problematic.

I love the openness and independence from PC baggage that I find here,
so please keep it that way. People should not have to fear that saying
something possibly un-PC will make them responsible for long, pointless
flames. It is impossible to please everyone or to account for every
out-of-control interpretation; nor is it our business to do so.

And finally, I think the problem of the list (which I enjoy very much
all the same) is not xenophobia but the fact that some contributors
want to convince everyone (or only one person, but publicly, for some
reason) of their political ideas and historical takes on things past,
rather than actually think about the future and cover new ground.


Eliezer S. Yudkowsky wrote (23.12.2001, 15:18):
> I have to confess, some of the discussion on Extropians these days is
> making me feel sick to my stomach. It is enough to make me seriously
> consider leaving, possibly for good, or until someone else informs me that
> the list has made a recovery. If Extropy and ExI take a serious hit from
> the xenophobia now being tossed about on the mailing list in the name of
> free speech, don't say I didn't warn you, because transhumanists in
> general *are* being held responsible for what gets said in these public
> forums. I am sick and tired of hearing about how transhumanism (or even
> the Singularity!) is a form of Social Darwinism in which the slow runners
> are forced out of the race, but after watching the debate on the list
> these past months, I can see why the idea keeps popping up. I'm writing
> this message in the hopes that those who have joined the list only
> recently can get a taste of what transhumanism is really about - that the
> ideas now being expressed don't represent the majority opinion, but only
> the opinions of those posters who are *still* posting, instead of having
> given up months ago.
>
> As the child of two science fiction readers, I grew up reading books that
> preached peace and tolerance toward other life forms, whether they had
> hands or tentacles, whether they breathed oxygen or liquid helium, as long
> as they were sentient. Tolerance toward other *human* countries was so
> obvious that it rarely even needed to be stated explicitly. Where there
> were group conflicts, the good guys were generally "humanity" - although
> sometimes it was humanity that was in the wrong, and if so (the books
> said) a real good guy would still be sure to fight on the side of right,
> even against their own species.
>
> Unnoticed in these dramas of good against evil was the implicit notion of
> one species, one faction. The unity of humanity was taken for granted;
> what the books tried to teach was tolerance of *aliens*. Or at least that
> was the way I saw it when I was, oh, nine years old or thereabouts. Today
> I have a more detailed view of science fiction, and yes, I can see that
> some of it implicitly accepts the idea of human factions by preaching that
> human factions ought to be nice to each other - but I still think that the
> best argument of all is the one my ears heard initially, that humanity is
> really one world, one species, one faction, all in the same boat,
> regardless of whatever short-term arguments currently plague the world.
> As far as I'm concerned, this is part of what transhumanism is all about.
> If we can learn that uploads and AIs and augmented humans are on our side,
> then it implies - as a simple, even overlooked corollary, which is much
> better than explicit preaching - that all humans must be on the same side
> as well.
>
> Only the most clueless of Singularity gradualists, in severe hard takeoff
> denial, could even begin to imagine that the Singularity was an argument
> for Social Darwinism; it is cluelessness of the same order as arguing that
> the Singularity concept leads to passivity. If the value of the whole
> world is a quadrillion dollars, and a quintillion dollars of new wealth is
> created by an egalitarian superintelligence, the previous distribution of
> wealth becomes irrelevant; ripples erased by a tidal wave. *Any* wealth
> created by a Friendly AI, or created by an altruistic egalitarian
> transhuman such as a transcended Eliezer or Samantha, is an equalizing
> force that smoothes over or completely wipes out the old divisions between
> humans. The Singularity meme is a psychological force that militates
> against any form of prejudice by showing that all inequalities are
> temporary, even genuine inequalities; it doesn't matter who has an IQ of
> 130 and who has an IQ of 90 today, if both can cooperate to have an IQ of
> 180 in a few years. And the *real* Singularity is a fundamentally
> egalitarian force because I expect the Singularity to completely ignore
> the faction fights that humans spend so much time pursuing. Nor does gray
> goo respect national boundaries. We win together, or lose together. We
> are all in the same boat.
> There are no "Indians". There are no "Afghani". There are only humans.
> Transhumanism, and Singularitarianism, are basically egalitarian
> philosophies, because they permit the analysis of cosmic perspectives, and
> when you look through a cosmic perspective the modern-day divisions
> between humans become absurd. There is no reason why the happiness of an
> American computer programmer should weigh more in my calculations than the
> happiness of an Indian computer programmer. They are both the same kind
> of entity. They are both evolved biological organisms. There is no
> reason why one would have any greater intrinsic worth than the other.
> Humans may spend most of their day dividing into factions and fighting,
> but that doesn't mean the factions are real, it just means that most
> people sadly lack the perspective of species-wide unity which is provided
> by thinking about transhumanity and uploads and AI.
>
> Well, that's what it's really about. I think it'd be nice if everyone who
> agrees with this reasoning posts a "Me too!", and everyone who disagrees
> with the argument but agrees with the conclusion posts a note saying that
> as well. The people who are new to the list need to see something to
> outweigh some of the poisonous remarks that seem to be getting so much
> airtime. Of course I fear that most of the good guys in this battle may
> have already gotten sick of the list and unsubscribed, so it may be too
> late. But for the record, I want to say that there was a time when I
> would never have needed to post this message, and *that* is what
> transhumanism is *really* about, no matter what happens to this mailing
> list.
>
> -- -- -- -- --
> Eliezer S. Yudkowsky
> Research Fellow, Singularity Institute for Artificial Intelligence

This archive was generated by hypermail 2b30 : Sat May 11 2002 - 17:44:30 MDT