Re: WTA Mission Statement

From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Mon Jan 14 2002 - 14:29:58 MST


Mike Treder wrote:
>
> [For a proposed WTA mission statement -- Eliezer.]
>
> To promote discussion of the possibilities
> for radical improvement of human capacities using genetic,
> cybernetic, and nano technologies; to understand, manage, and direct
> the development of these technologies in a way that will preserve
> and protect the best of what makes us human while reducing or
> eliminating the worst; to raise public consciousness of these
> matters and assist in the orderly but urgent reorientation of our
> society into the new age that is nearly upon us.

"J. Hughes" wrote:
>
> Now, turning to Mike's language, the problem I have is that it suggests the
> WTA will actually wield some influence over the course of human destiny,
> which I think is a little too grandiose a suggestion for now. Also the phrase
> "the new age that is nearly upon us" is a tad too certain of the imminence
> of the Singularity, and its millennial character, for my taste. Although I
> accept that the rapidly accelerating pace of technological change makes
> apocalyptic social change likely, and social prediction beyond a certain
> horizon impossible, this still sounds too millennial to me.

Writing about the Singularity isn't easy. It begins, of course, with the
complete *personal* rejection of the millennial attitude. But there's
also a grab-bag of authorial patterns which are useful in avoiding the
appearance of millennialism *after* you've successfully avoided the
substance. This grab-bag may not be sufficient in itself, but it should
nonetheless be useful. (Incidentally, if you think that SIAI's current
online materials don't adequately follow the rules below, this is
because I wrote them a while back, before I had finished mentally
formalizing the rules.)

For the problems above, I would suggest these solutions:

> Now, turning to Mike's language, the problem I have is that it suggests the
> WTA will actually wield some influence over the course of human destiny,
> which I think is a little too grandiose a suggestion for now.

If the WTA were not *actually* going to wield some influence over the
course of human destiny, I wouldn't have bothered signing up. So in this
case the problem is not that the suggestion *is* grandiose but that it
*sounds* grandiose.

The underlying cause of this problem is the pervasive sense of learned
helplessness that characterizes modern-day society. People are taught
that social changes are not the result of small groups or individual
efforts, but rather the result of massive economic forces, majority
opinions, major governments, megacorporations, ultrawealthy individuals,
random luck, or other forces external to the individual. In the vast
majority of cases, of course, this is quite correct. Back when we were
running around in 200-person hunter-gatherer tribes, it may have been
possible for one individual to save the whole tribe or change the whole
observed world, but a 6-billion-person planet has a bit more inertia.
However, because we *are* adapted to 200-person tribes, there is an
underlying desire to accomplish something significant on the scale of the
whole of observed society, and a feeling of nonfulfillment when this is
impossible, as it usually is in today's world.

Now, you might think, if you were an optimist approaching this problem for
the first time, that this state of affairs would help propagate
transhumanist memes, because it's a hole transhumanism can fill.
*Snicker*. Not in this universe, buddy.

In cases of frequently unfulfilled desire, the most common result is a
prevalent "consolation philosophy" - in this case, people being told "Just
do what you can, and don't try to save the world" or "Working to improve
your own life and the lives of your family is just as important as
improving the world" or "If everyone does a little, it adds up to a lot."
The attractiveness of a consolation philosophy stems from averting the
negative feelings resulting from an unfulfilled desire. If you
offer a way to actually *fulfill* the desire, your offer is seen by the
audience as a threat to their mental equilibrium - by trying to tell
people that they can change the world, you are threatening the precious
mental equilibrium that lets them *not* try to change the world with only
a slight hollow feeling.

The other thing that happens, in the case of pervasively unfulfilled
desire, is that various kook memes arise which use their spurious offer to
fulfill the desire as the memetic bait; that is, a lot of kook memes arise
which *promise*, but do not deliver, the chance to change the world.
This, in turn, causes a skeptic's heuristic to arise which says that
"Offering the chance to change the world is a sign of cultishness."

The long-term or even mid-term solution to the problem, if you're writing
an extended essay, is to demonstrate that your statements are rationally
supportable, independent of any other nonrational memes that may happen to
be floating around. There is a sense in which having and taking the
chance to change the world is a "lower energy state" than nonfulfillment
protected by consolation philosophies, or cult memes protected by
rationalizations, so if you can get people to see that this is a genuine
offer, they're likely to take it.

However, there's also an authorial hack that you can use on an immediate
level, when you only have a few sentences or paragraphs in which to make
your point.

Instead of writing:

"The Singularity Institute will directly implement the Singularity and
thereby vastly benefit humanity."
    you write
"In the next few decades, humanity will gain the technological capability
to create smarter-than-human intelligence. We believe that this event has
the potential to enormously benefit humanity. The Singularity Institute
was founded in the belief that an event this large deserves a nonprofit
devoted to doing something about it."

In the second version, a reader who's hitting your page for the first
time can imagine - if they so wish - that the Singularity is the result of
IBM research, a US government project, a vast planetary trend, or whatever
else they like that's outside of the individual. In other words, you
defer direct contradiction of the "individual helplessness" mindset until
you have the leisure to back up your challenge with technical arguments.

> Also the phrase
> "the new age that is nearly upon us" is a tad too certain of the imminence
> of the Singularity, and its millennial character, for my taste. Although I
> accept that the rapidly accelerating pace of technological change makes
> apocalyptic social change likely, and social prediction beyond a certain
> horizon impossible, this still sounds too millennial to me.

The phrase is indeed too millennial, but not for reasons intrinsic to the
Singularity model. In this case, the phrase "the new age that is nearly
upon us" contains two errors of language. If you visualize the actual
model implied by this phrase, as a Lakoff-and-Johnson visual metaphor, it
implies that there is an object called "the new age" which is approaching
the viewpoint observer as the observer moves forward in time - or rather,
as the observer remains motionless and future events move toward them. The
two semantic errors rooted in
this model, which trigger suggestions of cultishness (what James calls
"millennialism") are:

1) The implication that the "new age" is known to exist with absolute
certainty. This is not correct. Either biological or nanotechnological
warfare would appear to be a sufficient condition to wipe humanity off the
face of the Earth and hence prevent a Singularity, and many lesser
catastrophes would substantially delay it. See Nick Bostrom's
"Existential Risks" paper.

2) The implication that the new age will come about without human
intervention, as the simple result of the future becoming the present.
This implies a passivist view of the world, but that's not the only
problem. The semantics ("the new age that is nearly upon us") imply
that the new age is caused by an external agency - that it is divinely
imposed, predestined fate, and so on. It sounds like there's a preset
calendar date on which the old age becomes the new age. This is also a
signal of cultishness.

The overall contrast I am trying to draw is between a cult/prophetic
model, in which global changes are externally caused, possess set calendar
dates, and are certain; and rational models of the future, in which global
changes are the result of intrinsic causal agencies and have a probability
spectrum distributed over a range of dates. And please note that even
people who can't describe this difference explicitly will still feel the
contrast intuitively.

And of course, the most obvious problem of all is the use of the two words
"new age". Why go there?

-- -- -- -- --
Eliezer S. Yudkowsky http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence


