Re: Singularity, Breaker of Dreams

den Otter
Mon, 7 Sep 1998 14:19:22 +0200

> From: Eliezer S. Yudkowsky <>

> Now I am still sure that the Powers will be ethical, but I am no
> longer sure that this precludes taking us apart for spare atoms. I no
> longer think that our continued survival has to threaten the Powers
> for us to be erased; I am now willing to accept that simple efficiency
> may require it. I am willing to accept that life may be meaningless.
> I am willing to accept that the only reward for all my service will be
> a painful death, for myself, for those I love, and for the entire
> human race. Only when one can accept all possibilities is one ready
> to choose between them.
> The power of the Singularitarian philosophy is that it draws on
> concepts with more force than our own desires. Over time, over years,
> it corrodes away our rationalizations. And above all, it presents an
> emotionally and rationally acceptable course of action, even after all
> the darkest alternatives are accepted. It really doesn't matter what
> the relative probabilities are. Either life has meaning, or it
> doesn't; either human life has meaning, or it doesn't; either we'll be
> upgraded, or we'll die. So we'll die? All the other generations have
> died. It's not some major tragedy because it happens to me instead of
> somebody else. So humanity will die? In a billion years, it is
> certain that humanity will die. Is it so horrible if humanity dies
> giving birth to something greater, giving meaning to all our dead
> ancestors? Sooner or later, some generation will face the choice
> between Singularity and extinction. Why push it off, even if we
> could? And besides, we might not die at all.

But we can't rely on that, can we?

> Because my best-guess dedication to the Singularity is fairly
> unaffected by the above probabilities varying between 0% and 90%, I
> can calmly and without worry evaluate arguments for and against. I
> can accept that every possibility might be real. The Singularity,
> through it all, is the only sane way to go. In accepting every
> possibility, I can also accept the dictates of my own philosophy; I
> don't need to distort it to avoid "unacceptable" outcomes.
> We must each lose our dreams in order to grow, but not in despair.  We
> must abandon the small dreams of childhood, but without abandoning the
> ability to dream.  For there are two ways in which a dream may be
> broken; by the death of hope, or by a greater dream.  To acknowledge
> that we do not command the future, is not to say that we do not make
> it; and when we tear our eyes away from our yearnings, we may look
> upward to the sun, forward to tomorrow.

This isn't completely true. We *can* command the future if we win the dash for SI and finish in first place. If you lose yourself in defeatism, you're already as good as dead. The facts are simple:

  1. At this time, only a handful of people really grasp the enormity of the coming changes, and (almost certainly) most of them are transhumanists (unfortunately, even in this select group many can't or won't understand the consequences of a Singularity, but that aside).
  2. This gives us a *huge* edge, further increased by the high concentration of scientific/technological talent in the >H community.
  3. If we start preparing now, by keeping a close eye on the development of technologies that could play a major part in the Singularity (nanotech, AI, implants, intelligence augmentation, human-machine interfaces in general, etc.) and by acquiring wealth (by any effective means) to set up an SI research facility, then we have a real chance of success.

Note: the goal should (obviously) be to create SI by "uplifting" (gradually uploading and augmenting) humans, *not* by building it from AIs.

> Forward to the day when humanity awakens from its dream, to the
> ultimate shattering of Maya.  To the day when the greatest hacker of
> them all compiles the very last line of code, and looks out at an
> early morning sky, knowing that he looks on the Final Dawn.  To the
> day when it is said, in the tradition of Oppenheimer who looked upon
> another sun:  "I am become Singularity, Breaker of Dreams."

"Now I've become death, the destroyer of worlds."- The First SI?