Extropian Form Letter (was: an exhortation to action)

Eliezer Yudkowsky (sentience@pobox.com)
Sat, 30 Nov 1996 18:16:13 -0600

> What we need to do is to
> create ignition, heating up the idea by spreading it until it
> burns the fuel rapidly into a gigantic wildfire out of control. In this way
> it is also a lot like creating the first controlled fusion reaction, where
> the ignition is the difficult part because the variables need to be *perfect*
> and continuing the reaction is the easy part. All you need is a spark to
> ignite a raging brushfire. Let's ignite that fire
> by selecting the people who would be most effective at spreading the memes,
> contacting them, and adding them to the collective of the group. (No BORG jokes)

If I may make a suggestion? Start a new thread called "Extropian Form
Letter", devoted to building an outline for an article/letter/pamphlet
that would introduce the untutored to the Singularity. I believe that
the Singularity meme would spread faster than the Extropian culture
because it's smaller, slimmer, less vulnerable, easier to remember, and
easier to grasp; plus, it's all you need to know to start looking into
the future.

Remember, this is an outline for an *introduction*, not an exhaustive
treatment.

I'm requesting an outline on the theory that any individual Extropian
can implement it as an article or whatever, thus allowing it to be
published in multiple locations with multiple spins. Also, I can't do
outlines by myself, for cognitive reasons; all I can do is craft a lot of
paragraphs and hope they're in the right order. My feeble take on V.1,
from an article I was working on:

Title: "History Ends In 2025, If We're Smart Enough."
Reasons: Introduce "End of History" to grab attention, "2025" to
emphasize immediate importance, "If We're Smart Enough" to dissociate
from tired doomsday memes (Greenhouse Effect) and religious/apocalyptic
memes that would only appeal to groups of a particular faith.

1. Computers double in power every two subjective years.
2. Recursive intelligence amplification.
3. Defines "Singularity."
4. Is this a good thing? Names factions.
5. Doomsday faction.
6. Uploading faction.
7. Intro to nanotechnology.
8. Nearness of nanotechnology.
9. Nanotechnology replaces economy.
10. Gray goo problem.
11. Intelligence amplification.
12. Runaway positive feedback of IA.
13. Replacement of human society: End of History.
14. Singularity provides Interim Meaning of Life.
15. Summary: History is about to end.

Let us discuss the following issues:
1. What material needs to be included in an introduction?
2. In what order should that material be presented?
3. How should that material be conveyed? (Examples, digressions into
specific paths, theoretical explanations...)
4. What objections *must* be countered?
5. What is being left out of this list?

         sentience@pobox.com      Eliezer S. Yudkowsky
Disclaimer:  Unless otherwise specified, I'm not telling you
everything I know.