Re: Extropian Form Letter (was: an exhortation to action)

Twirlip of Greymist
Sun, 1 Dec 1996 11:35:46 -0800 (PST)

On Dec 1, 9:37am, Eliezer Yudkowsky wrote:

} > } > 1. Computers double in power every two subjective years.
} > } The easiest argument to understand. Attention-catcher.
} > And the easiest to doubt that it has predictive power. And I think
} > Eugene, perhaps among others, has challenged the metric of "power" used.

You're missing the point. The people most useful to target are the most
likely to think about the incoming memes. If you claim Moore's Law as
evidence, and they think that you are silly for extending it like that,
then their doubt and disbelief will probably count against the rest of
your argument. Excuse me, meme package.

Snappy memes for simple minds. Sophisticated memes for sophisticated minds.

I also have a cognitive flaw; I have a low-level bias toward truth and
honesty. Lying is best done for very material and personal gain in a
non-intellectual environment. This is neither.

} Maybe I'd have better luck asking the "Church of Virus" for help.

Good luck. That list could certainly use something else to talk about.

But Kennita Watson's post reminded me of something: even if I believed
in your version of the Singularity, I'm not sure that I would want to
spread it like that. I'd like some chance of surviving the Singularity
in some manner, and the ingoing memes might have an effect. Nazi
scientists trigger Singularity; whee, we're all dead. Paranoid
survivalists trigger it; whee, we're all dead, or at least very tightly
constrained. "Nice" people trigger it; I might get spared, uploaded, or
even transcended myself.

Merry part,
-xx- Damien R. Sullivan X-) <*>

"Life's little ceremonies leap everlasting, humans spring eternal on
hope's breast, and frying pans without fires are often far between."
-- Roger Zelazny, one of the Amber books, Corwin