Extropian Form Letter (was: an exhortation to action)

Hal Dunn (johngalt@digital.net)
Sun, 01 Dec 1996 17:15:56 -0500

>From: Eliezer Yudkowsky <sentience@pobox.com>
>Date: Sat, 30 Nov 1996 18:16:13 -0600
>If I may make a suggestion? Start a new thread called "Extropian Form
>Letter", devoted to building an outline for an article/letter/pamphlet
>that would introduce the untutored to the Singularity.
>My feeble take on V.1,
>from an article I was working on:
>Title: "History Ends In 2025, If We're Smart Enough."
>Reasons: Introduce "End of History" to grab attention

I've never understood the "End of History" meme. (I know Francis Fukuyama
wrote a book titled *The End of History and the Last Man.*) I think it's
inaccurate, even tho' some will defend it as a metaphor. History will
never end until existence ends. Human history will never end until human
existence ends.

I think revolutionary events, e.g., the singularity, cause significant
"historical shifts" but not the end of history.

(Actually, I don't like the term "singularity," either, but I'll save that
criticism for another post. I like the concept, just not the name for it.)

>1. Computers double in power every two subjective years.

Also, isn't it Moore's Law that says the processing power of the
microchip doubles at least every 18 months? Maybe I'm wrong, but I thought
that wouldn't necessarily equate to computers doubling in power every two
subjective years.
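The arithmetic behind the two doubling rates can be sketched as follows. (The 18-month and 2-year periods are just the figures quoted above, not measurements, and the exponential-growth model is an assumption for illustration.)

```python
# Compare growth under an 18-month doubling period (the popular
# "Moore's Law" figure) with a 2-year doubling period.

def growth_factor(years, doubling_period_years):
    """Factor by which capacity grows after `years` years,
    assuming it doubles once per doubling period."""
    return 2 ** (years / doubling_period_years)

# Over a decade the two rates diverge substantially:
print(round(growth_factor(10, 1.5)))  # 18-month doubling -> 102x
print(round(growth_factor(10, 2.0)))  # 2-year doubling   -> 32x
```

So even before bringing "subjective" time into it, the two rates give very different answers over any interesting span of years.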

Hal Dunn