John, Prince von Rittergeist wrote:
>
> Please forgive a novice's question, but could someone give me a clear
> distinction between Extropianism and Transhumanism? From what I've been able
> to glean from the web, Extropianism seems to be a subset of Transhumanism,
> but I would appreciate a clear explanation of what the differences are
> between the two.
>
> Thanks in advance.
I would define Extropianism as that philosophy which has technological advance at its center. Extropians believe in high technology, power over the material and informational world, and particularly technological enhancement of the self. As I said once before, what defines an Extropian technology is hubris - challenging the gods, tearing down the foundations of the world, casually discussing technologies that send the uninitiated into shock and technophobes screaming into the night. Uploading. Nanotechnology. Superintelligence. Technologies that warp reality around them, that change human nature or life as we know it. Any Extropian generally covets at least one specific ultratechnology.
Transhumanists want to change themselves, to surpass their own limits - through the use of high technology or other rational means. Any transhumanist generally chafes against at least one specific limit.
Most transhumanists are also Extropians, and vice versa, for obvious reasons. If, however, an Extropian wished nanotechnology to explore the galaxy but not to live longer or improve vis own mind, ve would be an Extropian but not a transhumanist. If someone wished to double vis IQ but didn't really care how, that person would be a transhumanist but not an Extropian.
I, incidentally, am a fanatic Singularitarian. I want something smarter and more powerful and more ethical than I am to exist; I don't care how, or what happens to the human race or myself afterwards. When I say that I "don't care how", it doesn't mean that I'm an impractical fool - I write Web pages about "how" - but rather that if someone handed me a genie bottle, I would find it no less soul-satisfying to wish for a Singularity than to do it all by hand. My purpose is the result, not the method, although like any practical being I spend more time thinking about methods.
--
sentience@pobox.com         Eliezer S. Yudkowsky
http://pobox.com/~sentience/AI_design.temp.html
http://pobox.com/~sentience/sing_analysis.html
Disclaimer: Unless otherwise specified, I'm not telling you everything I think I know.