Robin Hanson wrote:
> Eliezer S. Yudkowsky writes:
> >Transhumanism is the idea that future technology will probably improve
> >dramatically on the human body and the human mind. We or our descendants, in
> >five years or five hundred, may transcend so many limitations that we will no
> >longer be recognizably human - or even mortal.
> I have a lot of sympathy for this, as it's pretty similar to my proposal.
> But why just mention the body & mind? Transhumanists seem to talk a lot
> about improving many other things as well.
There are a lot of people talking about stardrives, but that's not intrinsic to "transhuman". What is a human but a body and a mind? It's improvement of the core self, not big flashy technosorcery, that defines trans-*humanity*.
> (And why especially highlight the mortality issue?)
Well, I think of "mortal" as a superset of "human"; the idea was to imply a spectrum that includes bush robots and Jupiter Brains, not just nano-immortal humanoids. In retrospect, the tail-end format, the phrasing, and the nonstandard terminology lead me to think that the last three words should be dropped from the definition. Besides, transcending mortality doesn't seem intrinsically necessary to the definition of "transhumanism"; it's more like "posthumanism". If ya know, ya know; why try to cram it into the definition?
Transhumanism is the idea that future technology will improve dramatically on the human body and the human mind. We or our descendants, in five years or five hundred, may transcend so many limitations that we will no longer be recognizably human.
(This also incorporates Dan Fabulich's point about making the evaluation of probability external to the definition.)
--
firstname.lastname@example.org  Eliezer S. Yudkowsky
http://pobox.com/~sentience/AI_design.temp.html
http://pobox.com/~sentience/sing_analysis.html
Disclaimer: Unless otherwise specified, I'm not telling you everything I think I know.