Eliezer S. Yudkowsky wrote:
>Transhumanism is the idea that future technology will probably improve
>dramatically on the human body and the human mind. We or our descendants, in
>five years or five hundred, may transcend so many limitations that we will no
>longer be recognizably human - or even mortal.
Hmmm. Why not lose the "probably" in the definition, and then allow each person to qualify it for themselves? If transhumanism "is the idea that future technology *will* improve dramatically ... etc.," then it allows me to say "I think transhumanism is probably true," or "I'm quite certain transhumanism is true," or "I think there's a remote but nonetheless significant chance that transhumanism is true," or even "transhumanism probably isn't true," etc.
Just my $0.02.
-Dan
"Decay is inherent in all compounded things. Strive unceasingly."