Let me apologize to everyone if I gave the impression I was trying to propose something official. Nick suggested we take our private conversation public, but I should have thought more before doing so.
I am vaguely aware that there have been discussions on this topic before, to which I have not paid great attention, and I am also aware that there are various people and organizations with a stake in making "official" definitions.
As Natasha mentioned, I'm newly the chief editor of J. of Transhumanism, which aspires to be an academic-quality journal. This inspired me to think a bit more about what exactly "academic-style transhumanism" does or should mean.
As Max mentions, there are various kinds of definitions of terms. I don't have much respect for PR-inspired definitions, intended to convince everyone else that they were really transhumanists all along. I'm much more interested in socially-functional definitions that actually do distinguish people and things frequently called "transhumanist" from everything else out there.
I agree that normative associations are strong in the way people seem to use the term. I am made suspicious by the striking correlation between believing big change is coming and believing that's probably a good thing. So I'm not sure whether normative claims are really the root of what makes us different. But for now I relent and say:
Transhumanism is the idea that new technologies are likely to change the world so much in the next few centuries that we or our descendants will in many ways no longer be "human," and that that's probably not a bad thing.
Again, I'm not proposing this for anything official - just thinking out loud.
Max More writes:
>(2) Is it relevant to specify the time period of "a century or two". If
>someone thinks it will take 300, 400, or 500 years, are they not a
>transhumanist? The one-two century time frame probably does include most
>self-described transhumanists. But if this is to be in the definition it
>needs to be qualified... "change the world so much in the future (perhaps
>within a century or two)..."
I do think that some sort of positive timing claim is needed, since I think if you push most thoughtful people they will admit such changes are quite possible over a time scale of say a billion years, and that such changes are probably not bad. Such people are not transhumanists in the usual usage of the term.
>TRANSHUMANISM: Any philosophy of life that seeks the acceleration of our
>development beyond its current human limitations by means of science,
>technology, creativity, and other rational methods.
>I think there is room for improvement here, but I think that re-defining
>transhumanism to exclude normativity turns it into something else.
I'm concerned that this sort of definition isn't ideal for "academic-style transhumanism." For this purpose, one wants to focus on the aspects of "transhumanism" of most interest to academics, most amenable to academic tools, and avoiding academic hot buttons.
Most academics consider themselves rational people contributing to intellectual progress, and don't consider "philosophies of life" to be terribly academic. Max's definition seems a bit too imprecise for academic tastes, for example, being coy about what "limitations" are to be overcome.
firstname.lastname@example.org  http://hanson.berkeley.edu/
RWJF Health Policy Scholar, Sch. of Public Health  510-643-1884
140 Warren Hall, UC Berkeley, CA 94720-7360  FAX: 510-643-8614