Re: Defining Transhumanism

Max More (maxmore@globalpac.com)
Mon, 19 Oct 1998 17:53:31 -0700

At 10:19 AM 10/19/98 -0700, Robin Hanson wrote:
>Nick Bostrom suggested that he & I take public our private
>conversation about definitions of transhumanism. I proposed:
>
> Transhumanism is the idea that new technologies are likely
> to change the world so much in the next century or two that
> our descendants will in many ways no longer be "human."

(1) Like Phil Goetz, I think it's a mistake to include "descendants" here. Just about all the transhumanists I know think there's a reasonable chance that they, and not just their descendants, will no longer be human in the future. Your definition will, I believe, lead people to think that we see these changes as too far ahead for ourselves. While some of us may believe this, and others are unsure, many obviously do not.

(2) Is it relevant to specify the time period of "a century or two"? If someone thinks it will take 300, 400, or 500 years, are they not a transhumanist? The one-to-two-century time frame probably does include most self-described transhumanists, but if it is to be in the definition it needs to be qualified: "change the world so much in the future (perhaps within a century or two)..."

(3) Yes, I do think that a definition should focus on normative issues. Transhumanism is and always has been a normative affair. The positive (in the sense you mean) belief in this coming change is a matter of futures studies rather than of transhumanism itself. "Transhumanism" naturally grows out of humanism, and humanism clearly was and is a normative view as well as a positive one. Humanism without its values (as stated in Corliss Lamont's The Philosophy of Humanism, in Free Inquiry, or in the works of Paul Kurtz) would not be humanism. Humanists not only believe progress to be possible; they seek certain goals and values. Transhumanism follows suit, except that it has a more radical notion of progress.

When I first used the term "transhumanism", deriving it from the existing term "transhuman", my definition (1990) was normative. Earlier this year we spent some time examining and improving that definition. (Generally it's appropriate to refer to past work on the same topic.) Here it is:

TRANSHUMANISM: Any philosophy of life that seeks the acceleration of our development beyond its current human limitations by means of science, technology, creativity, and other rational methods.

I think there is room for improvement here, but redefining transhumanism to exclude normativity turns it into something else.

Definitions generally are either lexical/reportive or stipulative (though several other categories exist). A lexical definition reports how a term is typically used. A stipulative definition gives a term a new meaning for a particular purpose or to remove vagueness (the latter is a "precising definition"). I contend that normative views are a standard part of the use of the term "transhumanism". Giving a new stipulative definition like this might be handy in some contexts, but presenting it as the main definition would be misleading and inconsistent with the term's roots and history.

>This definition focuses on positive, not normative, beliefs.
>To those who think that a definition should focus on normative
>beliefs I ask: Why do there seem to be so few people who share
>our positive beliefs but not our normative beliefs? That is,
>where are the people who believe that big technology-induced
>change is coming, and think that is a terrible thing?

In a previous post I noted that there are such people. Overall, though, I agree that those who expect transformative change tend to favor it. But why is this a reason for making the definition non-normative? The normative aspect of a definition implies factual expectations -- we wouldn't bother to value and *seek* becoming posthuman if we thought it impossible. My definition, or something like it, includes the positive views while stating the normative concerns that have always characterized transhumanism. Why cut out the heart of this worldview?

Onward!

Max



Max More, Ph.D.
more@extropy.org (soon also: <max@maxmore.com>)

http://www.maxmore.com
Consulting services on the impact of advanced technologies
President, Extropy Institute:
exi-info@extropy.org, http://www.extropy.org