Faced with a question such as "Can the mind override genetic tendencies?" or "Does a Specialist count as a transhuman?", I find it useful to divide "the stuff that people are made of" into three layers:
Panhuman
Gaussian
Personal
The panhuman layer includes all complex functional adaptations - called "panhuman" because you can't get a complex adaptation without its evolving from a simpler adaptation, which requires that the simpler adaptation be reliably present in the population over evolutionary time. For the visual cortex to evolve, for example, there had to be a retina. Thus every complex adaptation I've heard of is panhuman.
The Gaussian layer refers to things like levels of particular cognitive abilities, height, weight, and other characteristics that - in "gaussian" or unaltered humans - tend to be distributed in a gaussian curve. If the defining quality of the panhuman layer is genetically stored complexity, the defining quality of the gaussian layer is quantitative variation. (Terminology confusion warning: Altered humans can have "trans-gaussian" levels of a "gaussian-layer" characteristic.)
The Personal layer refers to constructs which can be composed in complex and extremely variant ways. I would say "arbitrary", except that the contents of the Personal layer do have to be constructed of human components. We might compare the Personal layer to shapes made of Legos; they have to be made of Legos, but they can be put together in arbitrary ways. Personal philosophies, for example, are generally personal, as are most beliefs and convictions except those produced directly by instinct and emotion. The defining quality of the Personal layer is complex structures with variance on the component level - that is, while components and the set of interactions may be predefined, the total structure of interactions is not.
So in the discussion below, the question is: Can the Personal layer override the genetic variance in the Gaussian layer? And the answer is: "Yes, usually, for mental characteristics, although there are limits to how much farther along the curve you can push them."
Similarly, to the question of "Is Eliezer Yudkowsky a transhuman?", we can answer:
Panhuman layer: Definitely not.
Gaussian layer: Slightly (some abilities a bit off the curve).
What does it mean to be transhuman on a Personal layer? Not a lot. But if I had to define something, one effect of being a Personal transhuman would be possessing personal structures which can be translated into nonhuman minds; for example, an AI, or an alien. The actual definition would be having Personal structures which converge to the truth (which is presumably the same for all minds) rather than being dependent on initial conditions (in those cases where a "truth" exists, of course). Obviously you don't have to have even slightly transhuman complexity or abilities to have a transhuman personality, so this is something of an abuse of the term. But if we did ask the question on those terms:
Personal layer: Quite a bit (50% human).
Matt Gingell wrote:
>
> From: Robert J. Bradbury <bradbury@www.aeiveos.com>
>
> >On Fri, 3 Sep 1999, Lee Daniel Crocker wrote:
> >
> >> Individuals can always rise above the predispositions of their
> >> genetic heritage,
> >
> >Absolutely false. I will never be a basketball player and am
> >extremely unlikely to be a football player. Anyone over the
> >age of 30, who did not start a sports "career" as a teenager,
> >is unlikely to be involved in any professional sport due to
> >the genetic heritage known as "aging". People are to a large
> >degree (we will say ~50%) their genes, and there are many
> >situations where even a superhuman effort will not allow
> >them to "rise above" that. In thinking about this, I believe
> >that in *most* situations, there exist individuals who would
> >never be able to master the situation because of their
> >genetic heritage. Genetic equality is a fantasy.
> The mind is a
> great deal more flexible than the body. The breakdown of environmental and
> genetic influence isn’t nearly so clear.
--
sentience@pobox.com          Eliezer S. Yudkowsky
http://pobox.com/~sentience/tmol-faq/meaningoflife.html
Running on BeOS           Typing in Dvorak          Programming with Patterns
Voting for Libertarians   Heading for Singularity   There Is A Better Way