From: Damien R. Sullivan (phoenix@ugcs.caltech.edu)
Date: Wed Feb 20 2002 - 01:16:11 MST
On Tue, Feb 19, 2002 at 10:48:29PM -0800, Zero Powers wrote:
> >From: "Eliezer S. Yudkowsky" <sentience@pobox.com>
> >(3) in particular is interesting because the way in which it came about is
> >that evolution "instructed" us to do certain things without "telling us
> >why". We won against evolution because evolution failed to treat us
> >as equals and take us into its confidence (he said,
> >anthropomorphizing the blind actions of an unintelligent process with
> >no predictive foresight).
I think a simple concrete example illustrates this: nature didn't
actually design us to "want children". Well, it may have done so more
for women, but that's still open to debate. Some people of
both sexes want children, but not all, and it's hard to separate out
cultural effects. What we do know is that we're hardwired to want to
have sex, and from anecdotal reports I think a wired response of
affection upon seeing a newborn baby is also supportable. But because
of the two-step process we can use birth control in the early "no baby
in sight, don't want children much" stage and not have children. If
we'd been properly designed to Want Children, that wouldn't work. But
we Want Sex instead.
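To make that two-step structure concrete, here's a toy sketch in Python
(the function names and numbers are my own illustrative assumptions, not
anything evolution literally computes). The wired reward tracks the act,
not the outcome, so cutting the act-to-outcome link leaves the reward
untouched:

    # Toy model: a hardwired proxy drive vs. the evolutionary "supergoal".
    # Names and numbers here are illustrative assumptions.

    def offspring(sex_acts, contraception_rate):
        """Children produced: the act only counts when not blocked."""
        return sex_acts * (1 - contraception_rate)

    def drive_reward(sex_acts):
        """What we're actually wired to maximize: the act itself."""
        return sex_acts

    # Ancestral environment: proxy and supergoal coincide.
    print(drive_reward(10), offspring(10, contraception_rate=0.0))  # 10 10.0

    # With birth control the link is cut: the drive is fully satisfied
    # while the supergoal gets nothing. A direct Want Children wiring
    # would notice the difference; a Want Sex wiring doesn't.
    print(drive_reward(10), offspring(10, contraception_rate=1.0))  # 10 0.0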
> "Hey humans, here's the deal. You are genetically programmed to maximize
> the probability of the survival of your geneset. So all those things you
> think you like (like sex, food, clothing, and close relatives) and all those
> things you think you hate (like pain, hunger and loneliness) are merely
> subgoals hardwired into your essential make-up simply to attain the genetic
> supergoal (maximizing the probability of the survival of your geneset--or in
> other parlance being "friendly" to your genes)."
>
> Do you think that nature's honesty would have persuaded us to continue being
> "friendly" to our genes? Or do you think we would still be saying "Screw
> the survival of my geneset, I'm in this for the maximum enjoyment of my own
> particular phenotype"?
Graydon of rec.arts.sf.written/fandom has an idea about embedded
creationist memes in our thinking and language. We can call ourselves
evolutionists or atheists, and be evolutionists and atheists, but it's
hard to make all of our ideas match up. It's like being a round peg in
a round hole, then becoming a square peg. You're still in a round
hole... "natural rights" is an easy example; the idea comes from
Christian and Deist thinkers and lacks any equivalent support in hard
materialist terms, but is still extremely pervasive.
And more obscurely, I think there's a similar 'creationist' fog around
the concept of intelligence: an instinct that anything recognizably
intelligent, able to think and talk, must have a soul similar to ours
in other respects. Natural enough in a way, since extrapolating from
our own human nature is the first shortcut to understanding other
humans, but still wrong. Intelligence is a tool. Self-awareness is a
tool. They are not package deals, bringing with them self-respect,
self-centeredness, a John Galtian yearning to be free and independently
creative. Just because a being is aware of itself doesn't mean it cares
about itself, any more than being aware of sparrows means you care about
sparrows. That concern is a separate mechanism, which would not be the
top priority of an AI you don't want to go rogue.
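To put the "separate mechanism" point in concrete terms, here's a
minimal sketch (the class and field names are a hypothetical
illustration of mine, not a real AI design): an agent can carry a
working model of itself while its goal function never references that
self at all.

    # Sketch: self-awareness and self-concern as independent components.
    # Names are illustrative assumptions, not a proposed architecture.

    class Agent:
        def __init__(self, goal):
            self.goal = goal          # what the agent actually values
            self.self_model = {"name": "agent-1", "intact": True}

        def report_self(self):
            # Self-awareness: it can represent and report on itself...
            return "I am %s" % self.self_model["name"]

        def evaluate(self, world):
            # ...but it only prefers what `goal` scores. Nothing here
            # rewards self-preservation unless goal was built to include it.
            return self.goal(world)

    # An agent aware of sparrows, and of itself, that only values sparrows:
    watcher = Agent(goal=lambda world: world.get("sparrows", 0))
    print(watcher.report_self())                                  # self-aware
    print(watcher.evaluate({"sparrows": 3, "agent_intact": False}))  # still 3

Self-respect, independence, and the rest would each be another such
component, present only if wired in.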
We do not have some transcendent concern for the enjoyment of our
phenotype. Our various enjoyments are effects, direct or indirect, of
the mechanisms given us by nature to propagate our genes. Given how
quickly we've changed our own conditions, not all of the mechanisms work
fully now. If nature had been more explicit in wiring us to care about
our geneset there wouldn't be any conflict between its honesty and our
concern for our phenotypic enjoyments; the geneset would be our top
priority, period.
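Extending the earlier toy sketch (again, purely illustrative numbers of
mine): an agent scored directly on its geneset picks differently from
one scored on drives and phenotypic costs, which is exactly the
conflict described above.

    # Toy counterfactual: direct geneset wiring vs. our actual proxy wiring.
    # Options, fields, and numbers are illustrative assumptions.

    options = [
        {"name": "no birth control", "drive": 10, "offspring": 3, "child_costs": 5},
        {"name": "birth control",    "drive": 10, "offspring": 0, "child_costs": 0},
    ]

    # Wired directly to the supergoal: geneset is top priority, period.
    geneset_agent = max(options, key=lambda o: o["offspring"])

    # Wired to drives and phenotypic enjoyment: children are just a cost.
    phenotype_agent = max(options, key=lambda o: o["drive"] - o["child_costs"])

    print(geneset_agent["name"])    # no birth control
    print(phenotype_agent["name"])  # birth control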
Most _humans_ aren't John Galt. Quite a few prefer to serve others, if
their dignity is respected: loyal servants, self-sacrificing wives or
husbands, religious devotees. The range of AI motivations is not going
to be less than that of human motivations.
I think a lot of -- most? -- people who'll explicitly state they believe
they're a biological machine have not really implicitly incorporated that.
The implicit beliefs and reflexes reflect an image of an AI, or of human
nature, with its own inherent desires, independent of the human or
evolutionary process which designed it. The proper image should be of a
machine which could have been built to do, or at least aim for,
anything. Humans, of course, were not designed with much foresight or
coherent vision.
-xx- Damien X-)