Re: AI: This is how we do it

From: Zero Powers (zero_powers@hotmail.com)
Date: Tue Feb 19 2002 - 23:48:29 MST


>From: "Eliezer S. Yudkowsky" <sentience@pobox.com>

>Zero Powers wrote:
> >
> > My point in a nutshell: friendliness cannot be imposed on one's superior.
> > Genes tried it, and made a good run of it for quite a while. Increasing our
> > intelligence made our genes ever more successful than the competitors of
> > our species. But, as our genes found out, too much of a good thing is a
> > bad thing. We now pursue gene-imposed subgoals (sex, for instance) while
> > bypassing completely the supergoals (i.e., kids) at our whim.

>It may be reassuring (or not) to realize that the means by which we resist
>evolution is itself evolved, only on a deeper level. We are imperfectly
>deceptive social organisms who compete by arguing about each other's
>motives; that is, we are political organisms. We have adaptations for
>arguing about morality; that is, in addition to our built-in evolutionary
>morality, we also have dynamics for choosing new moralities. In the
>ancestral environment this was, I suspect, a relatively small effect,
>amounting to a choice of rationalizations. However, any seed AI theorist
>can tell you that what matters in the long run isn't how a system starts
>out, it's how the system changes.
>
>So of course, our dynamics for choosing new moralities are starting to
>dominate over our evolutionary moralities, due to a change in cognitive
>conditions: Specifically, due to: (1) an unancestrally large cultural
>knowledge base (stored-up arguments about morality); (2) an unancestrally
>good reflective model of our own mentality (timebinding historical records
>of personalities, and (more recently) evolutionary psychology); (3) an
>unancestral technological ability to decouple cognitive supergoals that are
>evolutionary subgoals from their evolutionary rationales (i.e.
>contraception).
>
>(3) in particular is interesting because the way in which it came about is
>that evolution "instructed" us to do certain things without "telling us
>why". We won against evolution because evolution failed to treat us as
>equals and take us into its confidence (he said, anthropomorphizing the
>blind actions of an unintelligent process with no predictive foresight).

Your posts are conceptually dense (it may not sound like it, but that *is* a
compliment). Kind of like the Bible, you see something new almost every time
you read it. I'm still digesting (re-reading) some of your posts. Don't
know if I'll respond, but I do want to see what you're seeing. Enough
preamble. To my question:

Suppose evolution had somehow come clean with us and said:

"Hey humans, here's the deal. You are genetically programmed to maximize
the probability of the survival of your geneset. So all those things you
think you like (sex, food, clothing, and close relatives) and all those
things you think you hate (pain, hunger, and loneliness) are merely
subgoals hardwired into your essential make-up simply to attain the genetic
supergoal (maximizing the probability of the survival of your geneset--or in
other parlance being "friendly" to your genes)."

Do you think that nature's honesty would have persuaded us to continue being
"friendly" to our genes? Or do you think we would still be saying "Screw
the survival of my geneset, I'm in this for the maximum enjoyment of my own
particular phenotype"?

-Zero

"I'm a seeker too. But my dreams aren't like yours. I can't help thinking
that somewhere in the universe there has to be something better than man.
Has to be." -- George Taylor _Planet of the Apes_ (1968)



