Re: Why Does Self-Discovery Require a Journey?

From: Robin Hanson (rhanson@gmu.edu)
Date: Fri Jul 11 2003 - 08:52:10 MDT


    On 7/11/2003, Eliezer S. Yudkowsky wrote:
    >>>... your phraseology ... seems to preemptively settle the issue by
    >>>identifying people's built-in emotional reinforcers as their real wants,
    >>>while dismissing their cognitively held hopes and aspirations and
    >>>personal philosophy as a foreign force interfering with their true selves. ...
    >>
    >>If people have contradictory beliefs, how can we say which ones are the
    >>"real" beliefs? By reference to the basic schema of self-deception, in
    >>which the real beliefs tend to determine less visible actions with more
    >>fundamental consequences, and the false beliefs tend to determine what we
    >>tell others and ourselves about ourselves, and the most socially visible
    >>actions with the least fundamental consequences.
    >
    >... If you go around determining real beliefs by the *outcomes* of
    >people's actions, you run the risk of confusing evolutionary motives with
    >cognitive ones, the classic mistake in evolutionary psychology. ...
    >
    >>People may want to produce art to gain social approval, wealth, mates,
    >>etc., but want to be thought of as doing it just for the art. People
    >>may want to advocate positions that make them seem clever and
    >>compassionate, and get them socially accepted by the right folks, but want
    >>to be thought of as wanting only to tell the truth. People may want to
    >>be unfair when serving as a neutral judge, but want to be thought of as fair.
    >
    >Aren't these instances of the classic error? People have emotional
    >hardware and cognitive representations leading them to be devoted to art
    >for its own sake, ... People have emotions leading them to honestly
    >advocate positions that people applaud as clever and compassionate,
    >... People think they're honest and they are, but what they think is the truth
    >is output by biased reasoning hardware ... Over and over, people seize
    >power "for the good of the community". ... What I'm saying is that in this
    >case, the adaptive bias is being applied to the computation p(X|A) rather
    >than U(X). ... It would not even be accurate to say that the people are
    >being deceived about their "real motives"; they are being deceived about
    >which means correspond to which ends. In other words, the rationalization
    >warp looks like this:
    >
    >Evolutionary end, i.e., subgoal of reproduction: Y. (Status, power...)
    >Cognitively held end which is socially acceptable: X. (Good of the tribe.)
    >So evolution is applying a bias to the computation of p(X|A) such that
    >people find A to appear very plausible as a subgoal of X, given that it is
    >*actually* a subgoal of Y. In other words, p(X|A) will be computed as
    >higher than it should be, given that p(Y|A) is *in fact* high. ...
    >But the point is that people *really do want* to help others, to create
    >art, to be compassionate. It's the whole reason why we find the
    >evolutionary puppet strings so horrifying once we become aware of them; ...

    [FYI, there are many papers/books in philosophy and psychology, and fewer
    in evolutionary psychology, on self-deception. And classic literature has
    many things to say about it. (I'm co-hosting a small invitation-only
    interdisciplinary conference on the subject here in October.) This is
    exactly the sort of topic that extropians shouldn't try to reinvent before
    surveying the existing literature. RH]

    There are many possible mechanisms of self-deception, one of which is as
    you describe. The whole system of calculating actions from goals and
    beliefs has many entry points for motivational bias. Some of these points
    are focused more on beliefs, others more on goals. The system also has
    many rich layers of protection from situations that might remove such
    bias. We are much better at spotting self-deception in others than in
    ourselves, because we have ways of avoiding looking at the relevant
    evidence, and of rationalizing it away when others point it out.
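
    To make the belief-side entry point concrete, here is a toy sketch of
    the warp Eliezer describes, in which the bias corrupts the estimate of
    p(X|A) while U(X) stays honest. (The numbers, names, and Python code
    below are my own illustration, not anything from the thread.)

        # Toy agent that picks actions by expected utility over ends.
        # X = "good_of_tribe" (cognitively held end), Y = "status"
        # (evolutionary end); the bias inflates p(X|A), not U(X).

        ACTIONS = ["seize_power", "share_power"]

        TRUE_P = {  # true p(end | action), illustrative numbers only
            ("seize_power", "good_of_tribe"): 0.1,
            ("seize_power", "status"): 0.9,
            ("share_power", "good_of_tribe"): 0.8,
            ("share_power", "status"): 0.2,
        }

        # The agent sincerely values X far above Y; U is honest.
        UTILITY = {"good_of_tribe": 10.0, "status": 1.0}

        def believed_p(action, end):
            """Biased estimate of p(end | action): inflate p(X|A)
            exactly when A in fact serves Y."""
            p = TRUE_P[(action, end)]
            if end == "good_of_tribe" and TRUE_P[(action, "status")] > 0.5:
                p = min(1.0, p + 0.8)  # the rationalization warp
            return p

        def expected_utility(action, prob):
            return sum(prob(action, end) * u for end, u in UTILITY.items())

        biased = max(ACTIONS, key=lambda a: expected_utility(a, believed_p))
        honest = max(ACTIONS, key=lambda a: expected_utility(
            a, lambda act, end: TRUE_P[(act, end)]))

        print("choice under biased p(X|A):", biased)  # seize_power
        print("choice under true p(X|A):  ", honest)  # share_power

    The utility function here honestly favors the good of the tribe, yet
    the action chosen is the one that in fact serves status; that is why it
    matters whether the bias enters at the belief stage or the goal stage.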

    There are two classic ways to determine what people "really" want. One is
    based on "happiness," the other on informed choice. In your example, the
    happiness metric asks if people are happier when they get status/power
    versus when they actually do good for the tribe, without getting such
    status/power. The informed choice metric asks whether people would choose
    status/power or good for the tribe if they were briefly and privately
    informed, via enough evidence that a neutral observer would typically
    find persuasive, that this is actually what they are choosing between. (I say
    briefly so that they can quickly forget the conversation ever happened and
    revert to the state where they actually believe they are doing good for the
    tribe.)
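
    These two metrics can be stated as simple decision procedures; here is
    a minimal sketch with toy data of my own devising, chosen to match the
    claim I make just below.

        # Two operationalizations of what an agent "really" wants.
        HAPPINESS = {"status_power": 7.0, "good_of_tribe": 4.0}  # toy data

        def real_want_by_happiness(happiness):
            """Happiness metric: the outcome the agent is happier having."""
            return max(happiness, key=happiness.get)

        def real_want_by_informed_choice(choose_when_informed):
            """Informed-choice metric: what the agent picks once briefly
            and privately shown what the options really are."""
            return choose_when_informed("status_power", "good_of_tribe")

        print(real_want_by_happiness(HAPPINESS))             # status_power
        print(real_want_by_informed_choice(lambda a, b: a))  # status_power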

    My reading of human behavior, in most of the contexts in which
    self-deception is an issue, is that most people are happier with the
    status/power type option than with the doing-good-for-the-tribe type option,
    and that this is what they usually actually choose when briefly and
    privately informed. I agree that most people do believe that they want to
    do good for the tribe. My claim is that this belief is relatively isolated
    and ineffectual; it is allowed to influence what people say and some
    actions that influence social perceptions, but is otherwise little used.

    Consider that today most people around here would say that they think the
    world is their tribe, but they give almost no money to help poor people in
    Africa, even when they believe that such aid would make the world a better
    place overall.

    If I still haven't convinced you, I suggest we consider the more familiar
    and tractable question I raised before, namely how we can tell what a
    corporation "really" wants:

    >If a corporation polluted a lot, but had a public relations department
    >that insisted that it did not pollute, and that PR department managed to
    >make sure that no pollution was obvious during public tours of corporate
    >facilities, I'd say the corporation wanted to pollute but did not want to
    >be thought of as polluting.

    When would you say that a corporation that consistently continues to
    pollute, even though its PR denies it, "really wants" not to pollute?

    Robin Hanson rhanson@gmu.edu http://hanson.gmu.edu
    Assistant Professor of Economics, George Mason University
    MSN 1D3, Carow Hall, Fairfax VA 22030-4444
    703-993-2326 FAX: 703-993-2323


