From: Robin Hanson (rhanson@gmu.edu)
Date: Tue Jul 08 2003 - 05:19:38 MDT
At 10:02 PM 7/7/2003 -0400, Eliezer S. Yudkowsky wrote:
>>There is a key difference between wanting to want and wanting to believe
>>that you want. I hypothesize that people want, and want to want, ignoble
>>things, but that they want others to believe that they want, and want to
>>want, noble things. Therefore they do not want to believe that they
>>want, and want to want, ignoble things.
>
>But... why? Why postulate this? What is the motivation for this dark and
>gloomy view of the world? And especially, how can you walk up to people
>who say "I want to be a better person" and say "No you don't"? I'm not
>asking about the gloominess, mind you, I'm asking what definition of
>volition you're using. How can you determine what people want to want
>except by asking them? If I believe that I want to be nicer, and I
>believe that I want to want to be nicer, so that to the question "Do you
>want to be a nicer person?" I answer "Yes", then to where are you jumping,
>outside that system, to determine that the real answer is no? ...
>If someone has the cognitive representation that they want to be nicer,
>and also contains emotional hardware supplying positive reinforcement to
>darker actions, it is not at all clear to me (to put it mildly) that the
>emotional hardware should take precedence over the cognitive
>representation in defining the person's volition.
>... people might have built-in tendencies and/or pleasurable reinforcement
>on ignoble behaviors, and inaccurate cognitive representations for those
>tendencies, and positive reinforcement for those inaccurate
>representations (i.e., they Hanson-"want" to disbelieve, although not
>necessarily Eliezer-"want" as I define volition).
>I am objecting to your phraseology here because it seems to preemptively
>settle the issue by identifying people's built-in emotional reinforcers as
>their real wants, while dismissing their cognitively held hopes and
>aspirations and personal philosophy as a foreign force interfering with
>their true selves. One could just as easily view the system from the
>opposite perspective.
I do agree that this is a subtle question, whose answer is not immediately
obvious. The topic of self-deception can be a conceptual morass, since our
usual conceptual anchors are less available there. Nevertheless, I do want
to argue for
the claim you find questionable.
If people have contradictory beliefs, how can we say which ones are the
"real" beliefs? By reference to the basic schema of self-deception, in
which the real beliefs tend to determine the less visible actions with the
more fundamental consequences, while the false beliefs tend to determine
what we tell others and ourselves about ourselves, along with the most
socially visible actions with the least fundamental consequences.
If you smoked often in private, but told yourself and others that you did
not smoke, and did not smoke in public, I'd say you really wanted to smoke
but didn't want others to believe that you smoked. If a corporation
polluted a lot, but had a public relations department that insisted that it
did not pollute, and that PR department managed to make sure that no
pollution was obvious during public tours of corporate facilities, I'd say
the corporation wanted to pollute but did not want to be thought of as
polluting.
If we make the hidden smoker's actions visible enough, he might stop
smoking. And if we give enough publicity to the corporation's polluting,
it might stop polluting. But is this because the smoker did not want to
smoke, and the corporation did not want to pollute? I'd say it is because,
even more than they wanted to smoke or to pollute, they wanted not to be
believed to do so.
Robin Hanson rhanson@gmu.edu http://hanson.gmu.edu
Assistant Professor of Economics, George Mason University
MSN 1D3, Carow Hall, Fairfax VA 22030-4444
703-993-2326 FAX: 703-993-2323
This archive was generated by hypermail 2.1.5 : Tue Jul 08 2003 - 05:30:09 MDT