RE: [POLITICS] Why People Are Irrational about Politics

From: Lee Corbin (lcorbin@tsoft.com)
Date: Tue May 27 2003 - 22:26:02 MDT


    Dan had posted

    > For the full essay, see http://home.sprynet.com/~owl1/irrationality.htm

    One of the problems I have with Michael Huemer's essay---although
    I agree that it is very good and on the whole makes valuable
    points---is where he writes

          In one psychological study, subjects were exposed to evidence
          concerning the deterrent effect of capital punishment. One
          study had concluded that capital punishment has a deterrent
          effect; another had concluded that it does not. All experimental
          subjects were provided with summaries of both studies, and then
          asked to assess which conclusion the evidence they had just
          looked at most supported, overall. The result was that those
          who initially supported capital punishment claimed that the
          evidence they’d been shown, overall, supported that capital
          punishment has a deterrent effect. Those who initially opposed
          capital punishment thought, instead, that this same evidence,
          overall, supported that capital punishment had no deterrent
          effect. In each case, partisans came up with reasons (or
          rationalizations) for why the study whose conclusion they
          agreed with was methodologically superior to the other study.
          This points up one reason why people tend to become polarized
          (sc., to adopt very strong beliefs on a particular side) about
          political issues: we tend to evaluate mixed evidence as
          supporting whichever belief we already incline towards---
          whereupon we increase our degree of belief.

    I agree that everything he has said here is true; I simply don't
    think that evolution erred in making us this way. But first of all,
    a point about PCR (Pan-Critical Rationalism).

    It must be kept in mind that mixed evidence DOES NOT knock down a
    belief! Mixed evidence does not refute anything, and a conjecture
    subjected only to "mixed evidence" survives the criticism, perhaps
    unscathed. It is therefore in the nature of how our learning
    takes place---through conjecture and refutation---that this tendency
    within us arises. And so far as we know (considering the absence
    of other beings who think very differently from us), this is a good
    thing.
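
    As a toy illustration of the polarization Huemer describes (the
    numbers below are invented for illustration, not taken from the
    study), here is a little Python sketch of two readers who see the
    same pair of conflicting studies but discount the study they
    disagree with:

      # Toy sketch of biased assimilation (illustrative numbers only):
      # two readers see the same pro- and anti-deterrence studies, but
      # each discounts the study whose conclusion it dislikes.

      def read_studies(prior_odds, discount):
          # Taken at face value, the pro-deterrence study would
          # multiply the odds of "deterrence works" by 3, and the
          # anti-deterrence study would divide them by 3.
          pro_factor, anti_factor = 3.0, 1.0 / 3.0
          # Shrink the force of the uncongenial study toward 1:
          # discount = 0 means no bias, 1 means ignore it entirely.
          if prior_odds > 1:      # already believes in deterrence
              anti_factor = anti_factor ** (1 - discount)
          else:                   # already doubts deterrence
              pro_factor = pro_factor ** (1 - discount)
          return prior_odds * pro_factor * anti_factor

      for label, prior in [("supporter", 2.0), ("opponent", 0.5)]:
          print(label, "prior odds", prior,
                "unbiased ->", round(read_studies(prior, 0.0), 2),
                "biased ->", round(read_studies(prior, 0.7), 2))

    With no discounting, the mixed evidence cancels out and nobody
    moves; with discounting, both readers end up with more extreme
    odds than they started with---which is just the polarization the
    study observed.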

    He also wrote about the very same subject,

    b. Selective attention and energy:

          Most of us spend more time thinking about arguments supporting
          our beliefs than we spend thinking about arguments supporting
          alternative beliefs. A natural result is that the arguments
          supporting our beliefs have more psychological impact on us,
          and we are less likely to be aware of reasons for doubting our
          beliefs. I think that most of us, when we hear an argument for
          a conclusion we disbelieve, immediately set about finding "what’s
          wrong with the argument". But when we hear an argument for a
          conclusion we believe, we are much more likely to accept the
          argument at face value, thereby further solidifying our belief,
          than to look for things that might be wrong with it.

    Well, for some reason this strikes me as only natural. Suppose
    that you are assessing the significance of a flock of birds in the
    distance; it's a little bit disconfirming of your belief that
    the best hunting ground lies in the other direction. This ought
    to give you a little pause, and perhaps cause a bit of
    re-thinking---but you'll probably just try to fit it in with
    your pre-existing beliefs. Sounds perfectly fine to me. But
    suppose that you see the flock in the direction where you
    already believe the best hunting to lie; as he says, you'll
    experience some reinforcing gratification.
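
    To put rough numbers on that (my own, purely illustrative): if
    you start out fairly sure the good hunting lies west, a single
    ambiguous sighting to the east should move you only a little.

      # Illustrative numbers of my own: prior odds 4:1 that the best
      # hunting lies west, and a distant flock to the east that is
      # only 1.5 times as likely if the hunting is really east.
      prior_odds_west = 4.0
      likelihood_ratio_east = 1.5        # weak, ambiguous evidence

      posterior_odds_west = prior_odds_west / likelihood_ratio_east
      p_west = posterior_odds_west / (1 + posterior_odds_west)
      print(round(p_west, 2))            # ~0.73, down from 0.80

    A little pause and a small revision, not a reversal---which is
    all the distant flock deserves.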

          This is illustrated by the capital punishment study mentioned
          above (section 5, d): subjects scrutinized the study whose
          conclusion they disagreed with closely, seeking methodological
          flaws, but accepted at face value the study with whose conclusion
          they agreed. Almost all studies have some sort of epistemological
          imperfections, so this technique almost always enables one to hold
          the factual beliefs about society that one wants.

    Well, as the skeptics say, extraordinary claims demand extraordinary
    evidence! If someone presents me with what appears to be an airtight
    argument that people are not affected by incentives, and that (say)
    redistribution of income won't affect total production, then I'm going
    to apply much stronger filters to that argument than I would to its
    simple converse. This too is only natural, isn't it?
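
    One way to put that in numbers (mine, not Huemer's): the lower
    your prior for a claim, the stronger the evidence must be before
    the claim deserves the same degree of belief.

      # How strong must the evidence (the Bayes factor) be to lift a
      # claim to a 0.9 posterior, as a function of its prior?
      # Invented target and priors, for illustration only.
      def required_bayes_factor(prior, target_posterior=0.9):
          prior_odds = prior / (1 - prior)
          target_odds = target_posterior / (1 - target_posterior)
          return target_odds / prior_odds

      for prior in (0.5, 0.1, 0.01):
          print("prior", prior, "needs evidence about",
                round(required_bayes_factor(prior)),
                "times more likely under the claim than under its denial")

    A coin-flip prior needs a factor of 9; a one-in-a-hundred prior
    needs nearly 900. That is the "stronger filter" made explicit.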

    Finally, along these lines, a good task for all those who find our
    methods of reasoning wanting---including the very well-documented
    ways in which we err on probability problems (see the studies by
    Kahneman, Slovic, and Tversky)---is to explain how it is that we
    evolved to be predictably less-than-perfect reasoners, even given
    more than ample time for it to have been corrected.
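
    (A stock example of the kind of error they document, with numbers
    of my own choosing: given a 1% base rate and a test that is 95%
    accurate either way, many people's snap estimate of the chance of
    having the condition after a positive result is near 95%, when the
    right answer is closer to 16%.)

      # Base-rate neglect, with illustrative numbers: 1% base rate,
      # 95% sensitivity, 5% false-positive rate.
      base_rate, sensitivity, false_pos = 0.01, 0.95, 0.05

      p_positive = base_rate * sensitivity + (1 - base_rate) * false_pos
      p_cond_given_positive = base_rate * sensitivity / p_positive
      print(round(p_cond_given_positive, 2))   # about 0.16, not 0.95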

    Lee


