Re: Why believe the truth?

From: Robin Hanson (rhanson@gmu.edu)
Date: Mon Jun 16 2003 - 19:40:19 MDT


    On 6/16/2003, Eliezer S. Yudkowsky wrote:
    >>... Yes, truth is instrumentally useful in various ways, but unless you
    >>assign it overwhelming primary importance, it is easy to find situations
    >>where you should be willing to give up the instrumental advantages of
    >>truth to obtain other things. Such as in marriage, promoting group
    >>loyalty, and much else. Can't you imagine any such situations?
    >
    >... Calling the truth "instrumentally useful" vastly understates the
    >point. ... In any given situation where it *seems* that there is some
    >"good" reason for believing a lie, it is more likely that the reason
    >itself is wrong. The truth is valuable for reasons which are not always
    >immediately obvious. You and I both know that a less-informed or
    >less-intelligent truthseeker would *in fact* be far safer trying simply to
    >find the truth, always, rather than trying to recalculate in any given
    >instance whether knowing the truth is the best strategy. Despite all
    >conceptually possible thought experiments, in practice, in real life, the
    >truth is so important that the net expected utility of asking the question
    >"Should I believe what is true?" is negative. You are more likely to make
    >a mistake than you are to run across (and successfully identify!) a
    >situation where wholeheartedly pursuing the truth is somehow not the best
    >strategy.
    >...
    >It's one thing to admit the philosophical possibility that there are
    >hypothetical scenarios where the "right" thing to do is believe falsely,
    >just as it is possible to construct thought experiments where the right
    >thing to do is commit suicide or overwrite your own goal
    >system. .... The rule "Just Be Rational" is more reliable than you
    >are. Sticking to the truth is a very simple procedure, and it is more
    >reliable than any amount of complex verbal philosophization that tries to
    >calculate the utility of truth in some specific instance.

    It would be nice if what you said were true, but alas I think it is
    not. We actually usually follow the rule "Just be Natural". Evolution has
    equipped us with vast and complex habits of thought, which people mostly
    just use without understanding them. This isn't a particularly dangerous
    strategy for the usual evolutionary reasons: your ancestors did pretty well
    following these habits, so you probably will too.

    If you are thoughtful and try to evaluate these inherited habits of
    thought, you will find that some of them seem to be biased, systematically
    leading away from truth relative to some other habits you can imagine. But
    if you think more about these biases you can see that they are plausibly in
    your self interest; you can see why evolution might have given you such
    habits of thought. At this point you must choose between the safe route of
    just continuing with the standard biased inherited habits of thought, or
    taking the chance of using less-tried habits that seem to you to be less
    biased, but which would probably have harmed your ancestors, and may well
    harm you as well.

    For example, we have the habit of over-estimating our popularity and
    abilities, thinking ourselves to be better lovers and drivers than we
    are. We know that the best salespeople honestly believe in their product,
    so we allow ourselves to believe in whatever product we are supposed to
    sell. When we "fall in love" we believe that we have found a very good
    match that will last a long time, consistently over-estimating such
    things. We find it easy to believe that our in-group is more able and
    morally justified than out-groups. We consistently have persistent
    disagreements with others, believing ourselves to be right more often in
    such disagreements, when in fact we are only right as often as others.

    We do all these things without thinking about them. They are
    natural. They are safe strategies. Once you realize that they are biased
    habits of thought, you can just say "isn't that interesting," then forget
    about it and go on with your life as usual. Or you can decide that truth
    is really important to you, and take the chance of violating these standard
    habits, substituting less tried social strategies which may well have great
    personal costs.

    You might tell your girlfriend that she is average among the girlfriends
    you have had, and that you think you are likely to stay with her for the
    average time. You might admit you are an average lover and driver, that
    the products you sell are average, and that your in-group is no more
    morally justified than any other group. You might rarely disagree, and
    accept that others are right as often as you. You might admit that you
    care almost nothing about poor people in Africa.

    When you do so, the likely consequences are that people will consider you
    less able, less caring, and less loyal than they otherwise would. They
    may choose to associate with you less because of this. Now maybe they will
    also realize that you seem to be especially truth-oriented, and want to
    keep you around because you can provide instrumental benefits of
    truth. But they may not care about the truth nearly as much as they care
    about loyalty, and your ability to accomplish most of your other goals in
    life will be thereby diminished. That may be acceptable if you put a high
    enough value on truth, but otherwise it may not be.

    Robin Hanson rhanson@gmu.edu http://hanson.gmu.edu
    Assistant Professor of Economics, George Mason University
    MSN 1D3, Carow Hall, Fairfax VA 22030-4444
    703-993-2326 FAX: 703-993-2323
