Re: Why believe the truth?

From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Mon Jun 16 2003 - 16:44:30 MDT


    Robin Hanson wrote:
    >
    > I edited out more of the discussion above because it all comes down to
    > one simple point. Yes, truth is instrumentally useful in various ways,
    > but unless you assign it overwhelming primary importance, it is easy to
    > find situations where you should be willing to give up the instrumental
    > advantages of truth to obtain other things. Such as in marriage,
    > promoting group loyalty, and much else. Can't you imagine any such
    > situations?

    Even if you assign truth overwhelming primary importance, an altruist can
as easily be confronted with the dilemma, e.g., "If you don't believe that
    2 + 2 = 5 in the next ten seconds, I'm going to implant that false belief
    into ten other people. Is your knowledge of the truth more valuable than
    theirs?" Fortunately, my brain is wired in such a way that I would fail
    to believe this even if I tried.

    Calling the truth "instrumentally useful" vastly understates the point.
    The truth is an "ethical heuristic", in the CFAI sense. In any given
    situation where it *seems* that there is some "good" reason for believing
    a lie, it is more likely that the reason itself is wrong. The truth is
    valuable for reasons which are not always immediately obvious. You and I
    both know that a less-informed or less-intelligent truthseeker would *in
    fact* be far safer trying simply to find the truth, always, rather than
    trying to recalculate in any given instance whether knowing the truth is
    the best strategy. Despite all conceptually possible thought experiments,
    in practice, in real life, the truth is so important that the net expected
    utility of asking the question "Should I believe what is true?" is
    negative. You are more likely to make a mistake than you are to run
    across (and successfully identify!) a situation where wholeheartedly
    pursuing the truth is somehow not the best strategy.

    Reality is all one piece, and the explanation of reality is all one piece.
    To defend a lie requires embracing other lies, and you cannot know the
    cost of those lies in advance, because you have destroyed the very means
    by which you would accurately calculate the true cost. To abandon the
    truth is far more dangerous than it looks, for reasons which are not
    immediately obvious, and which apply in a way that approximates full
    generality. That is, to calculate the supposed utility of self-deception
    in any specific instance is extremely demanding of both computing power
    and knowledge, has a very high prior probability of yielding a negative
    answer, has a
    has a higher chance of yielding false positives than false negatives, has
    a higher penalty for false positives than false negatives, and requires
    already knowing the truth about the matter for which one is attempting to
    calculate the utility of self-deception.

    It's one thing to admit the philosophical possibility that there are
    hypothetical scenarios where the "right" thing to do is believe falsely,
    just as it is possible to construct thought experiments where the right
    thing to do is commit suicide or overwrite your own goal system. I would
    not, however, advise people to try to calculate the utility of truth in
    specific instances; we are not that smart. We are especially not that
    smart in a situation where sticking to the truth seems emotionally hard.
    The safe rule is to Just Be Rational; to arrive at that conclusion once,
    for the general case, and not reconsider it, especially in moments of
    emotional stress. The rule "Just Be Rational" is more reliable than you
    are. Sticking to the truth is a very simple procedure, and it is more
    reliable than any amount of complex verbal philosophization that tries to
    calculate the utility of truth in some specific instance.

    -- 
    Eliezer S. Yudkowsky                          http://singinst.org/
    Research Fellow, Singularity Institute for Artificial Intelligence
    


    This archive was generated by hypermail 2.1.5 : Mon Jun 16 2003 - 16:53:41 MDT