Re: Why believe the truth?

From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Mon Jun 16 2003 - 13:36:15 MDT


    Robin Hanson wrote:
    > On 6/16/2003, Eliezer S. Yudkowsky wrote:
    >
    >>> Of course that raises the question of why believing in the truth should
    >>> have such an overwhelming importance, moral or otherwise. Some
    >>> plausibly argue that it is more moral to be loyal to one's group, even
    >>> if this means that your beliefs will be biased, just as they believe it
    >>> is more moral to give charity to members of your group, even if this
    means that worse-off outsiders go without. I have to admit that while I
    >>> think that there are few things more important than believing the
    >>> truth, I can offer only disappointingly weak moral arguments to justify
    >>> this.
    >>
    >> Here are some of my favorite answers:
    >> * "Because I value the truth as a thing in itself."
    >
    > That's a statement of the position, not an argument for it.

    Mm, no, I'd call it a refinement of the position. I.e., one can value the
    truth either as a means or as an end; this refines the statement by
    focusing on ends.

    >> * "If the sky is blue, it's blue. If I think that's true, how
    >> could I believe something else just because I wanted to? That's not
    >> how my mind is wired."
    >
    > Then your brain is different from most humans. As with patriotism,
    > humans are in fact wired to be biased in many ways, and we have to work
    > to overcome those biases. Of course we don't believe we are biased, but
    > that doesn't stop us from being biased to believe things we want to
    > believe. Honestly, I think it must be the same with you; you'd like to
    > believe you are different, but you are not.

    I think most humans are wired such that, standing under a clear blue sky,
    they would find it at least a little difficult to say and believe the sky
    is green. This is what I was speaking of. It takes considerable training
    to overcome that innate honesty. In ambiguous matters, or where there is
    political controversy, things are different. There is a certain capacity
    to believe what one wants to believe, but it's not unlimited. It would
    take me a tremendous amount of work to rearrange my mind such that I could
    believe the sky was green, if I could do it at all, and I expect the same
    would be true of most people.

    >> * "Because knowing the truth is the best and only means of
    >> achieving the goals I care about."
    >
    > To judge whether this is true, we need to know more about the goals you
    > care about.

    How would a goal get achieved without a Bayesian decision-making system to
    implement it? (Not a perfect Bayesian decision process; a process which,
    to the extent it works, works because it has Bayesian structure somewhere
    in it.) How would a Bayesian decision-making process work without
    containing Bayesian reasoning on beliefs?
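
    To make the structure I mean concrete, here is a minimal sketch in
    Python; the hypotheses, probabilities, and utilities are invented for
    illustration, not anything anyone has claimed in this exchange:

    def bayes_update(prior, likelihood, evidence):
        """Posterior P(h | evidence) for each hypothesis h, by Bayes' rule."""
        unnormalized = {h: prior[h] * likelihood[h][evidence] for h in prior}
        total = sum(unnormalized.values())
        return {h: p / total for h, p in unnormalized.items()}

    def best_action(posterior, utility, actions):
        """The action with the highest expected utility under the posterior."""
        def expected_utility(a):
            return sum(posterior[h] * utility[a, h] for h in posterior)
        return max(actions, key=expected_utility)

    # Two hypotheses, one observation, two candidate actions (all assumed).
    prior = {"rain": 0.3, "clear": 0.7}
    likelihood = {"rain": {"dark_clouds": 0.8, "blue_sky": 0.2},
                  "clear": {"dark_clouds": 0.1, "blue_sky": 0.9}}
    utility = {("umbrella", "rain"): 5, ("umbrella", "clear"): -1,
               ("no_umbrella", "rain"): -10, ("no_umbrella", "clear"): 2}

    posterior = bayes_update(prior, likelihood, "dark_clouds")
    print(best_action(posterior, utility, ["umbrella", "no_umbrella"]))

    The action-selection step consumes the belief distribution directly, so
    distorting the beliefs distorts every downstream choice. That is the
    sense in which the goal-achieving machinery contains Bayesian reasoning
    on beliefs.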

    >> * "Because even the process whereby I *decide* my goals depends on
    >> finding the actual answers (as opposed to the comforting or convenient
    >> answers) to logical or factual questions."
    >
    > People *can* choose comforting or convenient goals. For example, a
    > common argument against moral consequentialism is that it is too much
    > work to figure out the consequences of your actions for all people over
    > all time.

    Ah, but do you want goals that are *really* comforting or convenient, or
    just goals that you *think* are comforting or convenient? In other
    words, if you are computing the "convenience" of a goal, do you want an
    accurate result for that computation - the really, genuinely true result
    of that computation - or will you be satisfied with a convenient answer
    to that computation? In the latter case, I would guess that your goals
    will end up not being really convenient at all; but perhaps it will be,
    if not *really* convenient to believe your goals are more convenient
    than they are, at least convenient to believe that it is convenient to
    believe that your goals are more convenient than they are.

    >> * "Because if I wrapped myself up in a private world, severing my
    >> connection to outside reality, I would destroy my potential to grow as
    >> a person."
    >
    > There are clearly lots of ways to grow that do not depend on close ties
    > to outside reality.

    Name one?

    >> * "Darn, I don't have enough knowledge to answer that question.
    >> I'd better go get some."
    >
    > Most people answer such questions without much of any knowledge.

    Just because you can answer a question without any knowledge does not
    mean that you should. For any person who is aware of their own lack of
    knowledge, and who regards that as a bad thing, the above would seem to be
    a valid argument.

    >> * "Because without accurate knowledge, I'd have no way of knowing
    >> how dangerous it was to be ignorant."
    >
    > Most of us make most of the important decisions in our lives without
    > accurate knowledge of their dangers.

    Yes, with consequences that are instrumentally negative under a wide
    variety of ends.
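
    One way to make "how dangerous it is to be ignorant" concrete is a
    value-of-information calculation; the scenario and every number below
    are invented for illustration:

    # Expected value of learning the true state before acting, under an
    # assumed belief p_bad and an assumed payoff table.
    p_bad = 0.2
    payoff = {("invest", "good"): 10, ("invest", "bad"): -50,
              ("pass", "good"): 0, ("pass", "bad"): 0}

    def eu(action):
        return (1 - p_bad) * payoff[action, "good"] + p_bad * payoff[action, "bad"]

    # Ignorant: commit to a single action up front.
    eu_ignorant = max(eu("invest"), eu("pass"))  # 0.0

    # Informed: observe the state first, then choose per state.
    eu_informed = ((1 - p_bad) * max(payoff["invest", "good"], payoff["pass", "good"])
                   + p_bad * max(payoff["invest", "bad"], payoff["pass", "bad"]))

    print(eu_informed - eu_ignorant)  # value of the information: 8.0

    Note the regress in the calculation itself: the value of the
    information is computed from p_bad. If p_bad is a comforting fiction,
    the estimate of how dangerous ignorance is comes out wrong too.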

    > An instrumental justification of seeking truth seems bound to run into
    > situations where it harms other goals more than it helps. For example,
    > if you want a happy marriage you might do better to not know too much
    > about how unhappy most marriages end up, or how similarly happy you
    > might have been with the other spouses you could have chosen. Probably
    > simpler to just declare truth-seeking as a strong primary goal.

    Simpler, sure. And you are safer, in your pursuit of rationality, if your
    love of the truth is strong enough in itself to overcome any temptation
    to depart. But that does not mean there cannot be more than one argument
    locking the truth into place. Also, I don't consider convenience alone a
    valid argument for choosing primary ends, so it's lucky that I do in fact
    happen to have truth as a primary end.

    -- 
    Eliezer S. Yudkowsky                          http://singinst.org/
    Research Fellow, Singularity Institute for Artificial Intelligence
    

