Re: Why believe the truth?

From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Tue Jun 17 2003 - 14:17:36 MDT

    Robin Hanson wrote:
    > On 6/17/2003, Eliezer S. Yudkowsky wrote:
    >>
    >> ... I think you're overlooking the incremental nature of striving for
    >> rationality. Someone who says, "I'm going to try to be rational"
    >> does not instantly acquire the realization that certain actions are
    >> self-deceptive in nature. It takes time to build to that level of
    >> self-awareness. You seem to be thinking of some ordinary bloke who,
    >> in espousing rationality, is instantly emptied of all self-deceptive
    >> content (if only this were true!) but goes on living an ordinary life,
    >> without acquiring any of the higher sorceries. This sounds to me like
    >> an anachronistic picture. To the extent that you can even identify
    >> self-deceptions in order to give them up, you are wielding rationality
    >> on a very high level, at least relative to current norms. By the time
    >> someone is capable of giving up all the self-deceptions that Robin
    >> Hanson knows how to identify, he or she will no longer be "most
    >> people". ...
    >> I'd bet that most people would end up saying: "For most people the
    >> possible benefits of avoiding snack bars just do not outweigh the many
    >> costs of being rational... but me, I'm different." So are they being
    >> self-deceptive? Or are you overestimating the extent to which your
    >> own circumstances and motives are special?
    >
    > I agree our actual choices tend to be piecemeal, but unfortunately this
    > mostly raises the costs of being rational.

    Why? It looks to me like the costs are lowered, because you acquire both
    costs and benefits incrementally, rather than acquiring all the costs at
    once, as you seemed to assume earlier. You get coping skills along with
    any new challenges, and it takes a while to get good enough that there are
    significant challenges arising from the debiasing of self-deception.

    > Our natural habits of
    > thought include our each believing that while most people are biased in
    > many ways, we are more rational than others. This is part of our
    > general over-estimation of our ability and moral value. Most people
    > actually know about most of the standard biases that people fall for.
    > Literature has for many hundreds of years relied on this fact when
    > making fun of such biases. Most people pay sincere lip service to
    > rationality, and we nod knowingly when someone mentions one of the
    > standard biases. But while we might admit in general that we are
    > subject to the same biases, we each usually manage to avoid believing
    > that a particular bias applies to us in a particular situation. If it
    > appears otherwise, we can point to particular factors that make this an
    > exception.

    According to your theory, wouldn't this *reduce* the cost of adhering to
    rationality during the early stages?

    -- 
    Eliezer S. Yudkowsky                          http://singinst.org/
    Research Fellow, Singularity Institute for Artificial Intelligence
    


    This archive was generated by hypermail 2.1.5 : Tue Jun 17 2003 - 14:27:32 MDT