Re: Why believe the truth?

From: Eliezer S. Yudkowsky
Date: Tue Jun 17 2003 - 12:23:27 MDT


    Robin Hanson wrote:
    > On 6/17/2003, Eliezer S. Yudkowsky wrote:
    >>> Yes, someone who has really adopted goals that are very different
    >>> from the goals evolution has given him, and/or who lives in a world
    >>> very different from the world evolution adapted to, may well want to
    >>> rely much more on trying to be rational. ... Most people accept the
    >>> goals evolution has given them, a standard mix of wanting food,
    >>> status, children, etc. And the social world they expect to live in
    >>> isn't really that different from the social world that evolution
    >>> expects, at least regarding these issues. Given those assumptions,
    >>> it seems quite reasonable for them to "Just be Natural."
    >> The argument that is accessible to anyone is Just Be Rational, Or The
    >> Unknown Variables Will Eat You. We know a couple of what are, from
    >> the majority perspective, unknown unknowns, and we can confirm that
    >> the presence of those unknown unknowns does in fact operate to
    >> reinforce the rule and create nonobvious incentives for rationality.
    >> But on new cases, the argument is the same for us as for anyone else.
    >> We don't know all the unknown unknowns either.
    > A general argument for "Just Be Rational" must acknowledge general
    > costs, as well as general benefits. It takes a lot of work to try to be
    > rational, and it takes work to lie about what you believe, and there are
    > social costs when you fail to successfully lie about your rationality.
    > Without some reason to believe your evolutionary heritage will
    > substantially mislead you, "Just be Natural" seems a better strategy. I
    > do agree that we are entering an era when in fact our evolutionary
    > heritage does mislead the types of people who populate this list. So
    > "Just Be Rational" may well soon be best for such people. But that is a
    > much narrower claim.

    As it happens, I don't advocate the strategy of lying. I also don't think
    that it's a lie to tell your girlfriend that she's the most beautiful
    woman in the world, provided she does seem that way to you; I consider
    beauty a two-place predicate between observer and observed, rather than an
    objective constant property of a given girlfriend; it may well be that
    this predicate is modified by falling in love. And the other forms of
    self-deception you cite, such as overconfidence, seem much iffier and more
    dangerous. Such issues aside...

    I think you're overlooking the incremental nature of striving for
    rationality. Someone who says, "I'm going to try and be rational" does
    not instantly acquire the realization that certain actions are
    self-deceptive in nature. It takes time to build to that level of
    self-awareness. You seem to be thinking of some ordinary bloke who, in
    espousing rationality, is instantly emptied of all self-deceptive content
    (if only this were true!) but goes on living an ordinary life, without
    acquiring any of the higher sorceries. This sounds to me like an
    anachronistic picture. To the extent that you can even identify
    self-deceptions in order to give them up, you are wielding rationality on
    a very high level, at least relative to current norms. By the time
    someone is capable of giving up all the self-deceptions that Robin Hanson
    knows how to identify, he or she will no longer be "most people".

    >> ... Truthseeking is a simple and intuitive idea. The idea that we
    >> execute adaptive self-deception is a complex counterargument. A box
    >> of Little Debbie snack bars is a simple but iffy illustration of the
    >> complex reply to the complex counterargument; the Singularity is a
    >> more valid illustration, but less accessible.
    > Being natural is also simple, and even more intuitive. For most people
    > the possible benefits of avoiding snack bars just do not outweigh the
    > many costs of being rational. If that was the main benefit, it's not
    > worth it.

    Funny thing... I'd bet that most people would end up saying: "For most
    people the possible benefits of avoiding snack bars just do not outweigh
    the many costs of being rational... but me, I'm different." So are they
    being self-deceptive? Or are you overestimating the extent to which your
    own circumstances and motives are special? I would advise such people to
    worry about their own individual choices to be rational - to ask whether
    being rational makes sense for them, rather than imagining how someone
    else might argue it.

    Eliezer S. Yudkowsky                
    Research Fellow, Singularity Institute for Artificial Intelligence

    This archive was generated by hypermail 2.1.5 : Tue Jun 17 2003 - 12:33:15 MDT