Re: Why believe the truth?

From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Tue Jun 17 2003 - 18:57:11 MDT

    Robin Hanson wrote:
    > At 05:46 PM 6/17/2003 -0400, Eliezer S. Yudkowsky wrote:
    >
    >> ... review: what is it, if anything, that we still disagree about?
    >> I would say that, even instrumentally, the benefits of rationality
    >> outweigh the losses, and that the costs involved do not alter this.
    >> I am under the impression you still disagree with this, ...
    >
    > The part I'm not sure we agree on is the best strategy for someone
    > who generally accepts the goals that evolution has given them (a
    > standard mix of life, status, children, etc.) and who is in a
    > situation where the future will not be much different from the 20th
    > century or before.

    But this is a basic choice inherently tied up with the choice to seek
    truth. You cannot assume failure on that choice in advance and then ask
    whether it is rational to be rational (note the recursion). And why
    pose the counterfactual at all?

    > The vast majority of humanity believes they are in this situation.

    What of it? If all our beliefs were correct, what need would there be
    to guard against unknown unknowns by being rational? As it happens, the
    belief is wrong, and isn't that the point? What need would there be to
    be rational "if" all truth were contained in the Bible, as many people
    believe?

    > In this situation, I claim that the extra effort to substantially
    > overcome our inherited biases and be more rational than we naturally
    > would be usually seems like more effort than it is worth, given the
    > goals specified.

    But why should this imaginary scenario matter? Was there ever a point
    in human history where the academic grounding of rationality was strong
    enough to describe evolutionary motives for self-deception and contrast
    them with Bayesian reasoning, while the Singularity was still far off?
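
    For concreteness, here is what one such Bayesian update looks like: a
    minimal sketch in Python, with a hypothetical belief and made-up
    numbers, purely to illustrate the contrast with self-deception.

        # Toy illustration (not from the original exchange): a single
        # Bayesian update. All probabilities here are hypothetical.
        def bayes_update(prior, p_evidence_if_true, p_evidence_if_false):
            """Return P(hypothesis | evidence) via Bayes' theorem."""
            p_evidence = (p_evidence_if_true * prior
                          + p_evidence_if_false * (1.0 - prior))
            return p_evidence_if_true * prior / p_evidence

        # A believer who is 95% confident, shown evidence four times more
        # likely if the belief is false, revises downward; a self-deceiver
        # simply would not.
        posterior = bayes_update(0.95, 0.2, 0.8)
        print(round(posterior, 3))  # 0.826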

    > Evolution has in fact equipped humans with behavioral strategies
    > roughly appropriate to such situations. I do grant that in the
    > situation you think you are in, where your goals and the future you
    > face are very different from what evolution has equipped you to deal
    > with, being unnaturally rational may well be a better strategy. So, do
    > we disagree or not?

    I'm not sure. What is the utility of thought experiments in which
    humanity develops sophisticated evolutionary psychology a century in
    advance of the Singularity?

    -- 
    Eliezer S. Yudkowsky                          http://singinst.org/
    Research Fellow, Singularity Institute for Artificial Intelligence
    

