From: Mark Walker (mark@permanentend.org)
Date: Wed Jun 18 2003 - 09:26:48 MDT
----- Original Message -----
From: "Robin Hanson" <rhanson@gmu.edu>
> At 05:46 PM 6/17/2003 -0400, Eliezer S. Yudkowsky wrote:
> >... review: what is it, if anything, that we still disagree about? I
> >would say that, even instrumentally, the benefits of rationality are
> >higher than the losses, and that the costs involved do not alter this. I
> >am under the impression you still disagree with this, ...
>
> The part I'm not sure whether we agree is on the best strategy for someone
> who generally accepts the goals that evolution has given them, a standard
> mix of life, status, children, etc., and in a situation where the future
> will not be that different from the 20th century or before. The vast
> majority of humanity believes they are in this situation. In this
> situation, I claim the extra effort to substantially overcome our inherited
> biases and be more rational than we naturally would be seems usually more
> effort than it is worth, given the goals specified. Evolution has in fact
> equipped humans with behavioral strategies roughly appropriate to such
> situations. I do grant that in the situation you think you are in, where
> your goals and the future you face are very different from what evolution
> has equipped you to deal with, being unnaturally rational may well be a
> better strategy. So, do we disagree or not?
>
> >>As an aside, I actually have high hopes that we can improve people's
> >>incentives to be rational in their contributions to collective consensus
> >>via wider use of betting markets. People are more rational when they
> >>bet, for obvious reasons.
> >
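To make the incentive claim above concrete: under a proper scoring rule such
as the logarithmic score, the bet that maximizes your expected payoff is
exactly the one that matches your true belief, so deceiving yourself about
the odds costs you money in expectation. A minimal Python sketch of this
property (the names and setup are invented for illustration, not drawn from
any particular betting-market design):

import math

def expected_log_score(true_p, reported_q):
    # Expected log-score payoff when the event really has probability
    # true_p but you bet as though it has probability reported_q.
    return true_p * math.log(reported_q) + (1 - true_p) * math.log(1 - reported_q)

true_belief = 0.7  # the probability you privately assign to the event

# Search a grid of possible reports for the one with the best expected payoff.
best_report = max((q / 100 for q in range(1, 100)),
                  key=lambda q: expected_log_score(true_belief, q))

print("private belief:", true_belief)
print("payoff-maximizing report:", best_report)
# Both print 0.7: honest reporting maximizes expected payoff, so
# exaggeration and self-deception are directly costly to a bettor.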
We've seen arguments that _explain_ why it might be important to pursue or
believe the truth in some contexts and not others. (E.g., it is good to know
the truth about whether there is a saber-tooth tiger outside the cave; but
it may increase your chances of having more offspring if you falsely believe
that your partner is an above-average mate.) This tells us why we do and do
not pursue the truth, but it doesn't answer the question of whether we
_ought_ to.

I take it from the betting-market idea that we ought to pursue greater
truth. But why should we suppose that it is better to throw off "the
shackles" of self-deception rather than those of truth seeking? Perhaps with
better technology our self-deception could be more complete. With better
control of our minds we might be able to believe even more outlandish things
that we find satisfying to believe. (E.g., not only am I a better-than-average
lover, I am the world's greatest lover, etc.) Perhaps we might employ AI
"truth minders" to steer us clear of trouble when necessary (as any good
minder does). Perhaps we might have a truth toggle switch: every day you go
into truth mode for a few moments to get done those things in your life that
require knowing the truth (say, managing your money). Perhaps in truth mode
you realize that your wife is not particularly attractive, but once you
flick the toggle switch you go back to believing she is the most beautiful
thing in the universe. (Somewhat reminiscent of the movie Shallow Hal.)

So I take it that technology will allow us to recreate ourselves as better
truth seekers or as better self-deceivers, and I take it that most of us
would choose the former option if we could. Of course, this raises the
question of whether, given our present partially self-deceiving natures, we
might have deceived ourselves about the value of truth seeking. I suppose
that if there is a fact of the matter as to whether the greater truth-seeking
life or the greater self-deception life is better, then the best (and
perhaps the only) way to find out would be to run the experiment (i.e., live
for a while under each conception of the good life).
Above averagely yours,
Mark
Mark Walker, PhD
Research Associate, Philosophy, Trinity College
University of Toronto
Room 214 Gerald Larkin Building
15 Devonshire Place
Toronto
M5S 1H8
www.permanentend.org