From: Robin Hanson (rhanson@gmu.edu)
Date: Mon Jun 16 2003 - 22:10:26 MDT
On 6/16/2003, Eliezer S. Yudkowsky wrote:
>>>The rule "Just Be Rational" is more reliable than you are. Sticking to
>>>the truth is a very simple procedure, and it is more reliable than any
>>>amount of complex verbal philosophization that tries to calculate the
>>>utility of truth in some specific instance.
>>We actually usually follow the rule "Just be Natural". Evolution has
>>equipped us with vast and complex habits of thought, which people mostly
>>just use without understanding them. This isn't a particularly dangerous
>>strategy for the usual evolutionary reasons: your ancestors did pretty
>>well following these habits, so you probably will too.
>
>That works to the extent that:
>1) Your sole and only goal is to reproduce ...
>2) You are in the ancestral environment. ...
>Even if we consider only personal survival, what are, on the average, the
>greatest probable threats to the survival of any given human living today?
>1) Existential risks of transhuman technologies.
>2) Answering "no" to an FAI who asks "Do you want to live forever?" ...
>If you plan on living forever and growing up, then admitting to such
>problems is the first step toward actually doing something about them.
Yes, someone who has really adopted goals very different from the goals
evolution has given him, and/or who lives in a world very different from
the world evolution adapted us to, may well want to rely much more on
trying to be rational. But you have to get pretty far into your
singularitarian world view to make this argument, which then becomes
unpersuasive to the vast majority of humanity. For most people, who expect
to live an ordinary lifespan, who have the usual goals, and who hold the
usual expectations about future trends, your argument has very little
purchase. Most people accept the goals evolution has given
them, a standard mix of wanting food, status, children, etc. And the
social world they expect to live in isn't really that different from the
social world that evolution adapted us to, at least regarding these
issues. Given those assumptions, it seems quite reasonable for them to
"Just be Natural."
>... Even if we suppose that it might *turn out* to make sense to deceive
>yourself *in that instance*... how's J. Layman Boyfriend supposed to
>compute that answer? If he is ignorant enough to still have the
>opportunity of self-deception, how can he possibly know enough to figure
>out whether self-deception is safe?
He doesn't know. He just acts, naturally. And for all of his ancestors,
this was the right thing to do.
Now I actually do think that the world our descendants will live in within
the next century may well be different enough that acting with the usual
level of self-deception could lead to very bad outcomes. So I personally can
accept this as a strong argument for seeking truth. (I even once included
this argument in my self-deception paper.) But for most audiences, this is
a very weak argument. And my original claim was just that I could only
find disappointingly weak arguments for believing truth.
Robin Hanson rhanson@gmu.edu http://hanson.gmu.edu
Assistant Professor of Economics, George Mason University
MSN 1D3, Carow Hall, Fairfax VA 22030-4444
703-993-2326 FAX: 703-993-2323