From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Mon Jun 16 2003 - 23:14:07 MDT

Robin Hanson wrote:
>
> Yes, someone who has really adopted goals that are very different from
> the goals evolution has given him, and/or who lives in a world very
> different from the world evolution adapted to, may well want to rely
> much more on trying to be rational. But you have to get pretty far into
> your singulatarian world view to make this argument, which then becomes
> unpersuasive to the vast majority of humanity. For most people, who
> expect to live an ordinary lifespan, who have the usual goals that most
> people do, and who have the usual expectations about future trends, your
> argument has very little purchase. Most people accept the goals
> evolution has given them, a standard mix of wanting food, status,
> children, etc. And the social world they expect to live in isn't really
> that different from the social world that evolution expects, at least
> regarding these issues. Given those assumptions, it seems quite
> reasonable for them to "Just be Natural."

The argument that is accessible to anyone is Just Be Rational, Or The
Unknown Variables Will Eat You. We know a couple of variables that are,
from the majority perspective, unknown unknowns, and we can confirm that
the presence of those unknown unknowns does in fact operate to reinforce
the rule and create nonobvious incentives for rationality. But on new
cases, the argument is the same for us as for anyone else. We don't know
all the unknown unknowns either.

In any case, we should distinguish between the question of whether the
right answer is Just Be Rational, and the question of how persuasive the
right answer happens to be. People are playing the game for enormous,
nonancestral stakes, even in their own lives, whether they realize it or
not. How wonderful that the Just Be Rational strategy, accessible to any
audience, returns the right answer even in the face of such tremendous
surprises!

> Now I actually do think that the world our descendants will live in
> within the next century may well be different enough that acting with
> the usual level of self-deception may lead to very bad outcomes. So I
> personally can accept this as a strong argument for seeking truth. (And
> I even once had this argument in my self-deception paper.) But for most
> audiences, this is a very weak argument. And my original claim was just
> that I could only find disappointingly weak arguments for believing truth.

The universal argument is "Just be rational or you won't even see the
bullet that kills you." We know several specific instances of such
bullets that are not widely known, which shows that the universal
argument has worked so far; as we accumulate additional experience and
knowledge, it tends to confirm that the heuristic was correct. Speaking
from my experience, this heuristic not only led to the correct answer;
in retrospect it often seems like the only possible way for a human to
arrive at the correct answer without anachronistic advance knowledge.

If you are worried about finding disappointingly weak *arguments* for
rationality, *arguments* as opposed to *rational support*, then that is
quite a different question from the one I thought you were asking. I'm
worried that quantum mechanics also has disappointingly weak arguments
for a lay audience; for example, it lacks the dramatic appeal and
intuitive animism of elephants on the back of a giant turtle. But that
is a question of needing to write lengthy explanations, not of the
principle itself being wrong.

But I don't think JBR is, in fact, inaccessible. The Just Be Rational
strategy is not only employable by anyone, it has arguments which are
accessible to anyone - as widely available as the nutritionless package
of fat and sugar at the grocery store, the history of scientific
discovery, the innate drive to know the truth. The reasons I originally
offered were simply put; only the replies to your complex
counterarguments were complex. Truthseeking is a simple and intuitive
idea. The idea that we execute adaptive self-deception is a complex
counterargument. A box of Little Debbie snack bars is a simple but iffy
illustration of the complex reply to the complex counterargument; the
Singularity is a more valid illustration, but less accessible.

--
Eliezer S. Yudkowsky                          http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence

This archive was generated by hypermail 2.1.5 : Mon Jun 16 2003 - 23:24:06 MDT