From: Hal Finney (hal@finney.org)
Date: Wed May 28 2003 - 15:44:40 MDT
Oops, sorry about the truncated message. Let's try again.
Several people have speculated about what Lee Corbin calls the "annoying
result" discussed in Robin Hanson's paper at
http://hanson.gmu.edu/deceive.pdf. I would refer to it rather as
a marvelously surprising and paradoxical discovery that strongly
challenges our understanding of our own thought processes. I have
tried to understand it in different ways and from different directions.
Here is a new take on it.
We are accustomed to taking a rather parochial and self-centered view
of the world. Indeed, our very nature is such that this viewpoint is
almost impossible to escape. But to understand this result, you have to
try to think in a more general and universal manner. "Free your mind",
as they said in the first Matrix movie. Try to reason not just from your
own personal perspective, but from a point of view which sees everyone's
opinions in an objective and unbiased manner.
Imagine two people, A and B, coming together to discuss a contentious
issue. You are a telepathic third party, an objective, rational observer
who has full access to the content of their minds. You can see all
of the memories, experiences, opinions, and goals of both of them.
You are free of prejudices of your own and are just going to reason
logically based on the information you receive from A and B.
It seems clear that you will be able to put that information together and
come to a conclusion about the issue in question. You will be able to
judge which side of the issue is more likely to be true. This is based
on the information you get from A and B. Even where their information
conflicts because their experiences differ, you will be able to judge
the relative accuracy of the information each of them has received.
Now, you've done this as an objective observer. But the point is, either
A or B could have done the same thing, if they had had the same kind of
access you did. This is the key point: if A and B each fully understood
the other's mind and the other's reasons for holding their opinions, they
would come to the same conclusion about the issue. It is the same conclusion that
you came to, because it is based on the same information (that is, the
total information of A+B), along with logic and reason.
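This pooling of information can be sketched as a toy Bayesian calculation. The numbers and the assumption that the pieces of evidence are conditionally independent are mine, not anything from Hanson's paper; the point is just that the observer converts each piece of evidence from A and B into a likelihood ratio and multiplies them all into one set of prior odds, arriving at a single answer.

```python
# Toy illustration (hypothetical numbers): an outside observer who sees
# all of A's and B's evidence combines it into one posterior for a
# hypothesis H, assuming the evidence is independent given H.

def posterior_odds(prior_odds, likelihood_ratios):
    """Bayes' rule in odds form: multiply the prior odds by each
    piece of evidence's likelihood ratio P(e|H) / P(e|not H)."""
    odds = prior_odds
    for lr in likelihood_ratios:
        odds *= lr
    return odds

prior_odds = 1.0     # even odds on the contentious issue
lr_a = [3.0, 0.5]    # A's two pieces of evidence (one for, one against)
lr_b = [4.0]         # B's single, stronger piece of evidence

odds = posterior_odds(prior_odds, lr_a + lr_b)
prob = odds / (1 + odds)
print(f"combined posterior P(H) = {prob:.3f}")  # -> 0.857
```

Notice that the answer depends only on the total pool of evidence, not on who held which piece, which is exactly why A and B would reach the same conclusion the observer does.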
Therefore, if you fully understood the mind of another, and he fully
understood you, and both of you were rational, you would not be able to
disagree about any factual matter.
That's the first step. The result that Robin discusses goes farther,
and arises even without the need for magical telepathic understanding.
But the foundation is the belief which you both have, that if you did
have full access to all of the information the other guy has, you would
both agree.
Given that you both know this, if you disagree initially, you must realize
that it is something of an artifact caused by incomplete information.
And a priori there is no particular reason to suppose that one party
or the other is more likely to hold the position which you would both
eventually come to agree on. In particular, you should have no prejudice
that you are likely to be right, just because you are you; for you would
understand that such reasoning is equally applicable to the other side,
and so this reasoning predicts that each side is more likely to be right,
which is impossible.
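The impossibility here is plain arithmetic. On a binary disagreement exactly one side is right, so the two probabilities must sum to 1; if each side grants itself the same "I'm probably right" premium (a hypothetical 0.7 below), the sum exceeds 1:

```python
# If each party reasons "I'm more likely to be right just because I'm me",
# both assign themselves probability p > 0.5 on a binary disagreement.
# Exactly one of them can be right, so the probabilities must sum to 1.
p_self = 0.7             # hypothetical self-assigned probability of being right
total = p_self + p_self  # both sides apply the identical reasoning
print(total)             # -> 1.4, which exceeds 1: the reasoning is inconsistent
```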
Now, you may have particular reasons to believe that you are more
likely to be right; maybe you have unusually high-quality information
about this issue; or maybe you are more likely to be right about things
than the average human. But it's possible that the other person shares
these attributes, too, perhaps even to a greater degree than you.
As you discuss the issue, you may be reluctant to change your mind, if
you have some of the reasons above to believe that you are likely to be
right. On the other hand, if it is an issue about which you know little,
you might be relatively willing to switch sides. So what should happen,
as the discussion goes on, is that the other party's continued reluctance
to switch sends a signal that the quality of their information is good. With
continued discussion, eventually one side should rationally conclude that
the other side's information is of higher quality than their own, and
switch sides, so that they eventually agree.
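This back-and-forth can be simulated directly. The sketch below is my own construction in the style of the classic convergence results (Geanakoplos and Polemarchakis), not anything specific from Hanson's paper: two agents each know only which cell of their private partition a uniformly random state fell in, they take turns announcing their posterior for some event, and each announcement prunes the set of states everyone considers possible, until the announcements agree.

```python
from fractions import Fraction

def post(event, info):
    """Posterior of `event` given information set `info`, uniform prior."""
    return Fraction(len(event & info), len(info))

def cell(partition, state):
    """The cell of `partition` containing `state`."""
    return next(c for c in partition if state in c)

def dialogue(states, partitions, event, true_state):
    """Agents take turns announcing their posterior for `event`.
    Each announcement is public, so everyone discards the states that
    would have produced a different announcement. Stops once a full
    round of announcements conveys no new information."""
    public = set(states)
    announcements = []
    while True:
        changed = False
        for part in partitions:
            q = post(event, cell(part, true_state) & public)
            announcements.append(q)
            refined = {s for s in public
                       if post(event, cell(part, s) & public) == q}
            if refined != public:
                public, changed = refined, True
        if not changed:
            return announcements

# Nine equally likely states; each agent knows only which cell of its
# own partition contains the true state (here, state 1).
states = range(1, 10)
part_a = [{1, 2, 3}, {4, 5, 6}, {7, 8, 9}]
part_b = [{1, 2, 3, 4}, {5, 6, 7, 8}, {9}]
event = {3, 4}

history = dialogue(states, [part_a, part_b], event, true_state=1)
print([str(q) for q in history])
# -> ['1/3', '1/2', '1/3', '1/3', '1/3', '1/3']
```

The run starts with a genuine disagreement (1/3 versus 1/2), and the very act of repeating their announcements transmits enough information that both settle on 1/3, just as the paragraph above describes.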
The basic result, which goes back to Aumann's work in the 1970s, is that
you can't rationally "agree to disagree", that is, each hold to an opposing
position which you will not change. This is true even though you both
know that if you both knew what the other knew, one of you would change
his mind (the mind-reading scenario). The only way you can refuse to
change even in the face of continued refusal by the other side is if you
have infinitely strong grounds to hold your position, which is impossible.
Now, you may have found yourself in situations like this, where you
were faced with a seemingly rational opponent but you simply could not
accept that his position might be right. In that case, you probably
told yourself that he was being emotional, or stubborn, or stupid,
or manipulative, or for some other reason was acting irrationally.
You did not truly believe that he was acting as a rational truth-seeker.
If you did think that, I believe you would find it much more difficult
to accept the "agree to disagree" outcome. There is a fundamental
contradiction in adopting that position if both sides know each other
to be rational and honest.
Hal
This archive was generated by hypermail 2.1.5 : Wed May 28 2003 - 15:59:01 MDT