"Eliezer S. Yudkowsky" wrote:
> firstname.lastname@example.org wrote:
> > The part about Robin's paper (http://hanson.gmu.edu/deceive.pdf) that
> > I have the hardest time understanding is the discussion of common priors.
> Hm. Well, I understood it perfectly. Therefore, you should take this
> into account in deciding whether or not the paper makes sense, unless you
> don't think we have common priors.
> Essentially, Robin's paper gives a rigorous mathematical proof that for
> two people to (a) disagree and (b) maintain their disagreement after
> interaction, one or both of the parties must believe that they are more
> likely to be rational than the other person. This does not necessarily
> prove irrationality in *all* cases but it proves irrationality for *most*
> cases. If we take into account the evolutionary-psychology arguments,
> Robin's paper makes a strong case for a built-in irrationality factor
> common to all humans.
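The core claim above, that agents with a common prior who share their
information cannot rationally go on disagreeing, can be illustrated with a
toy Bayesian model. This sketch is my own illustration, not taken from
Hanson's paper: two agents share a common prior over a coin's bias, each
sees private flips, and once all evidence is exchanged Bayes' rule forces
identical posteriors. Persistent disagreement would require either
different priors or a belief that the other party updated irrationally.

```python
# Toy model (my illustration, not from the paper): a coin has bias
# 0.25 or 0.75, with a common 50/50 prior over the two hypotheses.

PRIOR = {0.25: 0.5, 0.75: 0.5}

def posterior(prior, flips):
    """Bayes update on a sequence of flips (1 = heads, 0 = tails)."""
    post = {}
    for bias, p in prior.items():
        like = 1.0
        for f in flips:
            like *= bias if f == 1 else 1 - bias
        post[bias] = p * like
    z = sum(post.values())
    return {b: v / z for b, v in post.items()}

alice_flips = [1, 1, 0]   # Alice's private evidence
bob_flips   = [1, 0, 0]   # Bob's private evidence

# Before exchanging evidence, their posteriors differ...
p_alice = posterior(PRIOR, alice_flips)
p_bob   = posterior(PRIOR, bob_flips)

# ...but once each incorporates the other's evidence, the common
# prior forces exact agreement.
p_alice_full = posterior(p_alice, bob_flips)
p_bob_full   = posterior(p_bob, alice_flips)
assert p_alice_full == p_bob_full
```

The asymmetry Robin's result targets is what happens when the agents
refuse to move even after this kind of exchange: the only Bayesian way to
do that is for at least one of them to discount the other's update as
less rational than their own.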
This seems a bit strained to me. That A has a strong argument for X
that I cannot defeat logically does not compel me to accept X at that
time. I may believe the argument leaves out something crucial that I
have as yet been unable to identify. I may be correct or incorrect in
that evaluation, and able or unable to express solid reasons for it
that others will be swayed by. But to conclude that I am simply
irrational, or do not care about truth, because I did not agree with A
about X in such circumstances does not follow.
Humans are not pure reasoning machines, and certainly not effective
Bayesian processors. Much of our "reasoning", our processing of data,
concepts and so on, is not even in the conscious realm. It is quite
unlikely that relying only on our conscious evaluations, and adhering
to them at all points, will generally bring us closer to truth in all
circumstances. The argument applies much better to an SI than to a
creature organized as we are.
> If a 100,000:1 genius is interacting with a 10,000,000:1 genius, but
> neither of them knows the other's percentile, both will rationally assume
> that they are more likely to be rational than the other person. However,
> Robin's paper does prove that in *most* cases, rather than in the rarer
> instances where two geniuses unknowingly interact, people must be
> overestimating their own rationality relative to others, or else must not
> be using rigorous Bayesian reasoning with respect to what they are
> licensed to conclude from their own thoughts.
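The two-geniuses case quoted above comes down to simple arithmetic, which
can be made explicit. This is my own hedged illustration of the point, not
a computation from the paper: each genius knows only their own percentile,
so each models the stranger as a random draw from the population and
rationally assigns near-certainty to being the more rational party, even
though one of them must be wrong.

```python
# Toy arithmetic (my illustration): percentile = probability that a
# randomly drawn person is less rational than you.

alice_percentile = 1 - 1 / 100_000       # a 100,000:1 genius
bob_percentile   = 1 - 1 / 10_000_000    # a 10,000,000:1 genius

# Not knowing the other's percentile, each treats the other as a
# random draw, so each assigns this probability to "I am the more
# rational one":
p_alice_wins = alice_percentile
p_bob_wins   = bob_percentile

assert p_alice_wins > 0.5 and p_bob_wins > 0.5  # both justifiably confident
assert alice_percentile < bob_percentile        # yet Alice is in fact wrong
```

This is why the quoted paragraph treats such encounters as the rare
exception: for a randomly chosen pair, each party's evidence licenses far
less confidence in their own superior rationality.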
Rationality is not only a matter of how much general intelligence you have.
> -- -- -- -- --
> Eliezer S. Yudkowsky http://singinst.org/
> Research Fellow, Singularity Institute for Artificial Intelligence
This archive was generated by hypermail 2b30 : Mon May 28 2001 - 10:00:03 MDT