From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Wed May 28 2003 - 15:24:29 MDT
Rafal Smigrodzki wrote:
>
> ### Certainly a quite complex article. I think that what you quoted above
> means that the Bayesian would treat the output of another Bayesian as data
> of the same validity as the output of his own reasoning. If you know that a
> fellow Bayesian sincerely believes in flying saucers, you have to believe in
> them, too, unless your priors are wildly divergent ("having a memory of
> seeing a flying saucer as clear as my memory of seeing my car is sufficient
> to profess belief in flying saucers" vs. "no amount of subjective visual
> experience is sufficient to profess belief in flying saucers"). If the
> honest Bayesian says he saw a flying saucer, you have to believe him, or
> else assume he is not Bayesian at all, or has a higher visual/cortical
> malfunction rate than you (i.e. is less Bayesian than you). Barring these
> doubts, you would become as convinced about the existence of flying saucers
> as the person who actually saw them, despite not having the direct sensory
> input that he had. In effect, his beliefs are as valid an input for your
> future reasoning as your own sensory and logical subsystem outputs.
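[Editor's note: for concreteness, here is a minimal Python sketch of the update Rafal describes, assuming a perfectly reliable reporter; every number below is assumed for illustration and comes from nowhere in the exchange.]

    # Toy model of updating on a fellow Bayesian's sincere report.
    # For now, take his perception and reasoning to be perfect, so the
    # report is exactly as informative as a direct observation.

    def posterior(prior, p_report_if_true, p_report_if_false):
        # Bayes' rule: P(saucers | report).
        joint_true = prior * p_report_if_true
        joint_false = (1.0 - prior) * p_report_if_false
        return joint_true / (joint_true + joint_false)

    P_REPORT_IF_TRUE = 0.5    # assumed P(clear-sighting report | saucers exist)
    P_REPORT_IF_FALSE = 1e-6  # assumed P(report | no saucers): hallucination, etc.

    # Two listeners with wildly divergent priors, per the parenthetical above:
    for prior in (1e-3, 1e-12):
        print(prior, posterior(prior, P_REPORT_IF_TRUE, P_REPORT_IF_FALSE))
    # prior 1e-3  -> posterior ~0.998: the report is nearly decisive.
    # prior 1e-12 -> posterior ~5e-7:  same report, divergent prior, no belief.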
Bear in mind that one should distinguish between *real*, *genuine*
Bayesians like AIXI, and mere Bayesian wannabes like every physically
realized being in our Universe.
Bear in mind also that the above result holds only if you believe with
absolute certainty (itself a very non-Bayesian thing) that the Bayesian's
reasoning processes are perfect.
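[Editor's note: the same caveat in the toy terms above, again with assumed numbers. Let r be your probability that the reporter's visual/cortical machinery is sound, and suppose, as one hypothetical failure mode, that a broken reporter emits the report regardless of the facts. Anything short of r = 1 collapses the forced conclusion.]

    # Extend the toy model: with probability r the fellow Bayesian's
    # machinery is sound; with probability 1 - r it is broken and emits
    # the report whether or not saucers exist (assumed failure mode).

    def posterior_with_trust(prior, r, p_if_true=0.5, p_if_false=1e-6,
                             p_if_broken=1.0):
        like_true = r * p_if_true + (1.0 - r) * p_if_broken
        like_false = r * p_if_false + (1.0 - r) * p_if_broken
        joint_true = prior * like_true
        joint_false = (1.0 - prior) * like_false
        return joint_true / (joint_true + joint_false)

    for r in (1.0, 0.999, 0.9):
        print(r, posterior_with_trust(prior=1e-3, r=r))
    # r = 1.0   -> ~0.998  (perfect trust: you must believe, as above)
    # r = 0.999 -> ~0.33   (0.1% doubt about his wiring mostly cancels the report)
    # r = 0.9   -> ~0.006  (modest doubt leaves the prior nearly in charge)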
And finally, bear in mind that, given the above assumptions, we would not
actually be confronted with a Bayesian saying he believed in flying saucers!
--
Eliezer S. Yudkowsky                          http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence