From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Tue Jun 03 2003 - 11:01:56 MDT
Robin Hanson wrote:
> On 6/3/2003, Eliezer Yudkowsky wrote:
>
>> Suppose that you have two barrels; one barrel contains a billion
>> billiards, and the other barrel contains ten billiards. The billiards
>> in each barrel are sequentially numbered - in the first barrel, the
>> billiards are numbered from one to a billion; in the second barrel,
>> the billiards are numbered from one to ten.
>>
>> Suppose that the billiards are self-aware observers. A billiard
>> numbered '5' can look at itself and reason: "Gee, I'm numbered '5'!
>> What a surprise! Now how likely is *that*, if I'm not in the second
>> barrel? It's got to be virtually certain that I'm in the second
>> barrel! Such a tremendous surprise has to be evidence for *something*!"
>>
>> But we *know* this is wrong. Within the whole physical universe of
>> discourse, two billiards reason this way. Two billiards have these
>> thoughts. Of these two billiards, one is in the first barrel and one
>> is in the second barrel. Whatever line of reasoning they carry out,
>> it should end in the conclusion: 50/50. ...
>
> Er, Eliezer, this *is* exactly an example of Doomsday type reasoning.
> If you are a billiard who does not know his number, you reason that
since only one ball in a hundred million is in the small barrel, there
> is only a one in a hundred million chance that you are in the small
> barrel. Then you look at your number and decide that there is now a
> 50/50 chance you are in the small barrel. Your estimate of "doom"
> (living in a small barrel) just went way way up.
Actually, it's an example of applying the Self-Indication Assumption
followed by the Doomsday Argument, which I think might turn out to be a
case of making two sequential mistakes that cancel out. I agree that if
you look at the complete set of indistinguishable cognitive pointer
states corresponding to each billiard in both barrels, then a
well-calibrated estimate, for those pointer states, of the chance of
being in the small barrel is 'one in a hundred million'. If a
billiard which *starts out in this state* looks at itself and discovers
the number '5', then that observation would count as evidence, the
Doomsday Argument's update would go through, and the probabilities
would change.
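To make the two canceling steps explicit, here is the arithmetic as a
minimal Python sketch - the barrel sizes are the ones from the example
above, and the 'SIA prior' is just each barrel weighted by its number
of billiards:

big, small = 10**9, 10

# SIA-style prior: weight each barrel by how many billiards it holds.
prior_small = small / (big + small)   # about one in a hundred million
prior_big = big / (big + small)

# Likelihood of observing the number '5' in each barrel: one ball in
# ten carries it, versus one ball in a billion.
like_small = 1 / small
like_big = 1 / big

# Doomsday-style update via Bayes' theorem.
posterior_small = (prior_small * like_small) / (
    prior_small * like_small + prior_big * like_big)

print(posterior_small)   # 0.5 - the two steps cancel exactly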
But you can also be born seeing the number '5', and you'd find your
probability was 50/50 when you started. You can only get a 'Doomsday
Argument' by imagining that you cancel out a fact that you already know,
and cancel it in a certain specific way, which - given that your *final*
answer must equal 50/50 to be correct - leads you to think that your
"prior" probability of being in the large barrel, before "learning"
this piece of "evidence", must have been very high, i.e., the SIA
prior. Personally I look on all of this with
suspicion. There are millions of times more grains of sand than humans.
Should I conclude that I started out with an enormous prior probability of
being a grain of sand? There is nothing a cognitive pointer state *could
have been*; it just *is*. A mind is just something that the physical
universe coughs up from time to time, and all you can do is try to
calculate the correlation of that pointer state with environmental
variables. You can go forward from the discovery of new evidence; I'm not
sure it makes sense to selectively eliminate evidence you were born with
and ask what your "priors" were before that. I didn't start out with a
one-in-six-billion chance of being Eliezer Yudkowsky. I'm just Eliezer
Yudkowsky, where an Eliezer Yudkowsky is something the universe coughs up
from time to time. My prior probability of being Eliezer is 1.
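For concreteness, here is the 'go forward' version of the same
calculation as a counting sketch - again just an illustration of the
barrel example, assuming Python 3 semantics, where membership tests on
a range are cheap:

# Enumerate the pointer states that match the observation "I am
# numbered 5" across the whole universe of billiards.
barrels = [('big', range(1, 10**9 + 1)), ('small', range(1, 11))]
observers_of_5 = [name for name, balls in barrels if 5 in balls]

# Exactly two such billiards exist, one per barrel, so conditioning
# on the observation gives 50/50 with no talk of pre-birth priors.
print(observers_of_5)                                # ['big', 'small']
print(observers_of_5.count('small') / len(observers_of_5))   # 0.5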
--
Eliezer S. Yudkowsky                          http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence