From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Tue Jun 03 2003 - 08:31:43 MDT

Rafal Smigrodzki wrote:
> Hal Finney wrote:
>
>>If you started off confident that humans would survive because so many
>>other peoples have done it, the analysis of the DA ought to shake your
>>confidence at least somewhat. It ought to cause you to revise your
>>estimate downward of the human race's chances for success. Your early
>>birth rank is evidence against a successful human diaspora. You may
>>still believe that it can happen, if the evidence from alien success
>>is even greater than the odds-against generated by the DA. But
>>still, you have to count the DA as evidence in the negative
>>direction. And that's all that the DA asks, that you take it into
>>consideration and revise your estimates downward.
>
> ### I fully agree here. The DA should indeed reduce the estimate I might
> have of our civilization surviving much longer.

As near as I can figure, the DA should have *no* impact on our estimates.
It is not *any* kind of evidence.

Suppose that you have two barrels; one barrel contains a billion
billiards, and the other barrel contains ten billiards.  The billiards in
each barrel are sequentially numbered - in the first barrel, the billiards
are numbered from one to a billion; in the second barrel, the billiards
are numbered from one to ten.

Suppose that the billiards are self-aware observers.  A billiard numbered
'5' can look at itself and reason: "Gee, I'm numbered '5'! What a
surprise! Now how likely is *that*, if I'm not in the second barrel?
It's got to be virtually certain that I'm in the second barrel! Such a
tremendous surprise has to be evidence for *something*!"

But we *know* this is wrong.  Within the whole physical universe of
discourse, two billiards reason this way. Two billiards have these
thoughts. Of these two billiards, one is in the first barrel and one is
in the second barrel. Whatever line of reasoning they carry out, it
should end in the conclusion: 50/50. You can argue about what line of
reasoning this is, whether it amounts to the Self-Indication Assumption or
something else, but if a chain of reasoning ends with anything other than
50/50, it is not well-calibrated.
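
To make the two computations concrete, here is a minimal Python sketch.
The "surprise" half assumes the billiard implicitly models its situation
as a randomly chosen barrel followed by a random draw - the unstated
premise behind "virtually certain" - while the counting half is the
calibration test just described:

# Two barrels: one holds a billion billiards numbered 1..10**9,
# the other holds ten billiards numbered 1..10.
BIG, SMALL = 10**9, 10

# The surprised billiard's implicit model: pick a barrel at random,
# then draw a billiard at random from it, and update on seeing '5'.
prior_big = prior_small = 0.5
p5_given_big = 1 / BIG
p5_given_small = 1 / SMALL
posterior_small = (prior_small * p5_given_small) / (
    prior_small * p5_given_small + prior_big * p5_given_big
)
print(posterior_small)  # ~0.99999999 -- "virtually certain"

# The calibration count: across the whole universe of discourse,
# exactly two billiards are numbered '5' -- one in each barrel.  Of
# the billiards that actually have this thought, half are in each.
fives_in_big, fives_in_small = 1, 1
print(fives_in_small / (fives_in_big + fives_in_small))  # 0.5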

Please note that nobody is "picking" a billiard "at random", or "picking"
a barrel "at random", or anything of that sort. The barrels, and
billiards, just *are*. Let's say there are a hundred barrels, fifty big
barrels with a billion billiards, fifty small barrels with ten billiards.
A hundred billiards are numbered '5'. Fifty are in big barrels, fifty are
in little barrels. A billiard that looks at itself and finds the number
'5' is a cognitive pointer state that points to a big barrel in fifty
cases and points to a little barrel in fifty cases. So if these cognitive
entities come to any conclusion other than "50% probability", they are
not, in fact, well-calibrated.
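
The hundred-barrel count can be made just as explicit (again a sketch in
Python; the barrel sizes and counts are taken straight from the paragraph
above):

# A hundred barrels: fifty big (a billion billiards each, numbered
# from one) and fifty small (ten billiards each, numbered from one).
barrels = [("big", 10**9)] * 50 + [("small", 10)] * 50

# Every barrel's numbering starts at one, so each barrel contains
# exactly one billiard numbered '5'.  List where they all live.
homes_of_fives = [kind for kind, size in barrels if size >= 5]

n_big = homes_of_fives.count("big")      # 50
n_small = homes_of_fives.count("small")  # 50
print(n_small / (n_big + n_small))       # 0.5 -- the calibrated answer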

Forget the whole question of sentient beings being picked "at random" from
some distribution or other. Sentient beings aren't picked. They just are.

-- 
Eliezer S. Yudkowsky                          http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence