>That you should
>have a rank of sixty billion or so is much more likely if
>only 100 billion persons will ever have lived than if there will be
>many trillion persons. Therefore, by Bayes' theorem, you should
>update your beliefs about mankind's prospects and realise that
>an impending doomsday is much more probable than you have
>hitherto thought.
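The update here is just a two-hypothesis application of Bayes' theorem.
A minimal sketch in Python, assuming an even prior over the two
hypotheses and taking 10 trillion as a stand-in for "many trillion"
(both figures are mine, purely for illustration):

    # Two-hypothesis Bayesian update behind the Doomsday Argument.
    # Assumes my birth rank is drawn uniformly from all persons who will
    # ever live; the 50/50 prior and the "late" total are illustrative.
    def posterior_doom_soon(rank, n_soon, n_late, prior_soon=0.5):
        like_soon = 1.0 / n_soon if rank <= n_soon else 0.0  # P(rank | doom soon)
        like_late = 1.0 / n_late if rank <= n_late else 0.0  # P(rank | doom late)
        numerator = like_soon * prior_soon
        return numerator / (numerator + like_late * (1.0 - prior_soon))

    # Rank ~60 billion; "soon" = 100 billion total, "late" = 10 trillion total
    print(posterior_doom_soon(60e9, 100e9, 10e12))  # ~0.99 in favour of "doom soon"
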
>Well I've just tossed heads ten times in a row.
I don't believe you ;-)
>Do you want to give me
>better than fifty-fifty odds on heads coming up next?
No, from this information it's obviously 50%.
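(For a coin known to be fair, the ten previous heads are screened off
by independence; spelled out as a toy calculation, just to be explicit:)

    # For a known-fair coin, a run of heads tells us nothing about the next toss:
    # P(heads on toss 11 | ten heads) = P(eleven heads) / P(ten heads) = 1/2.
    from fractions import Fraction
    p = Fraction(1, 2)
    print((p ** 11) / (p ** 10))  # 1/2
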
>If we were able to
>randomly select a new life out of all the possible lives that ever lived
>and will live and found ourselves in the 20th C then I grant your argument
>has some weight. But someone has to come first.
Indeed.
>I suppose you were standing
>next to Neil Armstrong as he stepped up to the launch pad saying "If the
>moon is reachable by people then it's likely that lots of people will reach
>it in the future. The chances that you would be the first are vanishingly
>small. Many people have died in the past. Therefore it's much more likely
>that you will die rather than be the first person to reach the moon".
This is an interesting example. As far as I can see it doesn't
involve the Doomsday argument, though. It is just standard induction:
if more people have died than reached the moon, then, ceteris
paribus, Neil shouldn't believe he will reach the moon. Of course,
the cetera weren't paria in this case: he knew that he was entering
a space vehicle. This fact outweighs the priors created by induction.
But in the same way, the DA can also be overridden if we have
strong independent reason to believe that mankind will continue
to exist for a long time (which we haven't, of course).
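In odds form the point is simply that a large enough likelihood ratio
swamps the inductive prior. A toy sketch, with numbers invented only
for illustration:

    # Bayes in odds form: posterior odds = prior odds x likelihood ratio.
    # Both figures below are invented purely for illustration.
    def posterior_odds(prior_odds, likelihood_ratio):
        return prior_odds * likelihood_ratio

    prior = 1 / 1e9      # induction alone: reaching the moon is wildly improbable
    evidence = 1e10      # "strapped into a working launch vehicle" is strong evidence
    print(posterior_odds(prior, evidence))  # 10.0 -- the evidence swamps the prior
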
>This has a very biblical ring to it "the first shall be last". Personally I
>prefer a more Extropian version "the first shall be first..."
I prefer that too.
Nicholas Bostrom
Dept. of Philosophy, London School of Economics
http://www.hedweb.com/nickb
n.bostrom@lse.ac.uk