Re: reasoning under computational limitations

Eliezer S. Yudkowsky (sentience@pobox.com)
Sun, 28 Mar 1999 14:48:46 -0600

Nick Bostrom wrote:
>
> Eliezer S. Yudkowsky wrote:
>
> > Of course. There are eleven people with the correct digit and nine
> > people with nine different incorrect digits. Ergo, your digit is
> > probably the right one.
>
> Are you aware that this is the same reasoning that gives rise to the
> Doomsday argument? Do you accept that argument? If not, why?

What are we talking about here? The idea that we probably live in the time with the largest population, ergo there are no galactic civilizations?

Nope. The argument falls apart if the population keeps growing forever. And if an infinite population lets us be here now, I don't see why a finite one would make it harder.

The problem is that all situations "require", that is, predict, our modern environment; and while a situation that predicts it with a higher absolute probability would be more likely, I don't think that relative probability alone can yield predictive heuristics.

I mean, what you're saying is that if I don't know whether I'm in an environment where everyone has different numbers or everyone has the same number, no matter which number I get, I should predict that everyone has the same number. Or at least that's sort of what it sounds like. Now, if I had to guess everyone else's number, I'd guess it was the same as mine, because I don't have anything better to guess. But that doesn't mean I know anything more about whether everyone really does have the same number.
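The point above can be put in Bayesian terms. As a minimal sketch (with an assumed toy setup of ten observers and digits 0-9): under one hypothesis everyone holds the same uniformly chosen digit, under the other each observer holds a distinct digit. Either way, a random observer sees any particular digit with probability 1/10, so seeing your digit shifts nothing between the two hypotheses.

```python
# Hypothetical setup: ten observers, digits 0-9.
# H_same: everyone holds one uniformly chosen digit.
# H_diff: each observer holds a distinct digit.
from fractions import Fraction

# Likelihood of seeing your particular digit under each hypothesis.
likelihood_same = Fraction(1, 10)  # the shared digit was uniform over 10
likelihood_diff = Fraction(1, 10)  # you are a uniform observer among ten

prior_same = Fraction(1, 2)

# Bayes' rule: posterior odds = prior odds, since the likelihoods match.
posterior_same = (prior_same * likelihood_same) / (
    prior_same * likelihood_same + (1 - prior_same) * likelihood_diff
)
print(posterior_same)  # 1/2: the evidence is neutral
```

Guessing that everyone else's digit matches yours is still the best available guess under either hypothesis; the calculation just shows it doesn't tell you which hypothesis is true.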

If you know the relative probabilities, you can assume you're any observer. I don't think it works in reverse.

The Doomsday argument doesn't look very predictive. I mean, every single generation except ours that tried to use it would be wrong, right? And we have no reason to think that we'll be different, right? According to the "predict the present given the past" clause, this heuristic is no good.
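As a hedged illustration of that point, take Gott-style reasoning as an assumed formalization: a generation at rank g among an eventual total of N generations predicts, at even odds, that N <= 2*g. In a world that actually runs for N generations, that even-odds prediction is wrong for every generation in the first half of history.

```python
# Assumed formalization (Gott-style): generation g predicts N <= 2*g
# at 50% confidence. Count how many generations get it wrong.
N = 1000  # assumed total number of generations in this toy world

wrong = [g for g in range(1, N + 1) if not (N <= 2 * g)]
print(len(wrong))      # 499: every generation with g < N/2 is wrong
print(len(wrong) / N)  # 0.499
```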

Actually, I don't know if I accept the Doomsday argument. Maybe when I decide, I'll add another adjective to Yudkowsky's Modified Anthropic Occam's Razor.

But if I did accept the Doomsday argument, given the number of times I have to use the Anthropic principle to explain my own existence - once for the noncomputable qualia, and another time for being a Specialist, and another time for having the chance to do something fun and important, e.g. Singularity - I'd have to assume I was a computer simulation, right? I mean, I'm something out of science fiction, so I'd have to assume I was *actual* science fiction. There are more books than people, right? This set of assumptions explains all the facts at a much higher relative probability, right?

So, insofar as I still tentatively think the world is real, and I'm not just being run as a test case to determine whether a certain type of IEer is trustworthy, and I'm not a book, then I'd say I'd also have to reject the Doomsday argument. Of course, if you really exist, you're welcome to think differently.

-- 
        sentience@pobox.com          Eliezer S. Yudkowsky
         http://pobox.com/~sentience/AI_design.temp.html
          http://pobox.com/~sentience/singul_arity.html
Disclaimer:  Unless otherwise specified, I'm not telling you
everything I think I know.