Re: Hiveminds and the Great Filter

Eliezer S. Yudkowsky (sentience@pobox.com)
Wed, 10 Mar 1999 10:41:33 -0600

I can't speak to whether or not this is a Steady State Universe, but I would like to note that this *would* solve the Great Filter Paradox. Even if the Powers expand at lightspeed, or faster, or MUCH faster, there would still, always and eternally, be islands with no Singularities for an arbitrary distance.

The question would be working out the formula, using (say) lightspeed-geometric vs. exponential growth-rate assumptions, for the probability that a given island is absorbed during a given time period. If this number is too high, then we get the result that, even though we've managed to survive until today, the odds are still 1000:1 (for!) that we'll be absorbed by tomorrow, or maybe even during the next second.
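
For concreteness, here's a quick toy calculation. It's only a sketch under made-up assumptions: Singularities are seeded as a Poisson process at some rate LAMBDA per unit volume per unit time and expand at lightspeed C, so a given island has escaped absorption only if no Singularity has yet arisen in its past light cone. The names LAMBDA and C and the numbers plugged in are purely illustrative, not anything established.

import math

LAMBDA = 1e-5   # hypothetical seeding rate: Singularity births per (unit volume * unit time)
C = 1.0         # expansion speed (lightspeed in the model's units)

def cone_volume(t):
    # 4-volume of the past light cone of a point at time t:
    # integral from 0 to t of (4/3)*pi*(C*(t - s))**3 ds = (pi/3)*C**3*t**4
    return (math.pi / 3.0) * C**3 * t**4

def survival_probability(t):
    # P(a given island is still un-absorbed at time t) under the Poisson-seeding assumption
    return math.exp(-LAMBDA * cone_volume(t))

def absorption_odds_next_period(t, dt=1.0):
    # Odds (for) of being absorbed during [t, t+dt], given survival up to t.
    # P(absorbed) = 1 - exp(-x), so odds = (1 - exp(-x)) / exp(-x) = exp(x) - 1.
    x = LAMBDA * (cone_volume(t + dt) - cone_volume(t))
    return math.expm1(x)

for t in (10.0, 50.0, 100.0):
    print("t=%6.1f  P(survive)=%.3e  odds of absorption next period = %.3e : 1"
          % (t, survival_probability(t), absorption_odds_next_period(t)))

The point is just that, because the past light cone's 4-volume grows like t^4, the conditional odds of absorption per period climb without bound even at lightspeed expansion; with exponential rather than geometric growth they climb faster still.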

In this case, Yudkowsky's Modified Anthropic Occam's Razor kicks in and basically says "This theory has failed so many times that we might as well assume it's wrong."

YMAOR: That simplest explanation is true, which most uniquely predicts the present given the past, with the least reference to the Anthropic Principle.

MAOR: That simplest explanation is true, which predicts the present given the past, with the least reference to the Anthropic Principle.

AOR: That simplest explanation is true, which least invokes the Anthropic Principle.

OR: The simplest explanation is the most probable.
--

        sentience@pobox.com          Eliezer S. Yudkowsky
         http://pobox.com/~sentience/AI_design.temp.html
          http://pobox.com/~sentience/singul_arity.html
Disclaimer: Unless otherwise specified, I'm not telling you everything I think I know.