Re: Is this world a computer simulation?

Eliezer S. Yudkowsky
Mon, 06 Sep 1999 15:36:56 -0500

Matt Gingell wrote:
> This is an interesting line of speculation, but not one that I think is really
> worth worrying about. Given the amount of horror humanity’s gone through this
> century, if we are being simulated by an intelligence interested in minimizing
> suffering, then either it has fundamental reasons for not getting involved – it
> doesn’t want to damage the integrity of the simulation, for instance – or its
> motivations are sufficiently inscrutable to make discussion pointless. If we
> were going to raise a general morality violation, we would have done so by now –
> the sky would have turned dark blue and novas would have lined up to form a
> register dump and the vendor’s 800 number.

Which makes sense, except that I'm speculating that the Way Things Are has "evolved" to maximize the reproductive rate of the simulations. In which case, there's an obvious adaptive selection pressure for rewriting nonconformist Powers - or, on the very dimly bright side, stopping nanodisasters - that doesn't exist for stopping random suffering or optimizing for pleasure. Such worlds may exist, but they don't reproduce. Besides, at least one major sub-hypothesis is that the Powers involved are insane.

On the other hand, one has to wonder at forty thousand years of nontechnological history and fifteen billion years of simulation dangling uselessly, assuming all that was real, which my intuitions say it was. Oh, hell. Who am I to pretend that I have any idea what constitutes a "selection pressure" when it's acting on a recursive chain of gods, most of which are probably insane? I don't even know what the Anthropic Principle means any more; I'm not sure how to compute the relative probabilities.

> If pleasure were being maximized, we’d be disembodied strings of code floating
> in virtual tanks full of virtual opiates. If pain were being minimized, we
> wouldn’t be here. If an optimal compromise between the two had been found, there
> ’d be nothing in the universe but endless mirrors of Earth. The incredibly
> arbitrary nature of the universe at a macroscopic level makes me doubt there’s a
> God paying attention to us, synthetic or otherwise.

Oh, you mean the way that if Powers exist they should expand at lightspeed, and if alien races exist they should expand pretty fast anyway, and Earth's sun is hardly old, so therefore intelligent life is impossible and we aren't here?

"Unlightenment": The stage at which you know so much about the Universe, and you've accumulated so much to be explained and have learned to create such rigorous explanations, that you have more constraints than degrees of freedom - and they don't fit. I cannot think of *any* good explanation for the Great Filter Paradox. My visualization of the Universe has now reached the point where it contains active impossibilities. I give up. Did I mention that I've given up? Well, I've given up. All I bloody know about the bloody facts is that the bloody answer is probably bloody obvious to any entity with half a bloody brain, so I'm going to treat the bloody world like it's actually real, bloody impossibilities and all, and try to manipulate the observed bloody regularities so a transhuman pops out, at which point my *bloody* job is bloody well OVER.

> ps. You used the word 'culture' with a capital C the other day. Was that an Iain
> Banks reference?

Hm. Where?

           Eliezer S. Yudkowsky
Running on BeOS           Typing in Dvorak          Programming with Patterns
Voting for Libertarians   Heading for Singularity   There Is A Better Way