Re: Is this world a computer simulation?

Matt Gingell (mjg223@nyu.edu)
Mon, 6 Sep 1999 18:49:03 -0400



From: Eliezer S. Yudkowsky <sentience@pobox.com>

>Which makes sense, except that I'm speculating that the Way Things Are
>has "evolved" to maximize the reproductive rate of the simulations. In
>which case, there's an obvious adaptive selection pressure for rewriting
>nonconformist Powers - or, on the very dimly bright side, stopping
>nanodisasters - that doesn't exist for stopping random suffering or
>optimizing for pleasure. Such worlds may exist, but they don't
>reproduce. Besides, at least one major sub-hypothesis is that the
>Powers involved are insane.

If a universe were adapted enough to prevent nanodisaster among its children, then surely it would be pushing us towards the Singularity hard enough that we'd notice. In any case, you'd expect something that highly evolved to short-circuit the process entirely, launching a simulation of itself the instant it was itself started.
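For what it's worth, here's that regress as a toy Python sketch (purely illustrative; the function and names are made up, not anyone's actual model): a universe maximally adapted for reproduction does nothing but immediately spawn a copy of itself, and so on all the way down.

    # Toy sketch (hypothetical): a universe "adapted" to maximize the
    # reproductive rate of simulations would short-circuit everything else
    # and spawn a child simulation of itself the instant it starts --
    # which is just unbounded recursion.

    def run_universe(depth=0):
        print(f"simulation depth {depth}")
        run_universe(depth + 1)  # instant regress; Python eventually raises RecursionError

    # run_universe()  # uncomment to watch the stack blow before any physics happens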

But there's no point in trying to generalize from just our node of the search tree. There are too many unknowns to make this an interesting discussion, even if God isn't nuts.

>> If pleasure were being maximized, we'd be disembodied strings of code floating
>> in virtual tanks full of virtual opiates. If pain were being minimized, we
>> wouldn't be here. If an optimal compromise between the two had been found,
>> there'd be nothing in the universe but endless mirrors of Earth. The incredibly
>> arbitrary nature of the universe at a macroscopic level makes me doubt there's a
>> God paying attention to us, synthetic or otherwise.
>
>Oh, you mean the way that if Powers exist they should expand at
>lightspeed, and if alien races exist they should expand pretty fast
>anyway, and Earth's sun is hardly old, so therefore intelligent life is
>impossible and we aren't here?

I'm not going to list all the hypothetical solutions to Fermi's paradox here.

>"Unlightenment": The stage at which you know so much about the
>Universe, and you've accumulated so much to be explained and have
>learned to create such rigorous explanations, that you have more
>constraints than degrees of freedom - and they don't fit. I cannot
>think of *any* good explanation for the Great Filter Paradox. My
>visualization of the Universe has now reached the point where it
>contains active impossibilities. I give up. Did I mention that I've
>given up? Well, I've given up. All I bloody know about the bloody
>facts is that the bloody answer is probably bloody obvious to any entity
>with half a bloody brain, so I'm going to treat the bloody world like
>it's actually real, bloody impossibilities and all, and try to
>manipulate the observed bloody regularities so a transhuman pops out, at
>which point my *bloody* job is bloody well OVER.

Here's my plan - focus on the one tiny corner of the world I think I can make some sense of, think I'm on to something, work like a dog for years and years, let my ego get away from me and publish ludicrously overblown projections, fail spectacularly and get laughed out of the scientific community in a flap that makes the cold-fusion thing look friendly, drink myself halfway to oblivion, and end up choking on my tongue in a Budapest hotel the day before Science publishes the article explaining everything.

>> ps. You used the word 'culture' with a capital C the other day. Was that an Iain
>> Banks reference?
>
>Hm. Where?

From: Eliezer S. Yudkowsky <sentience@pobox.com>
Sent: Thursday, September 02, 1999 10:34 AM
Subject: Re: >H RE: Present dangers to transhumanism

So then why aren't the aliens here already? Equipped with nanotechnology, they sweep from star to star at slower-than-light speeds, engaging in Dysoneering and playing Culture to primitive cultures, and run out of stars in any given galaxy in, say, less than a million years. No spacetime engineering; that's a Power's game.
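Incidentally, that "less than a million years" figure checks out on the back of an envelope, assuming a Milky-Way-sized disk of roughly 100,000 light-years across (a toy Python sketch; the speeds are arbitrary round numbers, not anyone's prediction):

    # Back-of-envelope check of the galactic sweep time: crossing time in
    # years = distance in light-years / speed as a fraction of c.

    GALAXY_DIAMETER_LY = 100_000   # rough Milky Way disk diameter, in light-years

    for fraction_of_c in (0.5, 0.2, 0.1):
        years = GALAXY_DIAMETER_LY / fraction_of_c
        print(f"at {fraction_of_c:.1f}c: ~{years:,.0f} years to cross the galaxy")

    # at 0.5c: ~200,000 years; even at a leisurely 0.1c it's only ~1,000,000
    # years -- so slower-than-light colonization really does exhaust a galaxy
    # on roughly that timescale, ignoring stopover time at each star.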