>Which makes sense, except that I'm speculating that the Way Things Are
>has "evolved" to maximize the reproductive rate of the simulations. In
>which case, there's an obvious adaptive selection pressure for rewriting
>nonconformist Powers - or, on the very dimly bright side, stopping
>nanodisasters - that doesn't exist for stopping random suffering or
>optimizing for pleasure. Such worlds may exist, but they don't
>reproduce. Besides, at least one major sub-hypothesis is that the
>Powers involved are insane.
If a universe were adapted enough to prevent nano-disaster among its children, then surely it would be pushing us towards Singularity hard enough that we'd notice. In any case, you'd expect something that highly evolved to short-circuit the process by launching a simulation of itself at the instant it started a simulation of itself.
>> If pleasure were being maximized, we'd be disembodied strings of code
>> floating in virtual tanks full of virtual opiates. If pain were being
>> minimized, we wouldn't be here. If an optimal compromise between the two
>> had been found, there'd be nothing in the universe but endless mirrors of
>> Earth. The incredibly arbitrary nature of the universe at a macroscopic
>> level makes me doubt there's a God paying attention to us, synthetic or
>> otherwise.
>
>Oh, you mean the way that if Powers exist they should expand at
>lightspeed, and if alien races exist they should expand pretty fast
>anyway, and Earth's sun is hardly old, so therefore intelligent life is
>impossible and we aren't here?
I'm not going to list all the hypothetical solutions to Fermi's paradox here.
>"Unlightenment": The stage at which you know so much about the
>Universe, and you've accumulated so much to be explained and have
>learned to create such rigorous explanations, that you have more
>constraints than degrees of freedom - and they don't fit. I cannot
>think of *any* good explanation for the Great Filter Paradox. My
>visualization of the Universe has now reached the point where it
>contains active impossibilities. I give up. Did I mention that I've
>given up? Well, I've given up. All I bloody know about the bloody
>facts is that the bloody answer is probably bloody obvious to any entity
>with half a bloody brain, so I'm going to treat the bloody world like
>it's actually real, bloody impossibilities and all, and try to
>manipulate the observed bloody regularities so a transhuman pops out, at
>which point my *bloody* job is bloody well OVER.
>> ps. You used the word 'culture' with a capital C the other day. Was that
>> an Iain Banks reference?
>Hm. Where?
>--
From: Eliezer S. Yudkowsky <sentience@pobox.com>
Sent: Thursday, September 02, 1999 10:34 AM
Subject: Re: >H RE: Present dangers to transhumanism