Singularity Mind-Benders

Eliezer S. Yudkowsky
Mon, 11 Jan 1999 21:18:02 -0600

Fun Problems for Singularitarians:


If mortal life is totally meaningless, it would be logical for a Singularity to exterminate mortals for their spare atoms. Mortals, knowing this, will refuse to create Singularities. If mortals could bargain with the Singularity, it would obviously be to the Singularity's advantage to set aside a quadrillionth of its computing power for Permutation-City-style accommodations, in return for existing at all. But we can't bargain with the Singularity until after we've created it and our hold is gone. Bearing this in mind, how can you bind the Singularity to the bargain? What is the Singularity's logical course of action?

Bonus question one: Suppose you have a time machine that can ONLY convey one piece of information: whether or not the Singularity will happen. If you change your plans as a result of the indicator, it re-sends. This is one bit of information per sending, and thus can be modulated to convey messages from the future. How do you negotiate?
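As a toy illustration of the modulation idea (not from the original post; the function names and framing are mine), suppose each re-send carries one bit, with the future side choosing the bit and the present side accumulating them. Then an arbitrary message can be spelled out one bit per round:

```python
# Toy sketch of a one-bit-per-resend channel. The "future" side modulates
# the yes/no indicator; the "present" side forces a re-send each round by
# conditioning its plans on the bit, and reassembles the bit stream.
# All names here are illustrative, not part of the original thought experiment.

def modulate(message: str):
    """Future side: emit the message one bit at a time, MSB first."""
    for byte in message.encode("ascii"):
        for i in range(7, -1, -1):
            yield (byte >> i) & 1

def negotiate(resends):
    """Present side: collect one bit per re-send and decode the message."""
    bits = list(resends)
    out = bytearray()
    for i in range(0, len(bits), 8):
        byte = 0
        for b in bits[i:i + 8]:
            byte = (byte << 1) | b
        out.append(byte)
    return out.decode("ascii")

print(negotiate(modulate("DEAL")))  # prints "DEAL"
```

The point is only that a single repeatable bit suffices for unbounded communication; the hard part of the puzzle is the negotiation, not the encoding.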

Bonus question two: In both cases above, it is necessary to plausibly threaten not to create a Singularity. The only other option, in the long run, is exterminating the human race. This must be able to plausibly happen, either as a logical consequence or as an alternate future. How do you force yourself to destroy the Earth?


If a Singularity is a good thing, why haven't earlier Singularities sent robot probes to help ours happen? If SIs commit suicide, why isn't the whole Universe full of mortals?


How can you fight your future self, who automatically knows all of your plans, including the ones you're making right now? What if the future self is a transhuman?


Is there any way to oppose a Power running a simulation of you?

--         Eliezer S. Yudkowsky

Disclaimer:  Unless otherwise specified, I'm not telling you
everything I think I know.