Eliezer S. Yudkowsky wrote:
> If mortal life is totally meaningless, it would be logical to
> exterminate them for their spare atoms. Mortals, knowing this, will
> refuse to create Singularities. If mortals could bargain with the
> Singularity, it would obviously be to the Singularity's advantage to
> set aside a quadrillionth of its computing power for
> Permutation-City-style accommodations, in return for existing at all.
> But we can't bargain with the Singularity until after we've created
> it and our hold is gone. Bearing this in mind, how can you bind the
> Singularity to the bargain? What is the Singularity's logical course
> of action?
If the life of a Power is meaningless, there is no reason to bother disassembling the mortals. If it isn't, it would be logical to upgrade the mortals to a state in which their existence can be meaningful. (Resource scarcity might change that, but it seems unlikely to actually be a factor.)
If that logic fails, I see no way for mortals to bargain with Powers. There is no way to enforce the contract, after all.
> Bonus question one: Suppose you have a time machine that can ONLY
> convey the information as to whether or not the Singularity will
> happen. If you change your plans as a result of the indicator, it
> resends. This is one bit of information and thus can be modulated to
> convey messages from the future. How do you negotiate?
This is another instance of the identity issue. Every time you change the future, you are dealing with a different Singularity. You aren't negotiating; you are exploring the set of possible futures - and you have no way to tell when you hit the one you want.
Important side issue: The future hasn't happened yet. Any communication that moves backwards in time, and is important enough to have an effect on human activities, will significantly change the future it came from. This implies that it is impossible to have a conversation across time, even if you have a temporal communicator - the party in the past is talking to a different future each time he sends a reply.
> Bonus question two: In both cases above, it is necessary to plausibly
> threaten not to create a Singularity. The only other option, in the
> long run, is exterminating the human race. This has to be able to
> plausibly happen, either as a logical consequence or as an alternate
> future. How do you force yourself to destroy the Earth?
Given that you could negotiate, a threat not to create a particular Singularity would seem to be sufficient. But if you needed the big stick, the way to do it is simply to pick the right human to do the negotiating and give him the controls to the doomsday device.
Of course, IMO we couldn't actually build a doomsday device that would work reliably the first time - the only feasible approach would be a series of gigantic nukes (at least 10^4 megatons, maybe bigger), and testing the design would be rather difficult.
> If a Singularity is a good thing, why haven't earlier Singularities
> sent robot probes to help it happen? If SIs commit suicide, why isn't
> the whole Universe full of mortals?
The only response I've ever seen that fits the observations is that there isn't anyone out there. It doesn't seem to make much sense, but then, that just implies that there are important things we don't know yet.
> How can you fight your future self, who automatically knows all of
> your plans, including the ones you're making right now? What if the
> future self is a transhuman?
If it's important enough, kill yourself. Less drastically, become someone you won't end up needing to oppose. Personal evolution is not a random process for rational minds.
> Is there any way to oppose a Power running a simulation of you?
No. A transhuman, sure, but not a Power. At best, it sees everything you will ever think of and lots of things you won't. At worst, it can always foresee exactly what you will do. Either way, you can only beat it if the odds are stacked so hard in your favor that nothing the Power does will let it win - which doesn't seem to be a likely situation.
BTW, what does it need the sim for?
Billy Brown, MCSE+I