Re: Singularity Mind-Benders

Samael (Samael@dial.pipex.com)
Tue, 12 Jan 1999 10:40:05 -0000

-----Original Message-----
From: Eliezer S. Yudkowsky <sentience@pobox.com>
>If mortal life is totally meaningless, it would be logical to
>exterminate them for their spare atoms. Mortals, knowing this, will
>refuse to create Singularities. If mortals could bargain with the
>Singularity, it would obviously be to the Singularity's advantage to set
>aside a quadrillionth of computing power for Permutation-City-style
>accommodations, in return for existing at all. But we can't bargain with
>the Singularity until after we've created it and our hold is gone.
>Bearing this in mind, how can you bind the Singularity to the bargain?
>What is the Singularity's logical course of action?

Lie to us. "Certainly. You'll have whole universes to play with. No problem <snicker>."

>Bonus question one: Suppose you have a time machine that can ONLY
>convey the information as to whether or not the Singularity will happen.
> If you change your plans as a result of the indicator, it resends.
>This is one bit of information and thus can be modulated to convey
>messages from the future. How do you negotiate?

Get your advanced AI that could become a singularity. Bargain with it. Expect it to lie to you. Give up.

>Bonus question two: In both cases above, it is necessary to plausibly
>threaten not to create a Singularity. The only other option, in the
>long run, is exterminating the human race. This has to be able to
>plausibly happen, either as a logical consequence or as an alternate
>future. How do you force yourself to destroy the Earth?

I thought we were just going to allow humanity to die out. Who's destroying the Earth (still beyond our capability, I believe)?

All structures break down in time; unless the singularity finds an endless supply of energy (not just a very large one, an infinite one), it will eventually succumb to entropy. What we need to do is maximise our overall life worth. Whether that is 5 million years of 'quite a good time' or a thousand years of 'extremely good time', if creating a singularity reduces it to 'fifteen minutes of being disassembled', I'm not in favour of it. Unless being disassembled is a lot more fun than it sounds.

>If a Singularity is a good thing, why haven't earlier Singularities sent
>robot probes to help it happen? If SIs commit suicide, why isn't the
>whole Universe full of mortals?

Too early in the universe? SIs don't want competition and space travel is too much of a pain in the neck to be worthwhile? God doesn't like it and has put impervious crystal shells around each solar system (cf. a David Brin short story)? They are out there, but are making too much money broadcasting the game show "Humans" back to their home planet to interrupt our fun (cf. Robert Rankin, Armageddon: The Musical)?

>How can you fight your future self, who automatically knows all of your
>plans, including the ones you're making right now? What if the future
>self is a transhuman?

I'm sure I've read this somewhere, possibly in a very old issue of Warlock, but I can't be certain. If your future self can fight you, it implies that the future is set, in which case whatever happens, happens - it's all predestination.

>Is there any way to oppose a Power running a simulation of you?

Transfer your simulation into a body outside the simulation and find its off switch (the large red button right next to the 'self-destruct' button).

Humans can no more fight Powers and Singularities than ants can fight humans. Sheer coincidence can once in a while allow an ant to give a human an infectious disease, but that's about the limit.

Samael