Re: Singularity Mind-Benders

Scott Badger (wbadger@psyberlink.net)
Tue, 12 Jan 1999 07:44:44 -0600

> From: Eliezer S. Yudkowsky <sentience@pobox.com>
> Fun Problems for Singularitarians:
>
> ==
>
> If mortal life is totally meaningless, it would be logical to
> exterminate them for their spare atoms. Mortals, knowing this, will
> refuse to create Singularities. If mortals could bargain with the
> Singularity, it would obviously be to the Singularity's advantage to set
> aside a quadrillionth of computing power for Permutation-City-style
> accommodations, in return for existing at all. But we can't bargain with
> the Singularity until after we've created it and our hold is gone.
> Bearing this in mind, how can you bind the Singularity to the bargain?
> What is the Singularity's logical course of action?

Sounds to me like your premise is strained. A completely logical SI would recognize that its own existence has no more "meaning" than human existence does. It would have to self-destruct if it determined that meaninglessness was a sufficient criterion for extermination. IMO, there's no such thing as meaning; there's only relevance. And if our existence became irrelevant to the SI, why would it logically follow that it would then go out of its way to exterminate us?

> Bonus question one: Suppose you have a time machine that can ONLY
> convey the information as to whether or not the Singularity will happen.
> If you change your plans as a result of the indicator, it resends.
> This is one bit of information and thus can be modulated to convey
> messages from the future. How do you negotiate?

This riddle is too fuzzy for me. Sorry.

> Bonus question two: In both cases above, it is necessary to plausibly
> threaten not to create a Singularity. The only other option, in the
> long run, is exterminating the human race. This has to be able to
> plausibly happen, either as a logical consequence or as an alternate
> future. How do you force yourself to destroy the Earth?

Dunno; I'm still working on my logic skills.

> If a Singularity is a good thing, why haven't earlier Singularities sent
> robot probes to help it happen? If SIs commit suicide, why isn't the
> whole Universe full of mortals?

I agree with Justin Jones, who put it succinctly:

"I dont care if people say there is a certain probability of other intelligent life out there, I see no evidence of past singularities or other civilizations so I don't assume there have been any."

> How can you fight your future self, who automatically knows all of your
> plans, including the ones you're making right now? What if the future
> self is a transhuman?

You enlist the aid of comrades whose actions will be unknowable to your future self. If my future self is a transhuman . . . I'll leave him alone! Whatever he's doing will be in my best interest, regardless of how it appears. :-)

> Is there any way to oppose a Power running a simulation of you?

Isn't there a sci-fi novel about this? "Permutation City" perhaps?

Scott Badger