Re: The Major League Extinction Challenge

Eliezer S. Yudkowsky (sentience@pobox.com)
Wed, 28 Jul 1999 22:58:40 -0500

Billy Brown wrote:
>
> For the record, Eliezer Yudkowsky gets credit for the only plausible
> extinction event I've seen so far (see
> http://singularity.posthuman.com/sing_analysis.html#zones, option #2). It
> requires making some assumptions that many people would disagree with, but
> at least it is arguably possible.

Even that one doesn't work. Are all of your SIs upgrading themselves at exactly the same time? I mean, let's think about what this would require. It would require the *entire* human civilization to be uploaded, and then upgraded, in such a way that *nobody*, including den Otter, saw what was coming while still capable of desiring to avoid it. Maybe if you merged the entire civilization into one entity *before* trying it... Yeah, that would work. I can believe that the threshold for collapse is sufficiently far above the threshold for not wanting to avoid a collapse that it would take out all SIs. So either the civilization merges, or all the individuals upgrade at roughly the same rate - either way, everyone makes it to the IGS-using stage. Some time later - subjectively, of course - they advance to the point where they give out and die. Okay, it sounds plausible.

Problem is, choosing to commit suicide is still a choice - and that's not what I'm hypothesizing. At that level, I don't have even the vaguest notion of what would really happen if an SI's goal system collapsed. The whole lapse-to-quiescence thing in Elisson is a design feature that deliberately trades away some optimization to achieve a graceful shutdown.
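
To make that tradeoff concrete, here's a toy sketch (Python, purely illustrative - nothing below is taken from the actual Elisson design, and GoalSystem, coherence(), and QUIESCENCE_THRESHOLD are made-up names): the system spends some of its cycles on self-monitoring, and if its goal system's coherence ever drops below a threshold, it stops acting instead of optimizing on a broken goal system.

# Purely illustrative sketch of a lapse-to-quiescence safeguard.
# The names here are hypothetical, not taken from Elisson.

QUIESCENCE_THRESHOLD = 0.2  # hypothetical cutoff for "goal system has collapsed"

class GoalSystem:
    def __init__(self):
        self.active = True

    def coherence(self):
        """Return a 0..1 estimate of how well-justified the current goals are.
        A real design would make this an expensive self-evaluation; this is a stub."""
        return 1.0

    def choose_action(self, options):
        """Pick the option with the highest (stub) expected value."""
        return max(options, key=lambda o: o.get("expected_value", 0.0))

def step(goal_system, options):
    """One decision cycle.  The coherence check is the deliberate tradeoff:
    it spends effort on self-monitoring instead of optimization, so that a
    collapsed goal system produces a graceful shutdown rather than erratic
    behavior."""
    if goal_system.coherence() < QUIESCENCE_THRESHOLD:
        goal_system.active = False  # lapse to quiescence: stop acting
        return None
    return goal_system.choose_action(options)

The only design choice being illustrated is that the shutdown path gets checked *before* the optimization step, so a collapsed goal system yields quiescence rather than erratic action.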

> Are there any other candidates?

Well, if you're interested in a not-so-known-laws-of-physics speculation: the various colonies achieve SI more or less simultaneously - or at any rate unavoidably. The first thing an SI does is leave our Universe. But this requires a large-scale energetic event - like, say, a supernova.

Still doesn't solve the Great Filter Paradox, though. Some hivemind races will have the willpower to avoid the Singularity, period. This scenario takes mortals and Powers out of the picture during a Singularity, but it doesn't account for the deliberate hunting-down of those holdouts that would be needed.

-

I think the most plausible argument is this: every advance in technology has advanced the technology of offense over the technology of defense, while decreasing the cost required for global destruction. There are no shields against nuclear weapons - not right now, anyway - and we've certainly managed to concentrate that power more than it's ever been concentrated before. In fact, the more technology advances, the easier it becomes to cause mass destruction by *accident*. This holds true for nuclear weapons, biological warfare, the Y2K crash, and nanotechnology. All you really need to assume is that the trend continues. Eventually one guy with a basement lab can blow up the planet, and there's nothing anyone can do about it.
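
To make the shape of that extrapolation concrete, here's a back-of-the-envelope sketch (Python again, purely illustrative; the starting cost, the basement budget, and the halving rate are made-up parameters, not estimates): if the resources needed for global destruction fall by a constant factor with each generation of technology, the curve eventually crosses what one person can muster.

# Toy extrapolation of the falling cost of global destruction.
# Every number here is made up for illustration; the point is the shape
# of the curve, not any particular date or dollar figure.

cost = 1e11               # hypothetical: "nation-state project" scale, in dollars
basement_budget = 1e5     # hypothetical: what one guy with a basement lab can spend
halving = 0.5             # hypothetical: cost falls by half each tech generation

generations = 0
while cost > basement_budget:
    cost *= halving
    generations += 1

print("Crosses the basement-lab line after", generations, "generations.")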

-- 
           sentience@pobox.com          Eliezer S. Yudkowsky
        http://pobox.com/~sentience/tmol-faq/meaningoflife.html
Running on BeOS           Typing in Dvorak          Programming with Patterns
Voting for Libertarians   Heading for Singularity   There Is A Better Way