Eliezer S. Yudkowsky wrote:
> Nanotechnology is a wild card that could stay unplayed or enter the game
> at any time. Nanotech's first applications will be entirely
> destructive. The researchers at Zyvex or Foresight will naively release
> the information and someone will reduce the Earth to grey goo.
> Most probable kill: Grey goo; nuclear war provoked by a
> nanotech threat.
Nobody does pessimism like a countersphexist, hmm? We could argue all day about the potential for gray goo, but I can at least assure you that the Foresight people don't take it lightly. They've put a good bit of thought into how to avoid it, and I expect they will continue to do so.
As far as the puny stuff goes, nukes won't end civilization. This is a myth perpetuated by people who haven't studied the numbers. It would be feasible to build enough high-yield weapons to do the job, but even at the height of the Cold War we never came close to doing it. Today, the best we could do would be to knock ourselves back to a pre-WWII industrial base for a couple of decades. The death toll would be huge, but we would still end up with a Singularity.
> Humanity's primary hope of survival lies in a quick kill via AI, and the
> best way I see to do that is an Open Source effort on the scale of
> Linux, which I intend to oversee at some point. Some IE via
> neurohacking may be developed fast enough to be decisive, and the
> existing Specialists (such as myself) may be sufficient.
Where do I sign up? You've seen my own projection by now - I want to make sure that if you get hit by a truck halfway through the project, the damn thing still has a decent chance of being sane.
Billy Brown, MCSE+I