Eliezer S. Yudkowsky [sentience@pobox.com] wrote:
>Once Zyvex has nanotechnology,
>I'd be fully in favor of their immediately conquering the world to
>prevent anyone else from getting it. That's what should've been done
>with nuclear weapons. If the "good guys" refuse to conquer the world
>each time a powerful new weapon is developed, sooner or later a bad guy
>is going to get to the crux point first.
You know what scares me the most about the future? All these control freaks and their desire to take over the world to protect themselves from the "bad guys"; Eliezer and den Otter are the most obvious proponents on this list. We must all support the "good guys" in taking over the world and introducing their global surveillance utopia while we await The Coming Of The Glorious Singularity!
Look, Eliezer, we know you're a rabid Singularitarian, but to those of us who actually work on developing advanced hardware (my employer designs chips at least as complicated as anything coming out of Intel), the idea that this new technology will appear and then within a few days we'll be surrounded by nanotech death machines and massively intelligent AIs is blatantly absurd. Building hardware at the nanoscale is difficult enough, but the software is way, way behind; there are features we've had in our chips for years that applications are only just starting to use, and developers aren't even beginning to exploit the computing power we're giving them in anything but the most simplistic ways. No matter how powerful the hardware, the software will be a long time coming, even with neural nets or genetic programming.
Mark