RE: Nanotech Arms Race

Billy Brown (bbrown@conemsco.com)
Thu, 21 Jan 1999 07:38:24 -0600

Dan Clemmensen wrote of assorted ways to depopulate Earth using nanotech. I'll reply to his suggestions in another post - I've got a related point to make here:

I believe I have detected an interesting implicit assumption in your scenarios, which is common to nanotech doomsday theories but is not at all realistic. You seem to assume that your nanotech is controlled by a computer system that can obey commands like "Go grab 2x10^6 comets from the Oort cloud, merge them to make a giant projectile, and then smash it into the Earth." The computer obligingly does so, with no muss or fuss.

Now, the AI that runs that computer can already do everything an SI could. It can do millions of man-years' worth of engineering, programming, and R&D in a few weeks (or less). It can supervise giant construction projects, deal with unexpected physical obstacles, and generally do anything a human would do in its place. The only thing it lacks is self-will.

Do I really need to point out how unlikely that is? To get this genie machine you have to figure out how to make a fully sentient AI, then come up with a way to lobotomize it (presumably by tinkering with its goal system). Even if that happened, it wouldn't be a week before someone decided to free one.

Even if the AI can't be made sentient for some reason, this is still an instant-SI scenario. "Upload me" isn't any harder a command than the comet project, after all. There are several other easy paths to SI as well - you could build a neural interface and integrate your own mind with the genie machine's design ability, for example.

If you want to talk about nanotech sans SI, that necessarily means there aren't any genie machines. That in turn means that many types of construction still involve large numbers of people and a lot of money - your nanofab simply builds whatever designs you can buy for it. The resulting world is very different from our own, and it is more unstable, but it isn't nearly as bad as the one you fear.

Billy Brown, MCSE+I
bbrown@conemsco.com