Security and Bad Tech (was Re: LIST: the Gooies)

O'Regan, Emlyn (
Tue, 20 Apr 1999 16:41:01 +1000

There appears to be a quandary over what to do about Naughty Technology (NT), which can do bad, bad, naughty things to everybody. For instance, nanotechnology:

Nanogirl says:
>There are security problems that will have to be considered in light of
>new technology, molecular and nano. Policy makers and legislatures
>do have committees to evaluate these advancing technologies. But I
>have no idea how well they can handle them. Hackers will be a threat
>to these arenas. Hackers, phreakers and crackers will be brought
>forward and given legitimate jobs at sealed labs to hack through previously
>constructed systems within assemblers etc. to exploit the leaks and
>verify what need be.
>Gina "Nanogirl" Miller
>Greg Burch writes,
>>> I have been convinced by some pretty rigorous reasoning,
>>> explained by some pretty smart people, that one does NOT need
>>> to make a "genie machine" or even a "general-purpose assembler"
>>> to cause Very Bad Things to happen with molecular scale
>>> technology. In fact, I'm so convinced of this that I think
>>> it's irresponsible to discuss the details of such ideas in an
>>> open, public forum.
> Lyle says...

>>As for discussing it openly, I think any hacker who can attempt such a
>>thing can figure out how to do it by himself.
>>If they really start doing this, God help us all.

But what should be done about NT? All kinds can spring up: seed AIs that go transhuman, nanotech, time travel? Have I missed any? Nuclear weapons count. Computer viruses?

Is it dangerous to discuss these things in an open, public forum? Much criticism of the nuclear industry and the arms race derives from the behind-closed-doors nature of most of it. Luckily the technology appears to be very expensive, so not every little tinpot country has its own nukes (until now!). Nevertheless, we have lived in the shadow of total annihilation for a significant period of time.

Would it have helped to hide the science, sweep discoveries under the carpet? Probably not, because it was an idea whose time had come. This kind of knowledge will come out.

So with these future technologies, do they get locked away and never looked at? Legislated against? Held protectively by a small group of "guardians" (whoever that might be)? Published a bit, but made to look boring and trivial? Or flaunted publicly, openly?

My guess is that, no matter what you do, the information will come out. That doesn't help if you're worried about being eaten by grey goo. Maybe the information can come out slowly, to give people (who?) time to prepare antidotes ahead of the plague. But how is this handled? Who gets to be trusted with the future of humanity? Scientists (there's a scary thought)? Industry (there's a scary thought)? Government (there's a scary thought)? How about ten people chosen at random through the lotteries?

I think even the complex answers to this complex question are wrong. But the nuclear arms race is surely some comfort to those worried about NT. It hasn't been pleasant, and it still isn't, but we're not dead yet. It's pretty similar - one slip, and BLAM! Actually, grey goo must spread more slowly than total nuclear war, which is some help.

I remember reading Andrew Tanenbaum on computer security. He had numerous case studies of systems where the main security lay in the idea that no-one knew how the security system worked, and so no-one knew the backdoors. This, of course, is a fantastic fallacy, because people will find out, and you won't be ready for it. You must always assume that everyone (all the bad guys) knows how your security system works, and make sure that they can't get in anyway.
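That fallacy versus the sound alternative can be sketched in a few lines of modern code (my illustration, not Tanenbaum's; the function names and the HMAC-based design are my own assumptions). The first check is "secure" only while the source stays hidden; the second follows Kerckhoffs's principle - the whole algorithm can be published, because the only secret is a replaceable key:

```python
import hashlib
import hmac
import secrets

# Security through obscurity: the "protection" is that nobody knows
# about the hardcoded backdoor. The moment the code leaks, it's over,
# and there is nothing you can rotate or revoke.
def obscure_check(password):
    return password == "debug1999"  # hidden backdoor - fatal once known

# Kerckhoffs-style design: assume the enemy has the source. The
# algorithm (HMAC-SHA256) is public; only the key is secret, and a
# compromised key can simply be replaced.
KEY = secrets.token_bytes(32)

def sign(message):
    """Produce an authentication tag for a message using the secret key."""
    return hmac.new(KEY, message, hashlib.sha256).hexdigest()

def verify(message, tag):
    """Check a tag in constant time, so timing leaks nothing either."""
    return hmac.compare_digest(sign(message), tag)

tag = sign(b"open the lab door")
print(verify(b"open the lab door", tag))   # genuine request passes
print(verify(b"release the goo", tag))     # forged request fails
```

The point is exactly the one above: publishing `verify` costs the defender nothing, while publishing `obscure_check` destroys it completely.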

Similarly with NT, I don't think that you can hide it away in secret government files, and hope to keep the world safe. You could try, but you must always assume that everyone knows all about it anyway. That way, there are no surprises.

Maybe a combined approach - hide the info, but work as though you've published it on the 'net. Ultimate paranoia. But then, should such paranoiacs be in charge of such dangerous info? What other options are there?

Then what do you do if someone else less diligent discovers the same technology separately? Send them a stern e-mail? Nuke 'em? Sic grey goo on 'em?

You know that if you developed a fully software AI that could run on someone's Pentium (or Mac or unix thingy), rewrite itself, and replicate around the 'net (singularity!), and you wanted to actually put it to use in the world, that it would be impossible to guard it properly. Soon enough, the source, or similar source, or wildly different source with similar/better results, would turn up on someone's homepage, with a link under their resume, next to a picture of their dog.

I don't think you can hide this stuff, but I don't think you can seriously distribute it widely either, especially if it is cheap to implement.

Is someone still working on that Mars colony? I've changed my mind, I'm coming...