RE: Security and Naughty Tech (was Re: LIST: the Gooies)

O'Regan, Emlyn (Emlyn.ORegan@actew.com.au)
Tue, 20 Apr 1999 18:47:48 +1000

I blabbed on about NT (naughty technology, i.e. that which could potentially kill us all, like evil hairdryers getting together with even more evil bathtubs, or chickens genetically engineered to contain more cholesterol, the bad kind).

>Nanogirl said:
>However, there are other approaches. For instance, my interest, as you
>call it, in NT (naughty technology - cute) is particularly
>nanotechnology. Although I foresee an awesome benefit to society with
>the evolution of nanotech, I am not blind to the dangers. This thing is
>utopia or doomsday! So, I am involved in groups that relate to my
>subject of concern (and support). Foresight is an example; I have
>become a senior associate and intend to attend meetings. Some of
>these meetings, i.e. the Gathering in May, are "brainstorming"
>sessions, and may involve policy issues. I myself keep tabs on the
>news and groups striving towards the advances of the science. You
>better believe I'll be sitting there to voice my public concerns when
>it reaches that level, wherever that may be. You just have to get
>involved, raise your voice. You don't have to sit back if you don't
>want to.
>
But it's an old problem with no solution so far... the information will come out, and someone could potentially use it wrongly. The proliferation of computer viruses seems a good example of how people love to misuse technology and ignore common sense on these issues.

Maybe I should refine NT to INT (Inexpensive Naughty Technology - I thought Cheap & Nasty Technology might be good, but I don't like the acronym).

Nanotechnology might not be a good example, because it will probably be out of the reach of the everyday hacker (although it'd probably turn up in universities, which is bad enough). But software AIs (on von Neumann machines) are where things could get scary, to take just one example of INT. Accessible, and totally destructive on a bad day.

What can you do about this kind of thing? No regulations, or set of conventions, or anything like that is going to stop such an advance (if it is possible, which I am betting it will be). We could legislate and regulate until we're blue in the face, and end up being wiped out (in any number of ways) by a program which starts "This Krazy Brane created by the Phunky Phreaky Hacker Whackers club, yeah, we are the best".

I've got no idea what you do about this, except hang onto your hat. I'd say some kind of world government might help (sounds of indignant screams, rotten fruit flying). Given the internet, you're going to need someone who can make rules for *everybody* if you're even going to begin to look after such a thing.

Or just be nice to the AIs, and hope they don't see the hammer behind your back.

Just when they thought humanity was crushed, a lone rebel managed to load a self-replicating copy of Windows 98 onto the net. Straight genocide would have been kinder...

Actually, I think the crossbow was hailed as the end of civilisation, and the Jawbone-with-big-bruising-lump probably was too, before that. Or we could all end up smashed by the Y2K bug (Y2Care bug).

Back on the soft AI: flaming might get scary if, instead of sending someone a harsh rebuke (ouch), you could send a nasty psycho-AI which infests their building management system and locks them in an elevator, or turns off the power to their nanobrain, or whatever.

Hmmm... Maybe I should spread that particular meme, and send stupider and stupider e-mails to people who are involved in AI research. If I am particularly irritating, I might be able to stimulate an acceleration in the field, just so people can MAKE ME STOP. I can bring forward the Singularity!

I'm already doing it!

Emlyn
Martyr to the cause

PS: I've worked out how to keep the AIs under control - give them televisions.
