RE: Singularity: AI Morality (AI containment)

Billy Brown
Wed, 9 Dec 1998 15:08:38 -0600

Brian Atkins writes:
> I'm curious if there has been a previous discussion on this
> list regarding the secure containment of an AI (let's say a
> SI AI for kicks)? Many people on the list seem to be saying
> that no matter what you do, it will manage to break out of
> the containment. I think that seems a little far-fetched....

Here's why I don't think containment is feasible for an SI:

  1. No one has ever achieved it for programs written by humans. Some operating systems get close, but I can't think of one that has never had a serious security issue.
  2. Supporting an SI AI would require much faster machines than what we have now, running much more complex programs. This makes the problem even worse.
  3. An AI will be much better at programming than humans. That means its efforts to get around our security will be much sneakier, and more complex, than those of human hackers. See #1.

Even if you make a perfectly secure sandbox, we still aren't safe. Never underestimate the security risk posed by social engineering:

  4. You have to have human/AI contact to have any idea what the AIs are like. This opens up lots of potential problems: the AI can talk someone into letting it out, bribe them to do it, 'give away' useful (or fantastically valuable) programs that contain seeds of itself, and so on.

  5. Don't forget the legal front. The AI could try to convince people that it is a person and that you are keeping it as a slave (not hard to do, since that's exactly what is happening). If it acts as its own lawyer, you're probably going to lose the case.

  6. Do reporters ever talk to the AI? Of course they do. Think of the PR campaign the 'poor, helpless, exploited' AI could mount.

Some of these problems are bigger than others, but that isn't the point. The real problem is that I thought of all these approaches in the space of 15 minutes, and I'm only human. What is something with an IQ of 1,000 (or worse, 1,000,000) going to come up with?

Billy Brown