Re: Posthuman mind control (was RE: FAQ Additions)

Michael S. Lorrey (retroman@together.net)
Thu, 25 Feb 1999 15:29:24 -0500

Billy Brown wrote:

> Nick Bostrom was arguing in favor of programming a fundamental moral system
> into the AI, and then turning it loose with complete free will. My argument
> is that this is very unreliable - the more complex a mind becomes, the more
> difficult it is to predict how its moral principles will translate into
> actions. Also, an intelligent entity will tend to modify its moral system
> over time, which means that it will not retain an arbitrary set of
> principles indefinitely.

Yes; however, how an individual's principles evolve will have a direct impact on that individual's fitness to survive. I think, though, the opposite of what you do: the more intelligent an AI is, the more rigorously logical it will be, and thus it will actually be far more predictably reliable than a less intelligent AI. Free will has far more to do with how we rationalize the things we do. When we do something, we or others wonder why we did it. We rationalize an explanation for it, and thus program ourselves to respond similarly in associated situations. An AI that is more intelligent than we are will likely be far more logical in rationalizing its actions than we are.

>
> Now, I don't think that ongoing mental coercion is a good idea either, but
> that's a different line of argument. I would expect that you could devise
> an effective scheme for controlling any static mind, so long as it isn't too
> much smarter than you are. If you want to control something that is
> self-modifying you've got big problems - how do you design a control
> mechanism that will remain effective no matter what your creation evolves
> into?

You create a blind spot. In the blind spot sits the 'conscience' kernel, which cannot be directly manipulated. It can only be programmed by experiential data input, which it analyzes for useful programming content. It colors this new content with its existing meme set before integrating it into its accumulated database of 'do's' and 'don'ts'. The entire database gets to vote on every decision, so new content cannot completely wipe out old content, except under extremely stressful circumstances (i.e. HOT STUFF HURTS!).
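
Just to make the mechanism concrete, here is a toy sketch in Python of what I have in mind. The class name, weighting scheme, and salience numbers are all my own hypothetical illustration, not an actual design:

    # Toy sketch of the 'conscience' kernel idea above. All names,
    # weights, and thresholds are hypothetical illustration only.

    class ConscienceKernel:
        def __init__(self):
            # Accumulated database of lessons ('do's and 'don'ts'),
            # each with a weight built up from experience.
            self.rules = {}  # lesson -> weight

        def integrate(self, lesson, salience=1.0):
            # New content is 'colored' by the existing meme set:
            # its starting weight is scaled by how well it agrees with
            # what is already in the database, so it cannot simply
            # overwrite old content -- unless salience is extreme
            # (HOT STUFF HURTS!).
            agreement = self._agreement_with_existing(lesson)
            weight = salience * (0.5 + 0.5 * agreement)
            self.rules[lesson] = self.rules.get(lesson, 0.0) + weight

        def decide(self, relevant_rules):
            # Every relevant rule in the accumulated database gets a
            # vote, weighted by how strongly it has been reinforced.
            tally = 0.0
            for lesson, approves in relevant_rules:
                weight = self.rules.get(lesson, 0.0)
                tally += weight if approves else -weight
            return tally > 0.0

        def _agreement_with_existing(self, lesson):
            # Placeholder: a real kernel would compare the new lesson
            # against the existing meme set; here it stays neutral.
            return 0.5

    if __name__ == "__main__":
        kernel = ConscienceKernel()
        kernel.integrate("hot stuff hurts", salience=10.0)  # painful lesson
        kernel.integrate("be polite to strangers")
        # A heavily reinforced 'don't' vetoes the proposed action:
        print(kernel.decide([("hot stuff hurts", False)]))  # -> False

The point of the sketch is only that a single painful experience comes in with enough weight to matter, while ordinary new lessons get averaged into the existing vote rather than replacing it.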

Mike Lorrey