Re: AI

James Rogers (jamesr@best.com)
Tue, 14 Dec 1999 13:05:13 -0800

On Tue, 14 Dec 1999, Clinton O'Dell wrote:
>>thinking about constructing a solid definition for "intelligence", then
>>think about how you might program a computer to possess this quality, and
>>what use it would be.
>
> This is a very BAD BAD way to go. Instead, work on making it
> self-aware. Consciousness has NOTHING to do with intelligence. Many
> people consider me several times more intelligent than most of my peers;
> does that make me more conscious than them?

Quite possibly you *are* more conscious than them. If you accept that consciousness is "self-awareness", then I think an argument can be made that consciousness is just another way of describing intelligence in action. To me, consciousness is the difference between merely observing the world and observing it with knowledge of the consequences of one's interactions with it. Intelligence would then clearly be the limiting factor on the amount of "consciousness" one could have.

"Awareness" describes the recognition of an information pattern of some type. "Self-aware" simply means that the body of information that is "you" is part of the environment being observed. We as humans have a very limited awareness; it is very easy to create environments and situations where the brain does not possess enough awareness to function adequately (which is why we have reflexes). A sufficiently intelligent system will not only be able to find patterns in an environment, it will also be able to find patterns in the way the environment interacts with its existence and behaviors. In the real world, it is difficult to observe an environment without one's presence causing subtle changes to the environment you are observing. Of course, few AI systems actually have the opportunity to observe the consequences of their existence.

Therefore, I would say that consciousness can be defined as the point where a system has sufficient intelligence to analyze patterns in the way it interacts with its environment. Every AI system I can think of simply analyzes the environment; I could be wrong, but I don't know of any that are capable of analyzing the impact of their own existence on the environment they are in. This feedback loop seems to be the only requirement for self-awareness that I can see.
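
To make the distinction concrete, here is a minimal toy sketch in Python (purely my own illustration, assuming nothing beyond the paragraph above; the names Environment, ObserverAgent, and SelfAwareAgent are hypothetical) of an agent that only models its environment versus one whose predictions also account for the effect of its own actions on that environment:

import random


class Environment:
    """A one-dimensional world whose state drifts randomly and is also
    nudged by whatever action the agent applies to it."""

    def __init__(self):
        self.state = 0.0

    def step(self, action):
        self.state += random.gauss(0, 0.1) + action
        return self.state


class ObserverAgent:
    """Only analyzes the environment; its own actions never enter its model."""

    def __init__(self):
        self.observations = []

    def observe(self, state):
        self.observations.append(state)

    def predict_next(self, planned_action=0.0):
        # Best guess is simply the last observed state -- the agent's own
        # influence on the world is invisible to it.
        return self.observations[-1] if self.observations else 0.0


class SelfAwareAgent(ObserverAgent):
    """Closes the feedback loop: its prediction includes the expected effect
    of its own behavior on the environment it is observing."""

    def predict_next(self, planned_action=0.0):
        last = self.observations[-1] if self.observations else 0.0
        return last + planned_action


if __name__ == "__main__":
    env = Environment()
    naive, aware = ObserverAgent(), SelfAwareAgent()
    for _ in range(5):
        action = 0.5                 # both agents push the world the same way
        state = env.step(action)
        naive.observe(state)
        aware.observe(state)
        print(round(naive.predict_next(action), 2),
              round(aware.predict_next(action), 2))

Under these contrived assumptions, the self-aware agent's predictions track the true state because it models its own push on the world, while the purely observing agent is off by exactly the amount of its own unmodeled influence.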

-James Rogers
jamesr@best.com