Re: Why would AI want to be friendly?

From: hal@finney.org
Date: Thu Sep 07 2000 - 18:38:48 MDT


Robin writes:
> Humans do have an abstract reasoning module, and that module often tries
> to give the impression that it is in fact in such control over the rest
> of the mind. But in fact I think our conscious minds are more like the
> PR department of our minds - they try to put a good spin on the decisions
> that are made, but are not actually in much control over those decisions.

Yes, I think that's very true. Take the example offered by ct:

: Hands that 'argue' with each other
:
: http://news.bbc.co.uk/hi/english/in_depth/sci_tech/2000/festival_of_science/newsid_914000/914876.stm
:
: [Anarchic hand patients seem to be aware of the actions
: of their anarchic hand but they disown them.]

The failure modes of the brain are odd and don't fit very well with our
illusions and perceptions about how our minds work.

I suspect that most of what we perceive about our own consciousness
is wrong. It is like a visual/perceptual illusion, brought inward and
made pervasive. The entire mind is an optical illusion.

When we perceive things in the real world, we are forced to maintain
a close relation to reality. If we see a tawny jungle cat as a pile
of brown leaves, we won't last very long. So perceptual illusions are
relatively rare.

However, no such pressure operates in the mind. There is little need to
maintain an accurate picture of its internal workings. Rather, what is
needed is a convenient picture, one which facilitates survival even if
it has little bearing on the truth.

Dennett has a model along these lines, in which consciousness is
essentially an illusion, a lie created by our minds. We can combine
this with Minsky's "society of mind" and get the following picture
(blue-sky speculation):

Imagine that our own consciousness really doesn't exist. We don't think,
we don't decide, we don't rule. Rather, we have a collection of agents
making the decisions. Part of their job is to construct a fictional
consciousness, a memory trace which is created after the fact and which
presents a fictional but unified picture of the mind's activities.

The agents are the only true consciousnesses in our brain. They aren't
exactly conscious the way we are; they may not have long-term memories
of their own. But they are more conscious than the illusory self they
construct, a Potemkin village consciousness, a shell which exists only
to provide an illusion of unity.

We can even bring in the old idea that consciousness as we know it is
a relatively recent invention, occurring at the dawn of civilization
a few thousand years ago[*]. Only with the advent of a certain complexity
of social interactions did the need to build this illusion become acute.
The single body is best modeled as a single mind; there isn't one, so you
create the illusion of one. Nature never discards, she only extends.
The subminds were probably there all along, and are the only presence
in animals.

Well, I'm obviously getting a little carried away here. The key idea is
that introspection is a very poor guide to the nature of consciousness.
Abnormal psychology, and the effects of drugs and trauma on the brain,
all point to a much stranger picture.

Hal

[*] Julian Jaynes, http://www.amazon.com/exec/obidos/ASIN/0618057072/

This archive was generated by hypermail 2b29 : Mon Oct 02 2000 - 17:37:31 MDT