Re: The AI revolution (Was: Re: >H ART: The Truman Show)

Anders Sandberg (
23 Jun 1998 13:32:47 +0200

den Otter <> writes:

> There is also no reason to assume that you *can't* strip (most)
> emotions out of intelligence, and still have something useful.

Actually, there are good reasons to expect that you cannot do that, at
least not from our kind of intelligence. See Antonio Damasio's work
(e.g. _Descartes' Error_ and a lot of research papers) on how
disruptions of the emotional subsystems of the brain impair rational
thinking. He gives some quite clear descriptions of patients who have
perfect memory, logical reasoning and knowledge, and still make
completely stupid decisions due to lack of feedback from emotional
subsystems. These subsystems are necessary to keep us from needlessly exploring
unfavorable or negative avenues of action, and to motivate us to seek
those actions that benefit ourselves. The same would go for an AI: if
it didn't care if the results would be positive or negative, it would
just act randomly or not at all ("Do something! The reactor is
exploding and will kill all of us!" "So what?"). Emotions seem to act
as heuristics for alpha-beta pruning of our decision trees.
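To make the analogy concrete, here is a minimal sketch (not anything from Damasio, just an illustration): standard alpha-beta pruning over a toy decision tree, where the made-up leaf numbers stand in for emotional valence scores. Branches that cannot improve on what has already been "felt out" get cut off without further exploration.

```python
def alpha_beta(node, depth, alpha, beta, maximizing):
    """Standard alpha-beta search over a tree of nested lists.
    Leaves are numbers standing in for emotional valence scores."""
    if depth == 0 or not isinstance(node, list):
        return node  # leaf: the felt "goodness" of this outcome
    if maximizing:
        value = float("-inf")
        for child in node:
            value = max(value, alpha_beta(child, depth - 1, alpha, beta, False))
            alpha = max(alpha, value)
            if alpha >= beta:
                break  # prune: no need to feel out clearly worse branches
        return value
    else:
        value = float("inf")
        for child in node:
            value = min(value, alpha_beta(child, depth - 1, alpha, beta, True))
            beta = min(beta, value)
            if alpha >= beta:
                break  # prune
        return value

# Hypothetical decision tree: inner lists are choice points, numbers are
# valences (e.g. -10 for "reactor explodes", +5 for "crisis averted").
tree = [[3, 5], [-10, 0], [4, 2]]
best = alpha_beta(tree, 2, float("-inf"), float("inf"), True)
print(best)  # the best achievable valence; the -10 branch is cut short
```

The point of the analogy is only that a cheap evaluation signal lets the search skip whole subtrees; an agent with no such signal would have to grind through every branch, or pick arbitrarily.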

> When people (like scientists) deal with a difficult problem,
> they tend to focus fully on it, blocking out cluttering
> emotions. That's what you want the AI to be: a focussed
> thinker, not disturbed by the urge to mate or whatever.

Without emotions, it would have no motivation to focus. And as a
scientist, I can definitely tell you that working on difficult
problems is highly emotional. I might not be thinking of mate finding,
but I am certainly experiencing various shades of curiosity,
frustration, surprise, boredom and ambition that definitely help my
cognitive processes.

Anders Sandberg                                      Towards Ascension!                  
GCS/M/S/O d++ -p+ c++++ !l u+ e++ m++ s+/+ n--- h+/* f+ g+ w++ t+ r+ !y