Just a gentle reminder that some emotions trigger specific cognitive
abilities, such as frustration->flaw-finding.
Also, goals and subgoals are cognitive mechanisms as well as survival
mechanisms. When thinking, you set out to solve a specific problem and
may even spin off sub-problems to be solved first. Since emotions are
the affective aspects of our goal systems, there may be grounds for
thinking that a reasoning AI would require something recognizable as an
emotional substrate, even if it shares none of the *specific* emotions
we bear. I don't know whether suppressing, repressing, mastering,
understanding, switching off, flexing, or ignoring one's emotions would
affect goal-oriented analysis. It seems to me that the mental systems
for evaluating the *value* of goals may well be intricately intertwined
with our evolved emotional sets.
--              sentience@pobox.com         Eliezer S. Yudkowsky
                http://tezcat.com/~eliezer/singularity.html
                http://tezcat.com/~eliezer/algernon.html
Disclaimer:  Unless otherwise specified, I'm not telling you
everything I think I know.