> Emotions are important for us because they help us to survive. AIs
> don't need to fend for themselves in a difficult environment; they
> get all the energy, protection & input they need from humans and
> other machines. All they have to do is solve puzzles (of biology,
> programming etc). If you program it with an "urge" to solve puzzles
> (just like your PC has an "urge" to execute your typed orders), it
> will work just fine. No (other) "emotions" are needed, imo. It's
> like with the birds and planes: both can fly, only their methods
> are different, and both have their specialties.
I think you're approaching this question from a very different direction
from many of those who subscribe to this list. Many of us suspect that if
and when we create an AI, it will not be any easier to "program" than
people are; that on some level the architecture behind AI will be
sufficiently similar to that within our own brains that we will not be
able to selectively remove anything fundamental from this mix, including
emotions, survival instincts, irrationalities and quirks.
Unlike flight, intelligence may well require something as complicated as a
bird or a person in order to work. If so, there's no reason to assume
that we could strip emotions out of intelligence like we could take the
feathers out of flight.