Yep. Here's some.
Here's one definition of intelligence from the past:
*The capacity to make adaptations for the
purpose of attaining a desired end; and the
power of autocriticism.* (Terman, 1916)
*The aggregate or global capacity of the
individual to act purposefully, to think rationally,
and to deal effectively with his environment.*
(Wechsler, 1958)
More recently...
*. . . a human intellectual competence must entail
a set of skills of problem solving - enabling the
individual to resolve genuine problems or difficulties
that she or he encounters, and, when appropriate,
to create an effective product - and must also entail
the potential for finding or creating problems - thereby
laying the groundwork for the acquisition of new
knowledge.* (Gardner, 1983)
* . . . mental activity in purposive adaptation to, shaping
of, and selection of real-world environments relevant to
one's life.* (Sternberg, 1986)
Most theorists now agree that intelligence is
hierarchical: a general factor plays a role in a
wide variety of cognitive tasks, while narrower
group factors account for more specific abilities.
These conceptualizations predate the recent
interest in Emotional Intelligence. AIs will need...
The way my crappy dictionary defined sentience
made it sound like anything that could interpret
the environment through sensory inputs and was
aware, which makes consciousness and sentience
sound pretty similar. I had always thought the
phrase *Sentient Species* was used to describe
*Intelligent/Civilized* organisms.
Consciousness and Intelligence are clearly
distinguishable. Consciousness strikes me
as a necessary but insufficient precondition
for intelligence. I'm starting to lean toward
believing that the odd sense of self we
possess was an incidental by-product of our
larger and more complex brains, and that this
glorious accident proved to be highly extropic,
compelling us to create
language, tools, agriculture, industry, technology,
etc.
I think it was Alan Watts who first introduced me
to the idea that our brains trick us into thinking that
we have souls. Our enhanced abilities to recall
the past and foresee the future lead us to spend
mental time outside of the here and now. We
perceive ourselves as having continuity when
all that really exists is the present moment. I
have since heard more neurologically sound
arguments that lead to similar conclusions, i.e.
that the sense of identity we experience is a
complete illusion; the mind is what the brain does.
The incidental by-product argument makes me
wonder what might happen when a Super AI comes
into being. What sort of spontaneous artifact might
arise as a function of this new level of complexity?
Is Super AI going to be able to communicate with me
better than I can communicate with my dog?
As to how AIs might treat us if we are left in their
dust, I would imagine it would be somewhat
similar to how we treat animals: we can be very
kind to the animals we want to be kind to, we
ignore others, and we can be very unkind to those
we find bothersome.
I hope we can all just get along.
Scott Badger