Re: sentience (was: Re: ECON The Abolition Of Work)

Hal (
Tue, 5 May 1998 11:34:04 -0700 (PDT)

Damien Broderick, <>, writes:
> I assume that `sentience' means `consciousness' in this context, although,
> as has been noted before, it really just denotes any ability to register
> sensations, which snails have despite their gross stupidity and Zen
> egoless-ness. So it could be that reflexive self-awareness is an
> incidental computational by-product of very complex brained bodies
> responding in complex ways (understanding, assessing, planning etc) to a
> complex environment - rather as the red colour of blood is an accidental
> by-product of what it's made out of, and without any teleological merits.

Wouldn't the functionalist perspective suggest that consciousness is not
an "incidental" by-product, but rather an "inevitable" one? In other
words, you automatically get consciousness when you have the kind of
complex system Damien is describing. Any system which has a sufficiently
rich representation of itself and the world is conscious - that is what
consciousness is.

One problem with this suggestion is that it implies that a relatively
simple computer program, say CYC with its thousands of interrelated
facts about the world, would be conscious to some degree. Such a
program is far from being able to pass the Turing test, and we might
not be comfortable setting the bar for consciousness so much lower than
that for human-equivalent intelligence. On the other hand, conscious
awareness appears to arise in nature far more readily than human-level
intelligence does, so perhaps this is not so objectionable after all.