Re: PHIL/AI: Humongous Lookup Table

Mitchell Porter
Wed, 12 Feb 1997 16:27:10 +1000 (EST)

My $.02: the lookup table presents no problem if you regard passing
the Turing Test as a likely indication of consciousness or intelligence,
but not as a defining criterion of intelligence.
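(To make the lookup-table idea concrete, here is a minimal sketch in Python. The table and replies are invented for illustration; the point is that pure retrieval, with no reasoning or internal state, can produce conversational behavior. The philosophical version would need an entry for every possible conversation history, which is why it is "humongous".)

```python
# A toy "lookup table" conversationalist: every recognized input maps
# to a canned reply. Entries here are hypothetical examples.
TABLE = {
    "hello": "Hi there!",
    "how are you?": "Fine, thanks. And you?",
    "what is 2+2?": "4",
}

def reply(utterance: str) -> str:
    # Pure retrieval: no inference, no memory of the conversation,
    # just a normalized key looked up in a fixed table.
    return TABLE.get(utterance.lower().strip(), "I don't follow you.")

print(reply("Hello"))
print(reply("What is 2+2?"))
```

A big enough table of this kind could in principle pass a bounded-length Turing Test, which is exactly why behavior alone seems like weak evidence here.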

Consider the "Thermometer Test" for the presence of heat. If you
stick a thermometer in an unknown liquid and the mercury shoots up
the column, you have reason to believe that the liquid is hot.
But "being hot" is not the same thing as "being disposed to cause
mercury to expand", and there are other ways to make mercury rise.

In other words: why would anyone think that the Turing Test was
foolproof? If you really want to *know* whether an entity is a
person / is conscious / is intelligent - having first adequately
clarified your concept - you may just need to know something
about its implementation, as well as its behavior. (Actually,
I would have thought that a functionalist would agree on this
count, since functionalism cares about internal causation.)