Re: PHIL/AI: Humongous Lookup Table

Max More (maxmore@primenet.com)
Wed, 12 Feb 1997 09:11:25 -0800


At 04:27 PM 2/12/97 +1000, Mitch wrote:
>My $.02: the lookup table presents no problem if you regard passing
>the Turing Test as a likely indication of consciousness or intelligence,
>but not as a defining criterion of intelligence.
>
>Consider the "Thermometer Test" for the presence of heat. If you
>stick a thermometer in an unknown liquid and the mercury shoots up
>the column, you have reason to believe that the liquid is hot.
>But "being hot" is not the same thing as "being disposed to cause
>mercury to expand", and there are other ways to make mercury rise.
>
>In other words: why would anyone think that the Turing Test was
>foolproof? [snip] (Actually,
>I would have thought that a functionalist would agree on this
>count, since functionalism cares about internal causation.)

Well, I'm a functionalist and I agree. For me, the Turing Test can only be
an indicator, as you say. It really doesn't contain any theory of
intelligence, except maybe one of the kind "if it looks like a dog from the
outside then it is a dog". The Turing Test was proposed in the days of
behaviorism. It may be the best we can do for now, but eventually we'll
have a convincing theory of intelligence and awareness.

Max

Max More, Ph.D.
more@extropy.org
http://www.primenet.com/~maxmore
President, Extropy Institute, Editor, Extropy
exi-info@extropy.org, http://www.extropy.org
(310) 398-0375