At 11:18 AM 3/27/2001 -0800, Robert Bradbury wrote:
>While the brain has some lookup tables as has been discussed,
>it doesn't seem (to me) to be 'conscious' if you remove all of
>the active neural activity. A lookup table (potentially with
>the addition of some fuzzy logic to keep them from getting
>astronomically huge) has no 'activity' other than the
>retrieval of a stream of bits from the table and running
>them through the output device (this is what I'm calling
>a zombie, though others may call this other things).
While I think many people find this definition of a zombie intuitive,
I would submit that a "zombie" in the sense being used here is not a
valid class of intelligence unless you are willing to accept the
concept of magic computation.
The human mind does fuzzy lookups because it has insufficient memory
capacity to do non-fuzzy ones, but the point goes much further than
this. This type of fuzziness is *expected* of any vaguely optimal
small-memory predictor model that has to generate "correct"
outputs. Given the same amount of memory as the human brain, the best
possible computer model for an AI would also exhibit a similar type of
fuzziness as an artifact of limited model memory. The human mind isn't
this way by coincidence; it represents nearly optimal computational
efficiency for the limited hardware it is built on (evolution in action),
maximizing both entropy and information quality on a memory-starved system.
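The trade-off being described can be sketched in a few lines. This is a
minimal illustration, not anything from the original post: the prototype
table, the Hamming-distance metric, and the toy target function are all
assumptions chosen to show how a fuzzy lookup buys memory savings at the
cost of exactness.

```python
# Illustrative sketch (assumed names/metric, not from the original post):
# an exact lookup table must store one entry per possible input, while a
# "fuzzy" table stores a few prototypes and answers with the nearest one.

def hamming(a: int, b: int) -> int:
    """Number of differing bits between two integers."""
    return bin(a ^ b).count("1")

# Exact lookup over 4-bit inputs: 16 stored entries (toy target: x mod 3).
exact_table = {x: x % 3 for x in range(16)}

# Fuzzy lookup: only 4 stored prototype (input, output) pairs.
prototypes = {0b0000: 0, 0b0101: 2, 0b1111: 0, 0b1001: 1}

def fuzzy_lookup(x: int) -> int:
    """Answer with the output of the nearest stored prototype."""
    nearest = min(prototypes, key=lambda p: hamming(p, x))
    return prototypes[nearest]

print(len(prototypes), len(exact_table))  # 4 16
print(fuzzy_lookup(0b1110))  # nearest prototype is 0b1111, so 0
```

The fuzzy table here uses a quarter of the memory; its answers are only
approximately right, which is exactly the artifact-of-limited-memory
behavior the paragraph above attributes to the brain.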
This archive was generated by hypermail 2b30 : Mon May 28 2001 - 09:59:43 MDT