Re: Can Pure Lookup Tables Be Conscious?

From: Lee Corbin (lcorbin@ricochet.net)
Date: Thu Apr 26 2001 - 19:48:40 MDT


Hal Finney makes two points concerning consciousness in his post
of yesterday in this thread. First, he (rightly) reiterates that
playback is to be distinguished from genuine computation, and
goes on to mention that most philosophers regard machine
implementation of consciousness as a very tricky concept in
itself (let alone how to go about it). However, Hal agrees with
Chalmers that "a system implements a computation if the causal
structure of the system mirrors the formal structure of the
computation."

He then goes on to write:

> (Note though that this objection does not apply to the huge lookup
> table, as that does represent the full complexity of the abstract
> program. There may be other reasons to doubt the consciousness of
> the HLT but this is not one of them.)

But it seems to me that a ridiculously large AI that uses a huge
pure lookup table is perhaps not truly mirroring the formal structure
of a computation. Obviously this depends on what we mean by the
"formal structure of a computation", but in grossest form, hashing
a particular state and using the resulting address to fetch the
subsequent state doesn't seem to me to mirror a tight causal
computation of the microcausality type that I wrote about earlier
today (see below).
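
To make the distinction concrete, here is a toy sketch in Python
(the eight-bit state, the rotation rule, and the way the table is
built are all my own illustrative inventions, not anything Hal or
Chalmers specified):

    def step(state):
        # Genuine computation: the next state is derived from
        # the current one by applying a rule (here, a trivial
        # 8-bit left rotation).
        return ((state << 1) | (state >> 7)) & 0xFF

    # Huge pure lookup table: every transition precomputed and
    # stored. In the thought experiment the table is simply
    # given; building it from step() here is just a convenience.
    table = {s: step(s) for s in range(256)}

    state = 0b10110010
    assert step(state) == table[state]

Fetching table[state] hashes the state to an address and retrieves
the stored successor; it reproduces the rule's output exactly while
performing none of the derivation that produced it. Whether that
fetch "mirrors the formal structure" of the computation is just the
question at issue.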

Hal's second point is that consciousness doesn't have to be
localized. Without going to the extremes that Arnold Zuboff
does ("The Story of a Brain" in The Mind's I), it seems
pretty easy to see that. A functionalist is bound to believe
that in principle a brain could be the size of a galaxy, though
its thoughts would be slow. Even if all the excitations that
normally travel from my amygdala to my hippocampus were caused
to detour through Andromeda, while the rest of my brain waited
patiently in stasis, in principle I'd never know the difference.

So I don't really know where he's going with this, unless... he
had a premonition that someone was going to introduce something
like:

1. "Microcausality": Subsequent generations on the life
     board are calculated causally and locally: each glider,
     for example, moves exactly because that glider was in
     the previous generation at an exact location. It has
     nothing to do with, nor is dependent in any way, upon
     what is happening elsewhere on the life board. This
     corresponds to a human's neuron pulse moving down an
     axon exactly because it had a certain potential an
     instant earlier at, or nearly at, the very same place.
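
For what it's worth, here is a minimal sketch of that locality in
Python (the toroidal 5x5 board and the particular glider are my
own illustrative choices): each cell's next value depends only on
its own 3x3 neighborhood, never on anything elsewhere on the board.

    def next_cell(board, r, c):
        # Conway's Life rule applied to one cell, purely locally:
        # count the eight neighbors (wrapping at the edges) and
        # apply birth-on-3, survival-on-2-or-3.
        rows, cols = len(board), len(board[0])
        live = sum(board[(r + dr) % rows][(c + dc) % cols]
                   for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                   if (dr, dc) != (0, 0))
        return 1 if live == 3 or (board[r][c] and live == 2) else 0

    def next_generation(board):
        return [[next_cell(board, r, c)
                 for c in range(len(board[0]))]
                for r in range(len(board))]

    # A glider; four rounds of these purely local updates move
    # it one cell diagonally down the board.
    g = [[0, 1, 0, 0, 0],
         [0, 0, 1, 0, 0],
         [1, 1, 1, 0, 0],
         [0, 0, 0, 0, 0],
         [0, 0, 0, 0, 0]]
    for _ in range(4):
        g = next_generation(g)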

But here the condition "locally" is only being used to exclude
non-local influences. Clearly it also wouldn't matter if each
glider-gun in each logic calculation on the life board were
actually halted midway through its electronic dance, and a bunch
of people in Searle's Chinese room were required to complete some
generations of the glider-gun by hand.

Lee Corbin


