Re: Emulation vs. Simulation

From: hal@finney.org
Date: Wed Mar 28 2001 - 22:59:28 MST


Lee writes:
> This is what's striking about the lookup table scenario. It
> passes the Turing test, yet we hesitate to say that it is
> intelligent. It acts completely indistinguishably from a
> real human being, but we cannot, I claim, believe that it
> feels anything or is even as conscious as an insect.
>
> The reason is: take any five minute sequence of your best
> moments with this "Eugene Leitl": while all the laughing
> and rollicking corresponds to non-trivial data analyses
> and information flow in your head, it's almost a random
> sequence of lookups in "his".

Not random, surely. For one thing, the index into the lookup table, if
expressed as a binary address, presumably carries as much information
as is in your own brain. And each successive change of that index is
a transformation every bit as causal and dependent upon experience as
the changes in your own brain from one moment to the next.

> And, to reiterate, that
> sequence would be no different if it were activated in
> truly random order, or just laid out in a linear order
> somewhere without being fetched at all.

The sequence of what, of states? You are saying that after Eugene has
had an experience, you could write down the series of states, and then
play them back in a simple way? Which of course you could do with your
own brain as well, right? You could write down the sequence of your
own neural brain states and then "play them back".
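In sketch form the distinction between the run and the playback is small (step() here is an invented stand-in for the real dynamics, whatever they turn out to be):

    def step(state, stimulus):
        # stand-in for whatever neural dynamics produce the next state
        return hash((state, stimulus))

    def run(initial, stimuli):
        # the "first run-through": each state is computed from the last
        states, s = [], initial
        for x in stimuli:
            s = step(s, x)
            states.append(s)
        return states               # the written-down sequence of states

    def replay(recording):
        # the playback: no computation at all, just read the states back out
        for s in recording:
            yield s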

The question is: are such playbacks conscious? Your initial experience
was conscious; Eugene's experience was at least conscious when you created
the lookup table, and possibly when you jumped around in it again later.
Now you are playing back some of that experience.

What does it mean to ask if a playback is conscious? IMO it is a less
meaningful question than even the hard question of what it means to
be conscious at all. Consider the question of whether an oyster, say,
is conscious. We wonder whether it is "like something" to be an oyster.
While it is not at all clear how to learn the answer to this question,
philosophically we have a sense of what the question means. We poke
the oyster, and we wonder if there is an entity which felt itself being
poked in somewhat the same way that we feel sensations.

But for a playback, the question is harder. Suppose we play back an
experience of drinking orange juice. We know that there is or was a
conscious entity which had that experience. What we want to know is,
when we replay it, does it have that experience again? We want to know
whether it is having the experience right here and now, during the playback.
We know the experience itself exists or existed, but we want to localize
it in space and time. We want to know whether it exists in that region
of spacetime where we are doing the replay, in addition to its existence
in the region of the first run-through.

I know little about what consciousness is, and as part of that uncertainty
I have severe doubts about how well it can be localized. Consciousness
may not be something which has a location in space and time. After all,
consciousness is completely intangible; it has no mass and none of the
other normal properties of matter or energy, so there is no particular
reason to expect it to share their property of locality. Therefore it
may not be meaningful to ask whether a replay is again conscious, in
addition to the consciousness which it had the first time through.

In that case, when we lay out a series of pre-recorded mind states, or
create a machine which goes through them by rote, it is not meaningful
to ask whether this produces consciousness. We are doing a replay of
an existing consciousness; there is no question but that the original
consciousness has existed, because it made an impact on the world that
could not have happened in any other way. But to ask whether the replay
or the recording of the states is conscious requires making much stronger
assumptions about the nature of consciousness. I don't think we are in
a position to make those assumptions.

Hal


