Re: Uploading

Brent Allsop (allsop@swttools.fc.hp.com)
Mon, 7 Jul 1997 15:04:21 -0600


Hal Finney <hal@rain.org> responded:

> I agree that if a theory were developed which accurately predicted
> the subjective sensations for a variety of brain types, conditions,
> and circumstances, confirmed in their accuracy by the people
> experiencing these events, that it would become widely accepted as a
> valid theory of consciousness...

Great.

> (It would, however, be difficult to know how to extend the theory to
> the more interesting cases of less-conscious beings, like animals.)

Why would they be any different? There are probably some
insects (worms?) that have no conscious or phenomenal subjective
experience at all, having purely abstract intelligence. In order to
know what echo-location is like for a bat, we simply must reproduce
the proper neural correlate (or whatever gives the bat its experience)
in our conscious world.

Right now we have no idea whether a worm feels pain, or what
such pain might be like if it does. But once we understand what
consciousness is, we will be able to answer this question and
eventually modify our own brains to produce similar sensations, so
that we can know what it is (or isn't) like to be a worm.

> Perhaps so. In an earlier debate on this topic, I suggested that
> technologically mediated mind-reading might similarly allow direct
> perception of other people's consciousness. However it raises
> complicated issues of whether I can be said to actually perceive
> your consciousness in the same way you perceive it, or whether I am
> inherently filtering it through my own consciousness, so that
> ultimately what I experience is not identical to what you do.

For more complex sensations such as love I'm sure this will be
very true. But for more basic sensations like red, or the taste of
salt..., I doubt that the sensation our brain uses to represent them
changes much with experience. The important thing is that more
complex feelings, though much more involved, fleeting, and complexly
interdependent..., are built out of the same kind of phenomenally
subjective conscious stuff. Knowing what red is will go a long way
towards telling us what love is.

> It reminds me of a disagreement I had a few years ago on
> comp.ai.philosophy (where discussing such issues is their bread and
> butter) about whether computer simulations of conscious brains would
> be conscious.

Yes, comp.ai.philosophy is a great place. I've had great
conversations with Marvin Minsky, Hans Moravec, and many others over
there.

> The person I was debating with maintained that the people might not
> be conscious while in the computer, that the downloading process
> actually would end up inserting false memories of earlier conscious
> experiences. While logically possible, I think the discrepancy
> between this point of view and the sense people had of their
> experiences would make it effectively unsupportable.

Why? If this were true, then I could argue that the artificial,
fictional memories implanted in Arnold's mind during Total Recall
prove that the fictional experiences were real simply because they
"struck" Arnold as real. Of course, even though Arnold would not be
able to distinguish memories of real events from artificially
implanted memories having no corresponding reality or history, he
could know, through other evidence, which were real and which were
fictional. Just as we will know that memories reverse uploaded from
an "abstract" simulation of ourselves are not real memories of
feelings we actually had, but are merely from abstract, unfeeling
simulations.

Brent Allsop