Re: Uploading

Hal Finney
Mon, 7 Jul 1997 11:19:41 -0700

Brent Allsop writes:
> "the only feelings I can objectively observe is my own" is
> precisely the point. We will so thoroughly understand the laws of
> physical consciousness and how they correspond and predict what WE
> feel as we observe, manipulate and stimulate our own brain that there
> will be no other possibility but that others with similar neural
> correlates are experiencing similar sensations.

I agree that if a theory were developed which accurately predicted the
subjective sensations for a variety of brain types, conditions, and
circumstances, its accuracy confirmed by the people experiencing these
events, it would become widely accepted as a valid theory of
consciousness. There might still be some doubters who maintain that
we have no way of being certain that the theory is accurate. Other
people might be zombies who would claim that the theory was correct when
in fact they had no consciousness at all. But I think such doubters would
be no more than a tiny minority, like the people today who believe that
they are the only conscious beings.

(It would, however, be difficult to know how to extend the theory to the
more interesting cases of less-conscious beings, like animals.)

> Also, just as our brain integrates many sense representations
> into one single conscious world, eventually we'll be able to integrate
> multiple conscious brains into single conscious worlds. Just like we
> know in our own minds that a red sensation is not anything like a
> salty sensation, or just like we know that red produced by one eye is
> the same as red produced by the other eye, we will be able to design
> ways to experience, first hand, what other minds are experiencing.

Perhaps so. In an earlier debate on this topic, I suggested that
technologically mediated mind-reading might similarly allow direct
perception of other people's consciousness. However, it raises complicated
issues of whether I can be said to actually perceive your consciousness
in the same way you perceive it, or whether I am inherently filtering it
through my own consciousness, so that ultimately what I experience is not
identical to what you do.

> These are only two possible techniques to accomplish what you,
> for some unknown reason, apparently claim will be forever impossible.
> I'm sure if one is creative enough one can think of many other ways to
> do this kind of stuff. Once we understand how to manipulate,
> engineer, improve, and upload minds and all that, all this kind of
> understanding and proof must come to pass just like we know that the
> earth is not the center of the universe because we are finally dancing
> around in our heliocentric solar system and beyond.

It reminds me of a disagreement I had a few years ago (on a list where
discussing such issues is their bread and butter) about whether computer
simulations of conscious brains would be conscious. I suggested that it
would be theoretically possible to do reversible uploading. A brain is
scanned into a form where it can
be modelled (abstractly!) on a computer. It undergoes interactions
and has experiences in this virtual form. Then, using nanotech, the
resulting consciousness is downloaded back into the brain, the neural
pathways remodelled to match the conditions described by the virtual form.
This could be repeated multiple times, transitioning between body and
computer (and in fact this might even be a common form of transportation
under some circumstances).

I argued that people would have the sense that the transition from meat
to computer was not significant, that their memories and experiences would
seem much the same in both circumstances. Hence it would strike them as
absurd to suggest that they had not been conscious while in the computer,
since they had exactly the same evidence for their consciousness in that
form as they did when they were in a body, namely their memories of their
sense of self.

The person I was debating with maintained that the people might not be
conscious while in the computer, that the downloading process actually
would end up inserting false memories of earlier conscious experiences.
While logically possible, I think the discrepancy between this point
of view and the sense people had of their own experiences would make it
effectively unsupportable.