Re: The Copy Paradox

Hal Finney (hal@rain.org)
Thu, 13 Nov 1997 10:12:16 -0800


Harvey Newstrom, <harv@gate.net>, writes:

> Even if my twin could be modified by a plastic surgeon to look exactly
> like me, I still wouldn't want to die. Even if you recorded my exact
> brain state, and reset his brain to that exact configuration, I would
> still not want to die.

Certainly a physical copy would not be enough. But things are different
if you are talking about an exact mental copy. It can be argued that
identity is fundamentally a matter of patterns of processing. Reproduce
the pattern, and you reproduce the identity.

> Note that I am not saying that he is not me, or that he is not another
> instantiation of "me". These are semantic word games which don't really
> matter. I just still desire to continue experiencing life. Knowledge
> that another person very similar to me or exactly like me will continue
> to experience life does not change my opinion. My twin has not directly
> affected my subjective experience of life up till now. Even modifying
> him to be exactly like me does not change my subjective experience of
> life. In fact, you could modify him without my knowledge. I still would
> oppose my personal death. I don't know how making my twin brother more
> like me makes it any more acceptable for me to die. If you told me that
> you had modified my brother to be exactly like me, I still don't see why
> I would change my mind and let you shoot me. Even if I really believed
> that you actually accomplished what you claimed, I still wouldn't change
> my mind.
>
> Does anyone know what is different between my understanding and that of
> those who would be willing to die if there was a copy made of them? I
> am curious to find out why I am not thinking the way other people are.
> Maybe it is my own experience with my own copy (twin) that makes me
> skeptical of this approach?

It's possible that your twin is making you focus too closely on genetic
and physical similarity, which is really not the issue. There is an
enormous difference between someone who looks something like you and
someone whose brain is running exactly the same program as yours.

Here are two arguments which make the pattern theory of identity more
plausible.

The first is to imagine that you are an upload: a computer program which
was originally a copy of some human mind. You live within the computer;
your mind is a program. Now, what kinds of transformations would you
allow to be made to your program?

It seems plausible, based on our experience with programs, that it should
be OK to suspend your program for a while, and restart it. You will
of course not experience any passage of time during the suspension.
If you object to this, consider for example that you may be running
on a timesharing system where this happens all the time anyway, or you
could imagine running on progressively slower computers until you were
in effect suspended. Neither of these would seem to give you reason to
believe that you have died.

It is also plausible that you would be willing to run on a computer network
of some kind, with your processing spread over multiple computers. (It may
well be that this is the only practical way to run such a complex program.)
In this situation, it could even be the case that the network is dynamic,
with different processors being available from time to time, and the various
parts of your program jumping from processor to processor over the course
of your run. You would not be subjectively aware of any of this, and your
mental functioning would seem perfectly normal.
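As a loose illustration of that point (a toy sketch in Python, with
invented names -- nothing here comes from the original argument), the
same "mental step" computed serially or spread across a pool of workers
yields an identical result, so the running program has no way to tell
how its work was scheduled:

```python
from concurrent.futures import ThreadPoolExecutor

def neuron_update(x):
    # Stand-in for one unit of mental processing (purely illustrative).
    return x * 2 + 1

state = list(range(8))

# Same computation done two ways: serially on one "processor"...
serial = [neuron_update(x) for x in state]

# ...and spread across a pool of workers, each grabbing pieces as
# scheduling allows.
with ThreadPoolExecutor(max_workers=4) as pool:
    parallel = list(pool.map(neuron_update, state))

# The results are identical; nothing in the computed state records
# which worker ran which piece.
assert serial == parallel
print(parallel)  # [1, 3, 5, 7, 9, 11, 13, 15]
```

The computed state carries no trace of the scheduling, which is the
sense in which the program "would not be subjectively aware of any of
this."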

Putting these together, you could imagine stopping your run on one computer,
and later resuming it on a different computer. You would not perceive any
break in your train of thought or have any reason to believe that you had
died.

Now, this is effectively the same as starting up a copy of your mind while
destroying the original. It appears that this actually does preserve
identity, at least in the case of uploads, and there is no reason why it
should not do so for organic brains as well - it's just more difficult to
arrange in that case.
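The suspend-transfer-resume sequence can be sketched in a few lines of
Python (again a toy illustration with invented names; `pickle` stands in
for whatever mechanism captures the complete program state):

```python
import pickle

class Mind:
    # A toy stand-in for an uploaded mind: a counter and a log of
    # "thoughts" (purely illustrative).
    def __init__(self):
        self.step = 0
        self.thoughts = []

    def think(self):
        self.step += 1
        self.thoughts.append(f"thought #{self.step}")

original = Mind()
original.think()
original.think()

# Suspend: capture the complete program state as bytes...
snapshot = pickle.dumps(original)
# ...and destroy the original instance.
del original

# Resume "on a different computer": reconstruct the same state from
# the snapshot and continue running.
resumed = pickle.loads(snapshot)
resumed.think()

# The resumed mind continues the same train of thought; nothing in its
# state marks the gap.
print(resumed.thoughts)  # ['thought #1', 'thought #2', 'thought #3']
```

From the inside, the resumed program's history is continuous, which is
the sense in which no break in the train of thought is perceived.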

Another argument is to consider a world where copying of the kind you
describe is easy and commonly used as a means of transportation.
Someone keeps copies "on ice" on many different planets; when they want
to travel elsewhere, they transmit their recorded brain state, have it
programmed into the brain of the copy at the destination, and that copy
wakes up as them. (This is similar to what is done in Linda Nagata's
"The Bohr Maker".)

Imagine someone who has participated in this form of transportation
many times over the course of his life. He has memories of having lived
in each of these various bodies for a time, then transferring his
consciousness to a different one, while the original body was destroyed
(or at least had its memory erased for future use).

From his point of view, these transformations preserved his consciousness
and sense of self just as much as going to sleep at night and waking
up in the morning. He felt just the same, behaved the same, remembered
everything he had done before the transfer. There is no reason for him
to view it as death.

This would be the natural point of view of anyone who had tried it a
few times. So it would seem that actually experiencing the kind of
transfer that you are afraid of leads to reassurance that it does preserve
your sense of self, and there is no reason to fear it as death.

Hal