Re: The Copy Paradox

wolfkin@ldl.net
Thu, 13 Nov 1997 15:33:18 +0000


> Date: Thu, 13 Nov 1997 10:12:16 -0800
> From: Hal Finney <hal@rain.org>
> Harvey Newstrom, <harv@gate.net>, writes:
<SNIP>
> Certainly a physical copy would not be enough. But things are different
> if you are talking about an exact mental copy. It can be argued that
> identity is fundamentally a matter of patterns of processing. Reproduce
> the pattern, and you reproduce the identity.

But then we run into the problem of differentiating copies: if
reproducing the pattern reproduces the identity, then two running
copies of the pattern would each have an equal claim to being me.

<more SNIPed>
> > Does anyone know what is different between my understanding and that of
> > those who would be willing to die if there was a copy made of them? I
> > am curious to find out why I am not thinking the way other people are.
> > Maybe it is my own experience with my own copy (twin) that makes me
> > skeptical of this approach?
>
> It's possible that your twin is making you focus too closely on genetic
> and physical similarity, which is really not the issue. There is an
> enormous difference between someone who looks something like you and
> someone whose brain is running exactly the same program as yours.

"Exactly the same" to what degree?

> Here are two arguments which make the pattern theory of identity more
> plausible.

<yet more...:)>
> Putting these together, you could imagine stopping your run on one computer,
> and later resuming it on a different computer. You would not perceive any
> break in your train of thought or have any reason to believe that you had
> died.
>
> Now, this is effectively the same as starting up a copy of your mind while
> destroying the original. It appears that this actually does preserve
> identity, at least in the case of uploads, and there is no reason why it
> should not do so for organic brains as well - it's just more difficult to
> arrange in that case.

I still see a difference between stopping and restarting the *same*
program (which is held in memory until the restart), and starting
*another* program while stopping the first permanently. We often
speak of two copies of a program as being 'the same' when in fact
they have features that differ (location, for example). But we only
do this because we *don't care* whether the program or data we are
using is identical, as long as it is close enough to work. A
self-aware program would probably care.
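
To put it in programming terms (a quick Python sketch of my own, so
take the names and details as illustrative only, not as anything Hal
proposed): two objects can match in every observable respect and
still be distinct things, and whether we call them "the same" depends
entirely on which comparison we choose to care about.

    # Two "copies" of a mind-state: identical contents, distinct objects.
    class MindState:
        def __init__(self, memories):
            self.memories = memories

        def __eq__(self, other):
            # Pattern identity: equal whenever the contents match.
            return (isinstance(other, MindState)
                    and self.memories == other.memories)

    original = MindState(["childhood", "first job", "this argument"])
    duplicate = MindState(list(original.memories))

    print(original == duplicate)  # True  -- same pattern
    print(original is duplicate)  # False -- distinct objects
                                  #          ("location" differs)

The pattern theorist looks only at the == test; I am saying a
self-aware program has every reason to also care about the "is" test.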

> Another argument is to consider the case of a world where copying like
> you describe is easy and commonly used as a means of transportation.
> Someone has copies "on ice" on many different planets, and when they
> want to travel elsewhere they transmit a copy of their brain state, get
> that programmed into the brain of the copy, which then wakes up as them.
> (This is similar to what is done in Linda Nagata's "The Bohr Maker".)
>
> Imagine someone who has participated in this form of
> transportation many times over the course of his life. He has memories
> of having lived in each of these various bodies for a time, then
> transferring his consciousness to a different one, while the original
> body was destroyed (or at least its memory erased for future use).

I would say that such a person has participated only once. :) The
fact that he has memories taken from previous people doesn't mean
that he *is* those people.

> From his point of view, these transformations preserved his consciousness
> and sense of self just as much as going to sleep at night and waking
> up in the morning. He felt just the same, behaved the same, remembered
> everything he had done before the transfer. There is no reason for him
> to view it as death.
>
> This would be the natural point of view of anyone who had tried it a
> few times. So it would seem that actually experiencing the kind of
> transfer that you are afraid of leads to reassurance that it does preserve
> your sense of self, and there is no reason to fear it as death.

Assuming that someone who may exist in the future will think that
they are me doesn't really help me accept the idea that *I* should be
killed after the copy is made.

Wolfkin.