Re: longevity vs singularity/Question

David Blenkinsop (blenl@sk.sympatico.ca)
Wed, 18 Aug 1999 16:57:50 -0600

On August 16th, I responded to an August 1st message by Robert J. Bradbury, one that appeared with "[fwd]" attached to the current "Re: longevity . . ." subject thread header. Among other things, Robert Bradbury said, "When you make two identical upload copies, you get the same thing as if you have an operating system that can 'save' the entire memory/hardware state of a computer." In essence, the actual topic here is how best to look at the somewhat controversial subject of copying and/or uploading human minds. In my reply, I outlined one possible way to work the odds on how a soon-to-be-copied person should regard the probability of his personal identity being associated with one future copy or another.
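
To make the operating-system analogy concrete, here is a toy Python sketch of my own (nothing like this appeared in Bradbury's message): a "machine state" is saved once and restored twice, giving two instances that are identical at the moment of restoration but fully independent afterward.

    import copy

    # Toy "machine state": memory contents plus a program counter.
    state = {"memory": ["experience-1", "experience-2"], "pc": 42}

    # "Saving" the state is a deep copy; "restoring" it twice yields
    # two structurally identical but independent instances.
    snapshot = copy.deepcopy(state)
    copy_a = copy.deepcopy(snapshot)
    copy_b = copy.deepcopy(snapshot)

    assert copy_a == copy_b                    # identical when restored
    copy_a["memory"].append("experience-3a")   # ...then they diverge
    copy_b["memory"].append("experience-3b")
    assert copy_a != copy_b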

In regard to this, Clint O'Dell wrote:

> I disagree.
> A conscious person is conscious because he is self aware.
> This self awareness is a pointer in mind pointing to itself.
> If you made a copy of a person then there would be two persons, two minds.
> One pointer per mind. Neither would be you unless, of course, you moved
> your pointer to one of the minds. The pointer cannot point to both minds
> because it is one pointer in the mind pointing to a section of the mind it
> can communicate with.

If this identity-defining pointer is a physical process, are you saying that a copy only gets the pointer if it somehow inherits the actual, original atoms that the original person was made of? If so, you run into the logical difficulty that living organisms like ourselves exchange atoms with the environment on a continuous basis. There's no clear reason why a sense of self, or of identity, should be strictly tied to any atoms in particular, hence the idea that person-copies should be regarded as being as equivalent to one another as possible.
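
To put the same point in program terms, here is a small Python sketch of my own devising: a self-reference is purely structural, so a deep copy built out of entirely new objects (new "atoms", so to speak) still comes out with its own pointer aimed at itself, not at the original.

    import copy

    class Mind:
        def __init__(self):
            self.memories = ["childhood", "yesterday"]
            self.self_pointer = self      # the "self awareness" pointer

    original = Mind()
    duplicate = copy.deepcopy(original)   # built from entirely new objects

    # Each mind's pointer points at that mind itself, not at the original:
    assert original.self_pointer is original
    assert duplicate.self_pointer is duplicate
    assert duplicate.self_pointer is not original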

In my own scheme for evaluating one's chances of becoming a particular copy, one thing I was trying to focus on is that each new copy will surely identify himself as "my one and only self" *after* any copying event has happened. This will tend to be the case even if a copy concedes that the others have inherited the same personal history up to that point. This sense of identity just *has* to feel like a unitary, or single, extension of past history to each and every copy. Given that, some sort of probability outlook would seem the only way to handle the question of where one is likely to end up, if a planned copying event is still in the future.
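
As a worked example of that probability outlook (my framing, resting on a simple indifference assumption, with the function name being just an illustrative choice): if N copies are to be made and each will claim the same past, the sensible pre-copy odds of "finding yourself" as any particular copy come out to 1/N.

    import random

    def anticipated_odds(n_copies, trials=100_000):
        """Under an indifference assumption, estimate the pre-copy
        probability of finding yourself as copy 0 out of n_copies."""
        hits = sum(1 for _ in range(trials)
                   if random.randrange(n_copies) == 0)
        return hits / trials

    print(anticipated_odds(2))   # ~0.5 for two copies
    print(anticipated_odds(10))  # ~0.1 for ten copies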

>
> 2) You could combine your brains. That would allow for continuous exchange of
> information as thoughts and experiences arise. If you go as far as merging
> the pointers together you would again be one person.

Under most circumstances, combining minds and identities with someone else would seem like a truly drastic, brainwashing modification of personal self-image and self-identity. You'd tend to get a new person who wouldn't be either of the old persons, unless maybe you're thinking that one side would be predominant, just collecting the other's info?

In the case of combining recently made person-copies, the matter wouldn't be so clear, I suppose. Technically, this "recombining" doesn't seem as straightforward as making a copy. Further, even if a "recombination" were implemented, it still sounds needlessly confusing for the "recombined" person! If I imagine having two different "tracks" of personal memories, then I'm probably going to want to know which track actually held *my own* personal memories, decisions, and experiences! What if each component mind made key decisions differently? Shouldn't I know which decisions were actually mine? Maybe the copies would be better off left permanently separate, just getting together to discuss and compare notes in regular communication, as opposed to mind melding?
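
If a "recombination" were attempted anyway, one way to keep the "whose decision was it?" information straight would be to tag each memory with its track of origin. The following Python sketch is purely hypothetical (the merge_tracks function and the timestamped-memory format are my own inventions, not anything proposed in this thread):

    def merge_tracks(track_a, track_b):
        """Merge two memory tracks, tagging each memory with its origin
        so the recombined person can still tell which track it came from."""
        merged = [("A", m) for m in track_a] + [("B", m) for m in track_b]
        # Order the combined history by the timestamp in each memory tuple.
        return sorted(merged, key=lambda tagged: tagged[1][0])

    track_a = [(1, "chose job offer X"), (3, "moved to the coast")]
    track_b = [(1, "chose job offer Y"), (2, "stayed in town")]

    for origin, (when, event) in merge_tracks(track_a, track_b):
        print(f"t={when} [{origin}] {event}")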

David Blenkinsop <blenl@sk.sympatico.ca>