At 11:41 AM 07/07/2001 -0700, Molloy wrote:
>--> Miriam English
> > The mistake is in thinking that we are continuous. We aren't. We
> > exist in short, daily bursts. We lose consciousness each night -- more
> > often if we happen to drink to excess or have surgical operations or
> > receive traumatic blows to the head. There is no continuous me --
> > just the illusion of it maintained by my memories of what I did and felt
> > on the preceding day.
>Continuity is provided by association with the same body that you went to
>sleep in. Given the ephemeral nature of intelligence, you have to place
>the burden of continuity on the corpus.
I am a regime of thought patterns and memories, a set of actions performed.
My body (including the physical brain) has only the most tenuous and
coincidental connection with that. My body is one of the things that helps
provide the illusion of continuity, but it has little to do with my feeling
that I am me.
> > If my biological self died in her sleep one night and next day my
> > downloaded self awoke, then in what way have I not continued? I
> > would still remember what went before in exactly the same way I do each
> > morning of my biological life.
>But it wouldn't be you -- it would be a copy of you; a separate person
>associated with a different corpus who just happens to be very similar to
>the (now extinct) you.
The you now is 'only' a very similar copy of yesterday's you that was
started up this morning. You are taking objectivity to an absurd level (I
mean that technically -- I am not intending to flame you).
Look at the situation subjectively for a moment and you can see the reality
of the situation: I go to sleep. I wake with all my thoughts, memories, and
emotions intact, but inside a virtual world. I no longer have a body in the
physical sense, but I can see and walk and talk like I could last night. It
is still me. The consciousness that was the broad set of actions performed
by the organic brain (before it died) is now being performed by a computer
inside a virtual world.
>I find belief in the identity of two copies to be a risky hypothesis.
You are talking in ideals and absolutes, but in the real world things are
not so clear-cut. This is where our discussion doesn't quite meet. I am
speaking practically. I am happy to rely upon a backup of my brain to carry
my thoughts, feelings and memories into the future, as my parents are happy
to use their children for that, and I used to think I would have to rely
upon my artwork and writing.
Yes, in some sense the backup is not *exactly* the same person as before,
but considering the changes I go through every day, I think it is a trivial difference.
The actual recording of a backup may carry some risk, certainly. But so does trying to
replace your brain bit by bit, hoping nothing goes awry in the process. In
the replacement scenario you are tampering with the only brain you have,
and a healthy one at that! The other way is *much* safer -- you get backups
that can be checked for accuracy and finally activated upon your biological death.
The two are not even exclusive. I can imagine that if the replacement was
proven to me to be totally safe I would have it done... but I would also
want backups in case the replaced me just happened to walk in front of a bus.
>there's no verifiable way to prove that the essential you will
>continue if you perform some sort of radical cut-and-paste operation like
>the one described above.
Kinda true, but there is no verifiable way to prove that the you who woke
this morning is the same one that went to sleep. And I am not being cute
here. There are occasions when people have strokes in their sleep and wake
with significantly altered brains. Are they the same person? It depends
largely on how you define sameness... and that easily becomes a matter of
playing with words.
Also I think you are harking back to the same mistake of thinking that the
illusion of continuity is a real phenomenon when you say "the essential you".
> The only way to see is
>to try it -- which will either kill you or not. I'd much rather stick with
>continuity through association with a given chunk of (slowly changing)
>matter -- at least I know I'm not doing anything there that will wind up
>causing me to cease to exist.
Both options carry risks. Though I can't help thinking that mucking around
with the only copy of my brain is much more dangerous -- there is no
fall-back position there. But I gotta say, if it turned out to be quite
safe I would certainly go for it.
Both options are infinitely less risky than the third option: dying.
Q. What is the similarity between an elephant and a grape?
A. They are both purple... except for the elephant.
Virtual Reality Association http://www.vr.org.au
This archive was generated by hypermail 2b30 : Fri Oct 12 2001 - 14:39:42 MDT