From: Brett Paatsch (paatschb@optusnet.com.au)
Date: Sun Jun 29 2003 - 10:40:20 MDT
Lee Corbin writes:
> Giulio writes
>
> > So, operationally from the outside and from the inside the
> > uploaded copy is the original. I wonder then what the meaning
> > of "but the uploaded copy is not REALLY the original" can be.
I'm thinking functional equality is *possibly* not the same as identity.
The contrary view appears to me to be that it *necessarily* is.
> > I feel like me because I remember the things that I remember
> > (including what I don't consciously remember at this moment)
> > or, in other words, because of the specific information coded
> > in my brain. This is I believe the simplest explanation.
I think this is just a restatement of the "there is no possible difference"
case. I'll grant the duplicated you would feel like they were you.
I won't grant that they *are* you just because they feel like they
are. I will grant that to me and everyone relating to you the duplicate
will be satisfactory. I am more "selfish" when it comes to me. I don't
care whether you and my duplicate and everyone else in the world
agree, after the transformation, that the process which produced my
duplicate has produced *me*; I *care* that *I* am not *sure* that it
is so beforehand.
> Yes; in every way your uploaded copy---or even you if you
> are disintegrated at 10am tomorrow morning and then instantly
> re-integrated using the same or different atoms---will have
> this same impression. It could, even now, be happening a
> hundred times a second.
When you say "could", what are you basing your view on?
That your atoms change throughout your life? Granted, but not all
at once.
That the clinically dead are sometimes revived? Granted,
but not after being comprehensively dismantled.
Spores and nematode worms can be brought back to life
after being frozen simply by warming them up, but this doesn't
demonstrate consciousness returning. It suggests it might. If there
is no important difference between consciousness and whatever
passes through the neurons of the worm, AND if it doesn't matter
that the atoms are *all* exchanged at once, then you are correct.
But in both of the above cases of frozen organisms reanimating, the
atoms were only exchanged to a negligible degree.
> > I think when one dies consciousness goes on, even if that
> > particular individual stream of consciousness stops.
This actually seems to me to come closer to a belief in a soul or
astral travel than the alternate view does. You're positing that the
substrate (or the lack of one for a time) doesn't matter. I think
consciousness is like a running, self-referencing program or set
of interrelated programs: it requires a substrate on which to run.
Can it survive a full substrate change all at once (not having a
substrate for a time)? That, I contend, *we* do not know.
> > There are still thinkers thinking thoughts. Of course you
> > can accept this as a partial "answer" only if you believe
> > that each consciousness is fundamentally the same, in
> > other words that there is no physical or spiritual "signature"
> > other than information that defines the difference between
> > you and I.
My thesis contains no "spiritual signature"; it contains recursive
biological programs that require somewhere to store not just
thoughts but thoughts about thoughts, and thoughts about
thoughts about thoughts, etc. Possibly this requires some sort
of recursion counter and a developmental process. I.e., you
can't have 'thoughts-about-thoughts' until the wetware has
developed enough to have 'thoughts' and 'feelings' first, and
so on. I think children have thoughts and feelings before they
have a sense of self, before they become what we take
to be fully sentient.
Now the question is: once an adult level of sentience is achieved,
can you capture the recursion counters in the wetware? Can you take
a snapshot of the conscious process and memories and restore
it, either onto an identical wetware substrate or onto a different
set of firmware in an upload? I don't know.
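To make the program analogy concrete, here is a toy sketch only (the
names Mind, think and reflect are invented for illustration, and
Python's pickle stands in for whatever a real "snapshot" would be;
this is not a claim about how minds actually work):

import pickle

class Mind:
    def __init__(self):
        self.thoughts = []    # level-0 thoughts
        self.meta_level = 0   # a crude "recursion counter"

    def think(self, content):
        self.thoughts.append((0, content))

    def reflect(self):
        # A thought about the most recent thought: one level up.
        self.meta_level += 1
        last = self.thoughts[-1]
        self.thoughts.append(
            (self.meta_level, f"I notice that I thought: {last[1]!r}"))

original = Mind()
original.think("the water is cold")
original.reflect()

# "Snapshot" the running state, then "restore" it onto a new
# substrate (here just new Python objects in the same process).
snapshot = pickle.dumps(original)
restored = pickle.loads(snapshot)

# The restored copy passes every test over its stored state...
assert restored.thoughts == original.thoughts
assert restored.meta_level == original.meta_level
# ...yet it is, trivially, a different object.
assert restored is not original

The restored copy is functionally indistinguishable by any test over
its stored state, yet it is a different object; whether that
distinction matters for a conscious process is exactly what I'm
saying *we* do not know.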
> > So when I die "I" will continue as another thinker thinking
> > other thoughts and remembering other things: this "I"
> > (weak I) is already preserved. If what I am interested in
> > is preserving a "strong I" (the thinker who is thinking these
> > specific thoughts and remembering these specific things),
> > then I do not see anything wrong with uploading when it is
> > technically feasible.
I see nothing *morally* wrong with uploading. Even if it doesn't
work, if all my friends and family do it, and it works well enough,
then, old solipsist that I am, I'm better off than if they were gone
completely. It's a victimless crime if their original "strong I" was
going to die anyway.
And Lee is right, I suspect, in holding that the memesets that
*think* they will survive uploading will stand a better chance
than those that doubt, if such belief translates into decisions
to act or not to act.
- Brett Paatsch