From: Lee Corbin (lcorbin@tsoft.com)
Date: Sun Jun 29 2003 - 12:00:26 MDT
Brett writes
> Giulio writes
>
> > So, operationally from the outside and from the inside the
> > uploaded copy is the original. I wonder then what the meaning
> > of "but the uploaded copy is not REALLY the original" can be.
>
> I'm thinking functional equality is *possibly* not the same as identity.
> The contrary view appears to me to be that it *necessarily* is.
You are correct. Functional equality is not *necessarily*
the same as survival-identity. In my opinion, a giant lookup
table made of hyper-computronium would be functionally equivalent,
but not conscious at all.
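To make the lookup-table point concrete, here is a toy sketch
(Python, with a tiny domain standing in for the impossibly large
real one; the names are purely illustrative). Both callables agree
on every input, so they are functionally equivalent there, yet the
second does no arithmetic at run time:

    def compute_square(n):
        # Actually performs a computation.
        return n * n

    # Precompute every answer in advance -- the "giant lookup table",
    # shrunk here to a small domain.
    DOMAIN = range(1000)
    TABLE = {n: n * n for n in DOMAIN}

    def table_square(n):
        # No computation here, only retrieval of a stored answer.
        return TABLE[n]

    # Functionally equivalent over the whole domain:
    assert all(compute_square(n) == table_square(n) for n in DOMAIN)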
> > I feel like me because I remember the things that I remember
> (including what I don't consciously remember at this moment)
> > or, in other words, because of the specific information coded
> > in my brain. This is I believe the simplest explanation.
>
> I think this is just a restatement of the "there is no possible difference"
> case. I'll grant the duplicated you would feel like they were you.
> I won't grant that they *are* you just because they feel like they
> are.
Just to clarify my position, I do agree with Giulio here.
> I will grant that to me and everyone relating to you the duplicate
> will be satisfactory. I am more "selfish" when it comes to me.
Absolutely! This is the *crux* of the real philosophical problem!
We ought to posit the existence of a Most Selfish Individual, and
ask to what well-informed decisions he would come.
> I don't care whether you and my duplicate and everyone else in the world
> agree that the transformation that produces my duplicate has also
> produced me; I *care* that *I* am not *sure* that it is so beforehand.
Yes, exactly. True, there are some people whose altruism is so finely
woven into the rest of their decision making and belief system that it
cannot be extracted, and they misinterpret the philosophical problem.
But we are talking about a survival-oriented, selfish view of what is
"in it" for an individual.
> > Yes; in every way your uploaded copy---or even you if you
> > are disintegrated at 10am tomorrow morning and then instantly
> > re-integrated using the same or different atoms---will have
> > this same impression. It could, even now, be happening a
> > hundred times a second.
>
> When you say "could" what are you basing your view on?
Suppose that there existed a machine that could decompose you into
your constituent elements within a microsecond. (That is still
thousands of times faster than any nerve signal can cross your
body.) Then suppose that a few microseconds later, it reconstitutes
you. Now suppose that this entire process happens hundreds of times
per second. There we are.
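For concreteness, a back-of-the-envelope check in Python (the
velocity and body-length figures are rough assumptions, nothing
more):

    nerve_velocity = 100.0   # m/s -- fast myelinated fiber (assumed)
    body_length = 2.0        # m -- head to toe (assumed)
    transit = body_length / nerve_velocity   # ~0.02 s = ~20,000 microseconds

    decompose = 1e-6         # s -- the hypothetical machine
    # The machine finishes roughly 20,000 times faster than any nerve
    # signal could travel the length of your body:
    print(transit / decompose)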
And this is NOT the place to inquire after the actual physical
plausibility of such a device. Yes, our notions of who and what we
are do depend on our basic understanding of physics, but whether
a certain technical breakthrough that is not conceptually
significant ever occurs is moot.
> > > There are still thinkers thinking thoughts. Of course you
> > > can accept this as a partial "answer" only if you believe
> > > that each consciousness is fundamentally the same, in
> > > other words that there is no physical or spiritual "signature"
> > > other than information that defines the difference between
> > > you and me.
>
> My thesis contains no "spiritual signature"; it contains recursive
> biological programs that require somewhere to store not just
> thoughts but thoughts about thoughts, and thoughts about
> thoughts about thoughts, etc. Possibly this requires some sort
> of recursion counter and a developmental process. I.e., you
> can't have 'thoughts-about-thoughts' until the wetware has
> developed enough to have 'thoughts' and 'feelings' first, and
> so on. I think children have thoughts and feelings before they
> have a sense of self, before they become what we take
> to be fully sentient.
>
> Now the question is: once an adult level of sentience is achieved,
> can you capture the recursion counters in the wetware? Can you take
> a snapshot of the conscious process and memories and restore it,
> either onto an identical wetware substrate or onto a different set
> of firmware in an upload? I don't know.
Why not? Are you suggesting that it will *never* be possible to
root out every last property of the human brain? And what about
the easy way: a copy is made by a nanotechnological device that
simply gets all the atoms? As for uploading, we will have to
wait to see what level of sophistication is truly required. It
may be that stripping off one neuron at a time, noting its
connections, and making sure that the machine implements an
equivalent architecture is sufficient. In *this* particular
case, I do believe that functional equivalence is adequate to
guarantee that the consciousness, intelligence, etc., are the
same---i.e., it feels that it's you just the way you do; it is
a distinction without a difference, etc. That's because it
makes the same computations you do (which isn't true of the
giant lookup table, where IMO no computations occur at all).
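A toy sketch of that neuron-at-a-time idea, in Python; everything
here (the Neuron class, the two-pass copy) is hypothetical. The
only point is that what gets preserved is the connection graph---
the functional architecture---not any particular atoms:

    class Neuron:
        def __init__(self, neuron_id):
            self.id = neuron_id
            self.synapses = {}   # target neuron id -> connection weight

    def copy_brain(source_neurons):
        # Pass 1: re-create every node.
        replica = {n.id: Neuron(n.id) for n in source_neurons}
        # Pass 2: re-create every connection with its weight.
        for n in source_neurons:
            replica[n.id].synapses = dict(n.synapses)
        return replica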
Lee