---Jsn@concentric.net [John Novak] wrote:
> Perhaps your definition of identity is lacking.
> But I really can't tell, since nowhere in your proof by repeated
> assertion did you define it,
My first shot at defining identity is as follows:
Identity - A slightly dynamic but mostly stable fuzzy area within the design space of all possible information processors as defined by the ego.
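To make that definition a little more concrete, here is a toy sketch in Python. Everything in it is my own invention for illustration (the distance measure, the radius, the coordinates); it is not meant as a rigorous model, just a picture of identity as a fuzzy area around the ego's best estimate of its own location in design space:

```python
# Toy sketch only: the linear falloff, the radius, and all the
# coordinates below are invented for illustration.

def fuzzy_membership(point, center, radius):
    """Degree (0..1) to which a point in 'design space' falls inside
    the ego's fuzzy area around its estimated center."""
    distance = sum((p - c) ** 2 for p, c in zip(point, center)) ** 0.5
    # Linear falloff: 1.0 at the center, 0.0 at or beyond the radius.
    return max(0.0, 1.0 - distance / radius)

# The ego only knows its location approximately, which is why the
# area has a nonzero radius instead of being a single point.
me_now   = (0.0, 0.0, 0.0)
me_later = (0.1, 0.2, 0.0)   # slight drift: still well inside the area
stranger = (5.0, 5.0, 5.0)   # far outside the fuzzy area

print(fuzzy_membership(me_later, me_now, radius=1.0))  # high: same identity
print(fuzzy_membership(stranger, me_now, radius=1.0))  # 0.0: different identity
```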
> nor make any references to the argument
> that I have made against you--namely, that ego is not conserved, it is
> multiplied and the result is not one thousand minds with one ego and
> self preservation instinct, but one thousand minds with one thousand
> egos and one thousand distinct self-preservation instincts.
I believe the fuzziness in the above definition is determined by the ego. It is fuzzy because the ego has limited knowledge about its exact location in the design space. The ego is the sense of ownership/selfishness that we have about this particular design space. Under most circumstances the ego insists that one and only one point plot exists in this area; in my previous post I gave many reasons why, along with some special circumstances for the exceptions. For example, planned space travel that means giving up most friends, family, property, and normal activity in familiar places causes a drift so fast that the sharing of multiple point plots within the fuzzy area would be short lived. Likewise, situations where multiple point plots exist for only ten hours are another instance where sharing the fuzzy area would be tolerable.
Normally, as long as there is one and only one point plot in the fuzzy area, my ego and self-preservation instincts are intact. This is the opposite of what you are saying, because it is the ego and self-preservation instinct of my identity that drives me to ensure there is only one point plot in the fuzzy area. In order to self-preserve and keep my ego intact, 999 of my point plots would not hesitate to commit voluntary amnesia.
With the above definition it is clear to me that we are in fact talking about 1000 minds with one ego, because the ego defines identity as the fuzzy area where all 1000 point plots reside. Even when all 1000 entities occupy different points, as long as every ego's fuzziness is perceived to intersect, only one ego exists. All 1000 point plots realize that 999 of them must go, and whether or not one particular copy is willing to snuff it is a simple matter of perfect and safe 10-hour amnesia, not something worth fighting your own ego over.
In fact I'll make an argument in just the opposite direction. Let's say one of the 1000 workers, Mr. 69, lucked out and had two hours of the hottest sex he has ever experienced. Now he is going to call Mr. Original, who has been home unpacking, and tell him all about it. Mr. Original is going to be thinking, "According to our selection criteria this guy has achieved a higher fitness within our fuzzy area than I have." Mr. Original is now hard as a rock knowing he can easily talk Mr. 69 into trading places and letting him (Mr. Original) be the one to commit voluntary amnesia. In fact Mr. Original is extremely anxious to get on with the process so he can begin pondering the recollection of his recent hot sexual experience.
------[Joe Jenkins wrote]
> >many otherwise rational people become an emotional wreck when dealing
> >with these issues. Witness the following:
> Well thanks.
Sorry, I didn't mean to single you out here. This debate is making me an emotional wreck.
> >Mark, I appreciate and empathize with your comments. However, in so
> >far as the safety and reliability of the technology is trusted
> >"[having] a change of heart" is totally irrational.
> Look, your definition of identity is fundamentally flawed, and no
> amount of technology will change that. A copy is only a duplicate of
> you at precisely the instant it's copied, and immediately begins to
> diverge; if the copy is destroyed at that point, it's dead.
I gave my precise definition of identity above. What part of it is flawed?
> Let's give a better example than yours. Suppose you decide that the
> weather is too nice to work, so you create a copy and send it to the
> beach; you'll go to work and then kill yourself, so the copy will take
> your place the next day.
> Will you kill yourself at the end of the day? This example is
> equivalent to yours except that you get to kill yourself; you don't
> get to claim without evidence that your copies will do so.
In this case I would participate in a discussion with my copy to determine, according to our selection criteria, which of us had achieved a higher fitness within our fuzzy area. Most likely, since he was at work presumably making undocumented (i.e. not mundane) accomplishments and I was out goofing off, presumably not getting lucky or anything, I would concede (in my own best interest) that he achieved the higher fitness, and quickly commit voluntary amnesia.
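A rough sketch of what that discussion boils down to. The names, the scores, and the scoring rule here are purely hypothetical; the point is only that the choice of which point plot survives is a mechanical comparison, not a fight:

```python
# Hypothetical illustration: the names, fitness scores, and the idea
# that fitness is a single number are all invented for this sketch.

def select_survivor(copies):
    """Given (name, fitness) pairs, return the copy that keeps its
    memories and the list of copies that commit voluntary amnesia."""
    survivor = max(copies, key=lambda c: c[1])
    amnesiacs = [name for name, _ in copies if name != survivor[0]]
    return survivor[0], amnesiacs

copies = [("Mr. Original", 0.4), ("Mr. 69", 0.9), ("Mr. 432", 0.3)]
survivor, amnesiacs = select_survivor(copies)
print(survivor)   # the highest-fitness copy keeps its day's memories
print(amnesiacs)  # everyone else reverts, preserving the single ego
```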
> Are you really that suicidal?
Voluntary amnesia = suicide? 1 = 1,000,000? Is this true? Quite the contrary: the motivation behind committing voluntary amnesia comes from a strong desire to preserve my singular ego.
---John Clark <firstname.lastname@example.org> wrote:
> If the physical state of my brain is in a state that causes me
> to think that I will be dead in 60 seconds and if my semi copy
> does not think he will be dead in 60 seconds or even 60 years,
> then a physical process in my brain is not being emulated by my
> "copy". When a bullet stops that physical process the sense of
> identity it produces at that moment is destroyed also because
> it has no backup. I'm not saying John Clark would be dead, the
> other fellow would have just as much right to that title as I
> but as I stared at the gun I would know that the very focus of
> my existence, that is, the thoughts I'm having right now, would
> not continue. This would scare the hell out of me and I'll bet
> you dollars to donuts it would scare you too.
I can't tell you if dollars will be worth more than donuts by the time this experiment is possible. I can tell you I'd bet you either way no matter what the ratio.
> I would only be
> happy if my copy were made right now, but how long is "now"?
> I think it would depend on the intensity of your conscious
> experience at the time (staring down the barrel of a gun is
> pretty intense), but on average I think "now" is about a second
> or two long.
In the terminology I've laid out above, what you've just said is that your ego is able to resolve your design space to the degree it changes, on average, in a second or two, all while we mortal humans use averaging techniques to smooth out the state changes in the variables of our consciousness in order to process our ego function. Ego is not equal to consciousness. You're saying your particular ego has complete omniscient knowledge of where it resides in that design space. That proves it: you are not just a genius, your ego alone is an all-knowing conscious entity; I'd pretty much say you are God. This is incredible, debating with God himself. Please tell me, what is the purpose of the universe ;)
> So why don't I find amnesia as frightening as the thought
> experiment with the gun and the imperfect copy? Because in
> the real world you only find out about amnesia after it happens.
> I wake up in a hospital and the doctor tells me that yesterday
> someone hit me on the head with a baseball bat; he says I'll be OK
> but I've permanently lost all memories of the last week. I think no
> big deal, I feel fine now, and if I comb my hair over my gash you can
> hardly see it. Perhaps the John Clark of one week ago is dead, but
> I don't care because I'm not him, I'm the John Clark of right now.
> On the other hand, if I see you coming bat in hand ready to do
> some more amateur brain surgery, I'm going to run like hell.
Me too. However, in both my example of the 1000 copies and the thought experiment with the flashlight-type device, safety and pain were not an issue.
> John has been making the same kinds of arguments on this list for as
> long as I remember, and I've yet to see anyone show a flaw in his
> reasoning.
In my earlier post I was being facetious about questioning his identity, and I also agree that he has been consistent. I am, however, surprised at his assessment of the power of his ego to resolve such a small fuzzy area of design space. I don't believe for a second that it is humanly possible; he is virtually claiming that the fuzzy area, for him, is actually a single point. I do not think this is a valid definition of identity. Analysis of John Novak's post reveals his belief in the same single-point theory of identity. To prove your case you must prove your ego is at least as real-time dependent as your consciousness. I will then ask you why there is something rather than nothing. Then I will worship you both in hopes of being uplifted ;)
To save time and effort please allow me the following crude simulation of the next phase of this debate:
---Mr. Jenkins's simulation of the merged minds of Mr. Clark and Mr. Novak wrote:
>> I would hardly call it omniscient, being able to resolve the
>> difference between the John Clarvak staring down the barrel
>> of a gun and the John Clarvak with a bright future in front of him.
The ego is capable of measuring a multitude of variables to resolve the fuzzy area of design space. The design space consists of state variables and processing algorithms. Your consciousness would indeed be very aware of this difference, because it is especially sensitive to state-variable changes. But the nature of your ego is such that state-variable changes such as stress are averaged out, in order to allow for processing of many longer-term variables and detailed assessment of algorithms. Observing a single pixel change in a 1280 x 1024 image is one thing, but saying you would identify this as a different picture is out of the question.
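To put rough numbers on the pixel analogy (the 1% "different picture" threshold below is purely hypothetical, invented just to make the comparison concrete):

```python
# Numerical version of the pixel analogy. The IDENTITY_THRESHOLD
# value is a made-up illustration, not a claim about real egos.

WIDTH, HEIGHT = 1280, 1024
total_pixels = WIDTH * HEIGHT

changed_pixels = 1  # consciousness can notice a single changed pixel
fraction_changed = changed_pixels / total_pixels

# The ego averages over the whole image, so one pixel barely moves
# the needle: less than a millionth of the picture is different.
IDENTITY_THRESHOLD = 0.01  # hypothetical: 1% change = "a different picture"
same_picture = fraction_changed < IDENTITY_THRESHOLD

print(fraction_changed)  # roughly 7.6e-7
print(same_picture)      # still the same picture
```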