Re: The Most Basic Question (Life and Uploading)

Mike C.
Mon, 24 Mar 1997 22:51:14 -0500

>From: Anders Sandberg <>
>Date: Sun, 23 Mar 1997 21:11:13 +0100 (MET)
>Subject: Re: The Most Basic Question (Life and Uploading)
>On Sun, 16 Mar 1997, Mike C. wrote:
>> First I reply to:
>> >On Fri, 14 Mar 1997, Joseph Strout:
>> >>
>> >> On what basis would your upload not be you?
>> Control is a critical factor.
>> If my brain controls my actions
>> then a copy of my brain would control its actions
>> and would not be under my control.
>I think you are missing the point here. The other brain would think it
>was you and that you weren't it, since it didn't control your actions.
>There would simply be two of you.

They would not be me.
A would think it was A, B would think it was B.
Not two of me, one of each of them.
One would go by one name and the other would go by another name.
A would live its life.
B would live its life.
Their lives would be different.
They would evolve in different environments.
One side of the room can heal while the other side can harm.
A gains a proton while B loses an electron.
A reads from the net while B makes an interview on the phone.
They are exposed to different memes.
They evolve differently.
Once one thing happens to one that does not happen to the other,
they are no longer the same.
Ever hear of the snowball effect?
Imagine a small snowball rolling down a hill.
It gets bigger and bigger.
It only takes one small action to cause a landslide.
It only takes one straw to break a camel's back.
One difference, no matter how small, means they are not the same.
Different atoms = different people.
They would have many similar qualities.
They would not have all of the same qualities.
Yes, they may act alike for a while.
What if A's wife kills herself
and in despair it becomes a deathist?
That would not be me.

>> They evolve in the environment from differing perspectives.
>> I could predict its actions quite well from empathy
>> but to say that makes it me is difficult to believe.
>So, who would it be?

It would be it.

>I think it is important to distinguish between
>you as your current conscious thread and you as a person with certain
>memories, personality and goals.

A's exact structure and environment in spacetime is A's.
B's different exact structure and environment is B's.

>The copies after forking would be
>different threads of consciousness but would be the same person.
>We need a better terminology here.

OK, terminology for what?

>> If destructive scanning techniques were used it may ruin my brain.
>> My consciousness may consist of vaporized atoms.
>Hardly likely. Consciousness seems to be very linked with the integrity
>of the brain (or information processing system, for infomorphs).

Imagine your brain being vaporized.

Now imagine it happening vveerryy sslloowwllyy,
in slow motion, still being conscious.

Sound like fun?

>> Maybe the brain is not as important as I thought.
>> Could I still be me without a brain?
>> Pardon me while I get thrown for a loop.
>Yes, this is an important question. I define myself to be what I do (I
>see myself as a mapping A acting on the universe), so to me what kind of
>brain I have doesn't matter.

I agree,
but I worry about my sentience being linked to material substance
which would require transference.
What would I be feeling if my brain were removed?
I know my brain stem alone could keep my body alive until it got a new brain,
but I wonder where I would seem to be.
Would I be more aware of the stem, the grey matter, both, neither?

>> How far have I come when I do not willingly accept simple answers
>> to the most basic question?

I suppose I have come a long way by wanting more than simplicity.