Hal Finney <firstname.lastname@example.org> writes:
> Suppose you are an upload. Someone proposes to pause your program for
> a moment, then resume it. Will this kill you?
Yes. You would be killing me and then unkilling me. Suspending my program by pausing it is similar to the idea behind cryonics. When I die, deanimate, or cease to function, I will be suspended. Hopefully, someone will later reanimate or unsuspend me. That is what you are doing here: You are suspending and then unsuspending the program.
As with cryonics, if I am unsuspended at some point in the future, I would consider myself alive. If I am never unsuspended, and my remains deteriorate, then I would consider myself dead.
> (Note that such behavior
> is an inherent property of timesharing computer systems, and that in fact
> the discrete nature of computer simulations implies that there can be
> said to be a "pause" between each clock tick.)
I don't mind these theoretical pauses between ticks. What I mind is having myself stopped while the rest of the universe goes on without me. If I am stopping/starting at the same rate as the rest of the universe, then I will continue to experience life in real-time with no gaps or loss of experience.
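The point about ticks can be sketched in a few lines of Python (my illustration, not anything from Hal's post): from inside a discrete simulation, the state only ever sees tick boundaries, so the simulated history is identical no matter how much wall-clock time passes between steps.

```python
import time

def step(state):
    """Advance the simulated state by one discrete tick."""
    return {"tick": state["tick"] + 1}

state = {"tick": 0}
trace = []
for i in range(3):
    state = step(state)
    trace.append(state["tick"])
    time.sleep(0.01 * i)  # arbitrary wall-clock pause between ticks

# The recorded history is the same however long the pauses were:
print(trace)  # [1, 2, 3]
```

The pauses are invisible from the inside; the objection above is only to being paused while the rest of the universe keeps running.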
> Someone proposes to pause your program and then resume it on another
> machine. Will this kill you?
I don't know. This is the same question we are considering now. You can't really pause a program and unpause it on a different processor. You must stop the first program, load a copy of its state onto the second machine, and start that copy running there. What you are asking is the same as the original question.
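The stop-copy-start sequence described above is essentially checkpoint/restore. A minimal sketch in Python (the function names and state layout here are illustrative, not any real migration API):

```python
import copy

def checkpoint(process):
    """Serialize the running program's full state into a snapshot."""
    return copy.deepcopy(process["state"])

def restore(snapshot):
    """Start a new process from a saved snapshot, here on machine 'B'."""
    return {"state": copy.deepcopy(snapshot), "machine": "B", "running": True}

original = {"state": {"memory": [1, 2, 3], "pc": 42},
            "machine": "A", "running": True}

snapshot = checkpoint(original)   # 1. pause the first program and save its state
original["running"] = False       # 2. the original stops
migrated = restore(snapshot)      # 3. a copy starts on the second machine
```

Note that nothing "moves": the copy's state is identical, but it is a distinct process on a distinct machine, which is exactly why the question reduces to the original one.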
By using uploaded programs in your example, you are removing normal human life from the process. It is pretty obvious when a human is and isn't functioning. Programs start and stop all the time. Copies of them are run here and there. Are they the same program, different instances of the same program, or different programs? These are semantic questions about what "sameness" means. I would not be willing to let a program running my consciousness be terminated, even if a copy of it were to be started somewhere else. The choice to kill my program is based on my program alone. I don't care what other programs are also running; my program does not want to be killed.
> Suppose the computer has redundancy internally so that everything is
> duplicated, two copies of each processor side by side, likewise for memory
> elements, communication circuits, etc. You are run on such a computer.
> Someone proposes to remove some element of the redundancy so that
> there will no longer be this internal duplication. Will this kill you?
Semantics again. There are two copies of me running. If you kill one, it will die and there will only be one copy of me left running. Ask the identical copies which process should be killed, and I think they will give the identical answer: Kill the other one!
> Suppose the two copies are run on two different computers. They are in
> perfect synchrony and all signals on one are duplicated on another. This
> is another way of providing redundancy. Someone proposes to turn off
> one of the computers permanently. Will this kill you?
Same as above. You are postulating two of me. I have 2x processing power and am located in two distinct locations. If you kill one of me, I have lost a full person's worth of processing power, one person's amount of disk space, one person's CPU chip. My ability to experience consciousness in the terminated location has stopped. The loss is identical to the killing of a single instance of me. A person watching the one computer get killed will not be able to detect any difference between your example and a simple killing of the single entity known as me. The fact that miles away there is some other program running does not change the events for the one computer one bit.
> The last scenario seems essentially to represent the situation you describe
> where you would view it as committing suicide.
I agree. This scenario is the same as my suicide question. That's why my answer is the same. You probably expected my answer to be different, but I'm not sure why.
What is it about the copy that makes me lose my will to live? Is it the knowledge that I will have left my mark on society? I don't care about leaving a mark, I care about continuing to live. Is it that the copy can do anything I can do, so I am no longer unique? I don't care. My desire to live is not dependent on being unique. I will not be so depressed upon seeing a copy that I kill myself. Is it that the copy can predict what I will do, and therefore I don't have to do it? Is it that no one else could tell the difference, and if no one else cares, why should I care either? What information makes me change my mind about wanting to live?
--
Harvey Newstrom <mailto:email@example.com>  Author, Engineer, Entrepreneur,
<http://www.gate.net/~harv>  Consultant, Researcher, Scientist.
<ldap://certserver.pgp.com>