Mark Gubrud wrote:
> I agree it is in principle possible, within known physics, to copy the
> pattern of function and interconnection of the brain, and to implement a
> functional equivalent in some type of computer. Such a process could be
> termed "copying the contents of a brain."
I don't know if you're aware of this, but the atoms within the brain do
not remain constant. I forget what the exact rate of turnover is, but
something like 50% of the atoms get replaced every six months. So
whatever "we" are, it's clearly not the atoms.
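The arithmetic behind that point is worth making concrete. Taking the post's own rough figure of 50% turnover every six months (an estimate, not a measured physiological constant), the fraction of your original atoms decays exponentially with a six-month half-life:

```python
# Toy illustration, not a physiological model: assume ~50% of atoms
# are replaced every six months, i.e. a six-month half-life.
def fraction_remaining(years: float, half_life_years: float = 0.5) -> float:
    """Fraction of the original atoms still present after `years`."""
    return 0.5 ** (years / half_life_years)

for years in (0.5, 1, 2, 5):
    print(f"after {years:>3} yr: {fraction_remaining(years):.4f}")
```

After five years, under this assumption, less than a tenth of a percent of the original atoms remain, yet nobody claims you are a different person.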
If you scan in one neuron at a time, destroy the old neuron, and replace
it with a robotic interface to equivalent computations taking place
elsewhere, you can migrate, one neuron at a time, to the new substrate,
without ever losing consciousness. In this case, would it not be
possible to say that the new individual is unambiguously "you", with the
change of physical substrate as irrelevant as the turnover among your
constituent amino acids?
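The migration argument above can be sketched in a few lines of code, if we grant the (large) assumption that "functional equivalence" means identical input/output behavior. The neurons here are just multiplicative toys, not neuroscience; the point is only that swapping equivalent components one at a time never changes the whole system's behavior at any step:

```python
# A minimal sketch of neuron-by-neuron migration, assuming functional
# equivalence = identical input/output behavior. Each "neuron" is a
# toy function parameterized by a weight.
biological = [lambda x, w=w: w * x for w in (2.0, 3.0, 5.0)]

def network(neurons, signal):
    # Feed the signal through each neuron in sequence.
    for neuron in neurons:
        signal = neuron(signal)
    return signal

baseline = network(biological, 1.0)  # 2 * 3 * 5 = 30.0

migrated = list(biological)
for i, w in enumerate((2.0, 3.0, 5.0)):
    # Swap in a computationally equivalent replacement neuron.
    migrated[i] = lambda x, w=w: w * x
    # At no point during the migration does behavior change.
    assert network(migrated, 1.0) == baseline
```

At the end of the loop every component has been replaced, yet there was no single step at which the system's behavior, and hence (on this view) the identity, was interrupted.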
> How is your notion of "migrating consciousness" different from the ancient
> idea that there exists a "spirit" which can possibly move from one body to
> another? If this is not what you believe, shouldn't you make this clear
> by not adopting language that will be taken as implying that this is what
> you propose to achieve, by means of technology?
Nobody cares about what the ancient ideas are. The comparison is
wholly irrelevant, like making fun of airplanes by comparing them to the
legends of Daedalus or the chariot of the Sun. We are discussing only
what is physically and technically possible.
And yes, if you can move an identity into a computer, then you can
create duplicates of the identity as well as moving it. You're the one
bound by the ancient preconceptions, not Samantha. Your preconception
of the mysterious spirit requires that it not be subject to copying;
then, you use the fact that a given technical procedure could copy a
spirit to prove by an alleged reductio ad absurdum that the same
technical procedure cannot move it, or that the process of moving only
creates a copy.
If you take a brain and put it in a blender, you destroy the
informational identity. If you replace one neuron at a time with a
biologically equivalent neuron, or just wait a few years while the
cells' metabolisms churn, you have moved the identity from one set of
atoms to another.
another. If you replace one neuron at a time with a computational
equivalent, you have moved the identity from one set of atoms to
another. If you read out the information and create a functionally
equivalent copy, you have created two identities, neither of which has
any stronger claim to being the "real you".
In short, your argument rests on a sleight-of-hand. If you
nondestructively copy all the information in the brain, creating a
duplicate, then the impulse is strong to identify the original set of
atoms as the "real you", implying that the second set of atoms is a
"fake you" - and if a nondestructive upload "merely creates a copy",
then surely a destructive upload also "merely creates a copy". If one
set of atoms maintains physical identity and the other doesn't, then the
temptation is very strong to say that the first set of atoms has a
stronger claim to informational identity. But this is a mere
anthropomorphism. If the informational equality is perfect - i.e., with
margins of error substantially below the ordinary "noise level" of the
brain's computational environment - then both sets of atoms have an
equally strong claim to being the "real you". They are both the
original. They are both copies. The atoms are irrelevant.
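The "margins of error below the noise level" criterion can be made precise as a tolerance comparison. Everything here is a hypothetical stand-in: the state vectors, the noise floor, and the component-wise check are illustrative assumptions, not a proposed measure of brain state:

```python
NOISE_LEVEL = 1e-3  # hypothetical noise floor of the computational substrate

def same_identity(state_a, state_b, tol=NOISE_LEVEL):
    """Treat two states as informationally equal if every component
    differs by less than the substrate's noise level."""
    return all(abs(a - b) < tol for a, b in zip(state_a, state_b))

original = [0.12, 0.75, 0.33]
upload   = [0.12 + 2e-4, 0.75 - 1e-4, 0.33]  # sub-noise deviations only
print(same_identity(original, upload))  # True
```

On this criterion the two states are indistinguishable by any test the substrate itself could perform, which is the sense in which both sets of atoms have an equal claim.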
Who says there's such a thing as an observer-independent definition of
personal identity, anyway? Maybe someday we'll discover one, but we
certainly have no experimental basis for asserting one at present -
just a set of preconceptions derived from the "ancient idea".
"But I am not an object. I am not a noun, I am an adjective. I am the
way matter behaves when it is organized in a John K Clark-ish way. At
the present time only one chunk of matter in the universe behaves that
way; someday that could change."
-- John K Clark
--
firstname.lastname@example.org
Eliezer S. Yudkowsky          http://pobox.com/~sentience/beyond.html
Member, Extropy Institute     Senior Associate, Foresight Institute
This archive was generated by hypermail 2b29 : Thu Jul 27 2000 - 14:06:45 MDT