Good exchange with Eliezer

From: Mark Gubrud (mgubrud@squid.umd.edu)
Date: Wed Mar 29 2000 - 18:50:25 MST


Eliezer:

Please post my complete response to extropians. I would prefer that, in the
future, you not post incomplete selections of my arguments to other lists.

> From: "Eliezer S. Yudkowsky" <sentience@pobox.com>
>
> I don't know if you're aware of this, but the atoms within the brain do
> not remain constant.

Duh.

> So whatever "we" are, it's clearly not the atoms.

The problem with this statement is that you are already investing a lot of
potential meanings into the word "we." Each of us exists AT THE PRESENT
MOMENT as a unique physical construction of atoms. We also have the fact
of continuous identity from birth to death. This is a simple enough
matter as long as you don't introduce all these bizarre scenarios we are
talking about. But as soon as you do, the "continuity of identity"
becomes problematic. Whereas the physical existence of a body, a brain,
in some compact region of space-time, remains a fairly simple and
unambiguous fact.

> If you scan in one neuron at a time, destroy the old neuron, and replace
> it with a robotic interface to equivalent computations taking place
> elsewhere, you can migrate, one neuron at a time, to the new substrate,
> without ever losing consciousness.

This is Moravec's scenario; see my response to "Max's argument" on this
list.

> In this case, would it not be
> possible to say that the new individual is unambiguously "you",

It is possible to say this, but that does not make it unambiguous. For
example, what if you make two copies simultaneously? There is no apparent
reason within this scenario why you can't make as many copies as you like.
Which one, then, is "you"? What if two copies are made, and then one is
scrapped? Do "you" automatically become the one that remains, or is there
a risk that "you" wind up dead? What if 1,000,000 copies are made, and
then 999,999 of them are scrapped? Do "you" automatically become the one
that was randomly chosen to survive? How much difference would it make to
"you" if all of the copies were scrapped? Are "you" then left flying
around in the void without a body to migrate to? Or is this all just a
bunch of superstitious nonsense?

> We discuss only the technical possibility.

No, you also construct a quasi-religion within which this technical
possibility is made to seem humanly desirable.

> And yes, if you can move an identity into a computer, then you can
> create duplicates of the identity as well as moving it.

Now you resort to the word "identity," but you use it in exactly the same
sense as others use the word "spirit." But what does the word "identity"
mean, if there can be "duplicates"?

> Your preconception
> of the mysterious spirit requires that it not be subject to copying;

I don't believe in "spirits" (unless we use that term to mean actual,
physical human beings). But you apparently do:

> you use the fact that a given technical procedure could copy a
> spirit

I do not believe this "fact."

> to prove by an alleged reductio ad absurdum that the same
> technical procedure cannot move it, or that the process of moving only
> creates a copy.

What else does it do, besides making a copy? In what sense is my
description of the process as "making a copy" incomplete? What THING
"moves"?

> If you replace one neuron at a time with a
> biologically equivalent neuron, or just wait a few years while the cells'
> metabolisms churn, you have moved the identity from one set of atoms to
> another. If you replace one neuron at a time with a computational
> equivalent, you have moved the identity from one set of atoms to
> another.

What does such an "identity" look like? Some kind of bluish light?

> If you read out the information and create a functionally
> equivalent copy, you have created two identities, neither of which has
> any stronger claim to being the "real you".

Now you're beginning to think.

> In short, your argument rests on a sleight-of-hand.

Moravec's argument is sleight-of-hand. Ditto Max More's. And these guys
have created an entire cult movement out of this gibberish.

> If you
> nondestructively copy all the information in the brain, creating a
> duplicate, then the impulse is strong to identify the original set of
> atoms as the "real you", implying that the second set of atoms is a
> "fake you" - and if a nondestructive upload "merely creates a copy",
> then surely a destructive upload also "merely creates a copy".

Very good!

> If one
> set of atoms maintains physical identity, and the other doesn't, then the
> temptation is very strong to say that the first set of atoms has a
> stronger claim to informational identity.

Too much inflated jargon here, but again you are more or less on target.

> But this is a mere anthropomorphism.

BINGO! And just what sort of morphism are YOU, my friend?

> If the informational equality is perfect - i.e., with
> margins of error substantially below the ordinary "noise level" of the
> brain's computational environment - then both sets of atoms have an
> equally strong claim to being the "real you".

Only because the claim is, in either case, meaningless! Unless you allow
the meaning that hands the claim to the "original," which would be the
anthropocentric thing to do. Long live anthropocentrism!

> They are both the original. They are both copies.

Yeah, except one is the original and the other is the copy.

> The atoms are irrelevant.

If you say so.

> Who says there's such a thing as an observer-independent definition of
> personal identity, anyway?

Provided we do not introduce copying, there is no ambiguity in assigning a
unique personal identity to each human life from birth to death. But this
is clearly a construction. As soon as you introduce the possibility of
copying, you do violence to this construction. You unmask the void. You
reduce human life to an abstract, meaningless absurdity.

"Uploading" is nonsense. It can be justified only in terms of a mystical
faith in the implicit meaning of phrases like "informational identity" or
"algorithms of consciousness" or "you wake up in your new body." If we
look at what is going on from a materialistic viewpoint, we confront the
horror of the void. Indeed, the flip side of the "transhumanist movement"
is sheer nihilism.

Yours,
Mark
