Re: >H Re: Just talks or real work?

Randall R Randall
Sat, 11 Jul 1998 17:50:49 -0400

John K Clark wrote:
> On Wed, 8 Jul 1998 Wrote:

>>>From my "Waiting for Zed"

>Clearly there are two atoms, they have twice the mass of one atom for one
>thing, but they're interchangeable in ALL circumstances, that is, if they
>are exchanged nothing happens. To repeat, when atoms are exchanged no
>phenomena change in any way, including the phenomenon of consciousness, a
>good thing too because from birth the atoms in our body are in a constant
>state of flux yet we remain the same person... I think.

I woulda said, "...I hope." :)

>>>Atoms have no individuality, and if they can't even give themselves
>>>this property I don't see how they can give it to us.

>>Atoms cannot be wet, either, or soft, or...but you see the point.

>Yes I believe I do, atoms have no individuality but by arranging them in
>large complex patterns you can make an object that does behave in an
>individual way, and of course a pattern is defined by information. Atoms
>are generic but there is only one place where they are put together in
>such a way that they behave in a John Clarkish way, at least only one
>place so far.

When there are others, each of you will be able to tell that the others are "not me".

>>*Half* of the music being played does indeed stop, though. You can
>>measure the volume and notice a lower level, so *something* is gone.

>And if I erase an Email message on my computer that you sent me but you
>retain a copy (or should I say "the original"?) on your machine, does
>that mean half the message is gone, are 50% of the ideas in it reduced to
Half of the total information under consideration is gone. It can usually be replaced easily, but if for some reason it could not, and you needed it, you might realize that the copy you deleted was indeed unique.

>>I would say that we are a process running on an object. :)

>But lots of objects would do just as well, but if we change the process we
>change who we are. You're not really trying to argue that the reason I'm
>me and you're you is that there's something special about our particular
>atoms, are you?

No. I am arguing that continuity of consciousness is required to be sure that the uploaded one is the actual original consciousness, and not a consciousness that was newly created while the first was destroyed.

>>>What Process X does is certainly not simple, so it's very hard to
>>>avoid concluding that Process X itself is not simple.

>>What do you mean by "simple"?

>Simple means easy for intelligence to understand. I'm too simple to know
>what else it could mean.

>>It may be simple for whatever mechanism produces it.

>Now who's attributing meaning to inanimate processes?

I don't think I am. What I meant is that consciousness might not be as complex as we think. OTOH, it might require a structure which is not only very complex, but has properties (e.g. consciousness) that cannot be modeled separately from the structure. That is, you might need to simulate the entire structure to get consciousness, instead of being able to abstract the conscious being from the structure that produces it.

>>It is pretty simple for water to be wet; it's just an emergent
>>property of those atoms and bonds.

>That phrase, "emergent property", has never been one of my favorites; it's
>a vague catch-all idea that, near as I can tell, just means complex stuff
>happening without the help of intelligence. We need to understand how it
>works or we might as well call it "magic property".

Yes, but it may be impossible to understand something merely by reference to its parts. That is, there may be laws of the universe that do not act on simple structures. I don't have any evidence for this, though; it's just whistling in the dark.

>>it may be that we can only produce consciousness by simulating
>>whatever it is that produces consciousness in the brain,

>Could very well be true.

Well, that was half of my argument.

>>if that something is not really information processing.

>If it's not information processing then it's the soul, and it's far too
>late in the game to give up and abandon reason, especially when things are
>progressing so well.

I don't necessarily think that those are the only choices. Even if it can be shown that mere information processing is not enough for consciousness (as Penrose has tried and failed to do, so far), we need not institute a "soul of the gaps", so to speak.

>>I think that we may be able to decipher exactly how consciousness
>>is produced.

>But how could you ever know if your deciphering is correct?

I don't know, but if I did, I'd be working on it *right now*. :)

>>Once we learn how to record and play back memory, studies of
>>consciousness will have an experimental basis, no?

>No, it could be the experimental basis for the study of intelligence but
>not for consciousness. You could examine the position and velocity of
>every atom in my brain and know better than I do myself what I'm going to
>do next, but the only way you could know for certain what my subjective
>experience is would be for you to put your brain into the exact same state
>as my brain is in. The trouble is even that wouldn't work and "you" still
>wouldn't know, because you wouldn't be you anymore, you'd be me, and that
>would be a pointless experiment because I already know what it's like to
>be me.

Yes, you mentioned this in "Zed", but I don't agree that the entire brain is required. Only specific memories are required: those of you thinking about being conscious.

>>Playing back selected memories would be a better approach, I'd

>You might be able to make a machine that would let you know what it's like
>to be you pretending to be me, but all the memories in the world won't let
>you know what it's like for me being me.

Well, if I could experience a memory of yours, then *that* would be me experiencing being you, for the length of the memory, no?

>>Since I can select a certain memory to think about, a hypothetical
>>machine that understands how the brain works would be able to do it
>>as well.

>But that's exactly the problem, there is absolutely no way to know for
>certain if your hypothetical machine really does understand how the brain
>works.

It need not understand consciousness, only memory. If I can reliably record and play back memories, then I can play back one of *your* memories in *my* brain, which will let me know if *you* thought about being conscious during the time of the memory.

>>If *I* act like you, am I you?


If someone with plastic surgery and a really good imitation of you came along and politely asked you to commit suicide so that he could step into your life, would you? This is the acid test.

>>Suppose that I've spent the last 20 years studying you in every
>>waking moment. I should then be able to act like you in every
>>conceivable circumstance,

>I've heard something like that suggested as a method for uploading, you
>carry around a passive device that would observe you and try to predict
>your every move, when it made a mistake it would change its programming
>and try again, gradually it would get better and better until it was you.
>This would work but I'm not sure it's practical or how long it would take.
I disagree that it would work. I could program a present-day computer to give a specific answer to a specific question: specifically, the answer I would give. Would this computer then be me, for the duration of the asking and answering? It seems clear that it could not be. But I see no difference between this and the 20- or 40-year version, except in the length of programming time.
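
The thought experiment above can be sketched as a lookup table that gives exactly one prepared answer (a purely hypothetical illustration; the question, answer, and function names are invented for this sketch):

```python
# A canned imitation: it returns precisely the answer I would give to one
# specific question, and nothing more. The entries are hypothetical.
canned_answers = {
    "What is your name?": "Randall R Randall",
}

def imitate(question):
    """Return the prepared answer, or admit the table has no entry."""
    return canned_answers.get(question, "(no prepared answer)")

print(imitate("What is your name?"))  # the one answer it can give
print(imitate("How do you feel?"))    # anything else exposes the imitation
```

A 20- or 40-year study would only enlarge the table; the mechanism stays the same, which is the point of the comparison.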

>>yet I am *still* not you. :)

>Why not? You're certainly not you anymore so you must be me.

Why can't I still be me, if I am only acting? Are you suggesting that an actor on television is *really* the role, and no longer the actor?


ICQ: 3043097 | E-Gold Acct: 100678
"On a visible but distant shore a new image of man, / The shape of his own future, now in his own hands." | Johnny Clegg
