Re: Consciousness/Uploading

Bryan Moss (bryan.moss@dial.pipex.com)
Tue, 7 Jul 1998 17:13:11 +0100

David Bradley wrote:

> Apparently you misunderstood a few things that I
> said. Frankly, I don't blame you, a lot of this
> was written relatively late at night.

I'll do my best to understand this time around, although I'm not promising anything!

> > The distinction between "original" and "copy"
> > is not a useful one.
>
> It is to me, since if there is a distinction
> then my 'original consciousness' is dead. Now,
> I like the concept of my body living on after
> me, it could do a lot of good in the universe,
> but if, as I said, it is a different
> consciousness than mine, then it sort of defeats
> a large part of uploading in the first place, to
> 'live' forever as me, not just as my body with a
> different consciousness.

But the problem does not create the distinction; the distinction creates the problem. You believe there is a distinction, so you upload and die. I believe there is no distinction, so I upload and live. Go figure!

> No matter what, the 'new' consciousness would
> certainly 'think' that it was a continuation of
> the first (there would be no reason for it to
> think differently), but if the first was lost,
> then it would be a very useful distinction,
> indeed.

You also touched on the idea that consciousness is lost when you sleep. I think there's some confusion between consciousness as a medical term (e.g. "he lost consciousness" and "he is unconscious") and consciousness in the context of philosophy (e.g. "I think therefore I am"). I would argue that consciousness does not exist in the sense that it can be lost. Like music, it can be played again, on different instruments at different times.

> I will admit right now that this is more
> paranoia than anything logical. I would hate to
> actually have a ways and means to live forever,
> and have it crushed by something totally
> uncontrollable like this. Thus would 'I' die
> unfulfilled. (Any other consciousness from my
> body could still become fulfilled, though, which
> would give it a little worth.)

People develop a fear of sleep for the very same reasons, and can we say it is the same person when they wake up? Observers will say it is the same person, science will say it is the same person, and even they will say they are the same person. What else do we have to go by?

> The actions of the body may be the same, the
> firings in the brain may happen the same, but
> the thing 'thinking' to make this occur is not
> necessarily the same.

There are no subjective differences (the person knows who they are), there are no objective differences (all the atoms, etc., are in the right place and on the right course), so all that is left is a mystical difference.

> I *don't* believe that I will die if I get
> uploaded (hence all the 'if's and 'possibly's),
> but I still consider the possibility, since I'd
> much rather have possible solutions for the
> minute chance that it be true.

I've considered it myself, but I no longer think it's a valid possibility. I hope you see why I think this.

> > You say, "the consciousness arises because of
> > the actions of the brain cells." With that in
> > mind, a computer that simulates only these
> > "actions" (and not even the cells themselves)
> > would produce consciousness.
>
> The original assumption was that it *was*
> literally a part of the cells themselves.

When someone talks of the cogs of a clock producing the time, it's natural to assume they mean the actions of the cogs rather than their "essence" (I'm assuming that's what you meant).

> > A common statement made on this list is that
> > we cannot know if other people are conscious.
> > This is then used to assume that if it walks,
> > talks, and sounds like you - it is you. Our
> > only option is to assume that consciousness
> > *is* simply produced by our brains. There is
> > absolutely no data to prove otherwise, and
> > it's disputable that there ever will be.
>
> I won't disagree here, but I have to say that
> the universe doesn't have to operate only on
> what humans, with whatever happens to be their
> current knowledge level, assume. Relativity
> doesn't work for everything.

Most people say, "we cannot prove or disprove that consciousness exists"; most do not realise this is actually longhand for saying, "consciousness does not exist." I feel like I exist, but I'm hardly likely to feel like I don't exist.

> Sometimes one might benefit from assuming
> without reason, and later find reason to fill in
> the gap (not built of any points found as a
> result of things *after* the gap, though, of
> course).

That may well be true, but in this case we do not even have much on which to base assumptions.

> So we still have a whole, functioning brain,
> just made up of two groupings of nanotech
> devices. Next, the two halves are separated
> spatially, but still connected through wires
> (which transmit the normal brain signals from
> one half to the other, and vice versa, as would
> normally happen.) One consciousness is still
> controlling both halves, correct?

It is still one brain, yes.

> Then, each brain side has the required machines
> *copied* (in any way required) from the other
> side, and added to its side, so the left (or
> top) now becomes the left and an exact copy of
> the right (or the top and copy of bottom) while
> the right now becomes a right and an exact copy
> of the left (or, again, the bottom and copy of
> top). Would the consciousness not expand to
> become a part of these portions too?

I'm not sure consciousness has the ability to expand and grow. If we take the now-exhausted instrument and music analogy, this whole exercise seems ridiculous. But I'm assuming you're now considering consciousness to be a *thing* that exists in the brain. I do not think there is sufficient evidence to reject consciousness as a product of the brain's functions.

> (Say for technical sake, that as layer after
> layer of cell was added, the wire was physically
> altered (parts removed and reconnected, or such)
> so that it would allow the same 'normal'
> connection. (i.e., 3/4s are connected to
> opposing 1/4 (assuming that it could be placed
> correctly inside these portions without
> disturbance.) Also, instead of the wire, the
> signals could be transmitted via waves of
> some kind (not sure of the plausibility of
> this), or even just calculated accurately in a
> computer using personality traits and the like.

This is slightly different from gradually replacing the brain cells with engineered replicas. It assumes the brain can function in parts, but reconnecting 1/4 of the brain to a brain that is itself only made up of 1/4 could produce massive problems (not working, for instance). But for argument's sake, let's say this is a plausible scenario.

> That are not connected, but, if you followed my
> description of the process (consciousness
> filling in for both still-connected sides, as
> they both were appended up until they were
> individual brains) you should see how I see a
> possibility that the same consciousness could
> control both.

I see no difference between this and "cloning" two brains. If you could do this procedure perfectly, it's unlikely that either person would feel any discomfort or separation.

> Any questions, still? I'd seriously like to
> hear your thoughts on this.

I'd like to know how you would actually define consciousness, especially the idea of control and its ability to expand in this scenario.

> Wait, if you will, please explain how memory
> transplants could do it (theoretically, at
> least.) How would you explain the idea that this
> would cause the sharing of thoughts. Speaking
> of which, I'm still not clear as to what you
> believe causes consciousness to form.

I simply used memory transplants as an example because the sharing of thoughts would be such a difficult procedure. If two people have sex and then have each other's memory of the act implanted, it is almost as good as sharing thoughts during the act. Of course, the actual ability to share thoughts would no doubt lead to greater pleasure, but it would be devilishly hard to achieve.

> Taking your previous statement that we are
> forced (at least logically) to assume that
> consciousness is formed by the brain, then
> technically, anything with my brain makeup could
> become conscious?

If we use the model of the brain as individual interacting agents (IMO the most compelling) rather than one of centralised control, then we can assume that we could destroy everything but your listening and music-appreciation functions and construct a conscious microphone.

> I assume that that would also mean that a
> variety of other non-living things (depending on
> what *exactly* makes consciousness) are
> conscious, but just cannot act on desires
> physically.

Yes, I would say any self-analysing data processor would conform to what I consider consciousness.

> I'd love to hear your comments on the type of
> multiple *personalities* (i.e. multiple non-
> exact brains (due to differing memories)
> controlled by one consciousness) as I mentioned,
> used in this purpose, though.

Certainly it would make for some strange relationships, but there would still be no connection between the brains (other than that they are very similar). Simply because two things are the same does not mean they are connected in any way.

> It seems (to me) that the only real opposition
> to multiple consciousness in one brain is the
> physical question of electrical charge in the
> neurons and such. Just that people don't yet
> have true "multiple personalities" isn't proof
> that it will never happen.

If consciousness is a product of the brain, then it stands to reason that you need one brain for every consciousness. The structure of the brain wouldn't allow for multitasking, but with a powerful computer you might be able to run two minds simultaneously. This isn't really very different from two individuals living separate lives in the real world. The real fun stuff happens with the sharing of a body, where one wants to turn left and the other right. Again, this is a lot harder than it sounds; the way the brain is "wired" to the body would create some interesting technical problems.

> My intended point was just that it would be a
> more drastic change to the personality for
> someone with a 'networked' consciousness to have
> a body with no memories and receive every memory
> originally, than for someone with a lot of them,
> who is just slowly splitting from the
> point in time where the memories would be
> identical (i.e. the point just before
> multiplicity occurred.)

If the person had no memory of when he or she received the memories, then they would not remember feeling any discomfort.

> Back to my situation of a man writing the
> calculations out, where exactly is the
> consciousness, in the simulation itself? The
> numbers? Only the results?

Our only logical source is the result.

BM