If you didn't notice, I really didn't come to any conclusions on anything here (in fact, I sort of 'proved' myself wrong). This was just a sort of linear analysis of whatever I could think of on the subject, which actually changes throughout the document.
Bryan Moss wrote:
>
> David Bradley wrote:
>
> > I will share my thoughts, doubts, and ideas on
> > the subject for any who wish to respond. Any
> > previous discussions I could search for in the
> > archives would also be helpful (i.e. specific
> > thread topics).
>
> If you search through the archives for threads
> with cryptic names like "Uploading" and
> "Consciousness" you'll find that this conversation
> has been had many times before.
I should have stated that better. I was mainly just asking whether anyone had a very memorable topic (something like "The Life of B. Norman") that specifically detailed a lot of this.
>
It is to me, since if there is a distinction then
my 'original consciousness' is dead. Now, I like the
concept of my body living on after me; it could do
a lot of good in the universe. But if, as I said, it
has a different consciousness than mine, then it
defeats a large part of the point of uploading in
the first place: to 'live' forever as me, not just
as my body with a different consciousness.
> > All of these assume that:
> >
> > 1) consciousness exists
> >
> > 2) the brain is the carrier/supplier of
> > consciousness
>
> I think it might be easier to think of both of
> these as simply, "consciousness is a
> function/product of the brain."
>
> > 3) the brain does so through chemical and/or
> > electrical means
>
> There is zero evidence to think otherwise.
>
> > 4) we have the ability to somehow manipulate
> > this process to our benefit.
>
> Always.
>
> > If two bodies share the same original
> > consciousness, they are for the purposes of this
> > discussion, the 'same.' For example: My
> > original consciousness is gone when my total
> > consciousness was uploaded into a new
> > body. That new thing is not the 'same' as me,
> > since "I" am now dead, even though it has my
> > personality, memories, etc. "I" am no longer in
> > control of it, or aware of its presence.
> > However, if "I" can control it, and am aware of
> > it, it is still me.
>
> The distinction between "original" and "copy" is
> not a useful one.
'I' could die, but my body would live on.
I'm not saying that it's 'certain,' or even 'probable,' but consciousness seems to be a fickle thing. If my consciousness *were* dependent on the charges and such in my body as it is now, then it is certainly 'possible' that any other consciousness that could arise from my body would not be the 'original.'
'I' could just die as normal when uploaded. Another consciousness would inhabit my body, and since it had my memories, would not know that 'I' (this consciousness) had ever existed.
Granted, logically it does seem that any later consciousness should be the same in all ways, including 'being aware that it was the first consciousness' (I suppose that is the best way to say it). No matter what, the 'new' consciousness would certainly 'think' that it was a continuation of the first (there would be no reason for it to think differently), but if the first was lost, then it would be a very useful distinction indeed.
>
> > 1) If the cells are all taken away and the new
> > system is implanted, that the original
> > consciousness *is* lost. There should be no
> > more reason for the original consciousness to
> > exist than there would be for a person 'a' to
> > still be living in an apartment after 'a' has
> > been replaced by person 'b'. The apartment may
> > still be a home (or the body may still have a
> > consciousness,) but there is a new 'tenant.'
>
> I think Daniel Fabulich did a good job of
> disputing this with his CD Player analogy. This is
> why I said, "consciousness is a function/product
> of the brain." It's best to think of it as music,
> rather than an instrument.
I, obviously, disagree. (Attacking a metaphor is rather pointless in any light, but here it goes anyway.) The sound that the speakers make is (for simplicity's sake) the same, and the pattern of electricity inside brought about by the specific CD is the same, but the thing causing these actions (the CD) is different.
To take out of metaphor...
I hope that you at least understand what I mean. Read these words over again and try to see exactly why I worded certain things the way I did.
>
> > 2) If the cells are replaced one by one (or
> > slice by slice) the o.c. *may be* lost. Either
> > the individual must sit there and slowly die, or
> > the individual will sit there and notice nothing
> > (perhaps some odd, temporary effects, but
> > nothing serious).
>
> If cell replacement was performed gradually it's
> possible the patient would feel nothing. Not
> because the consciousness has died or is being
> destroyed, but because it is maintained. As I
> said, the distinction between "original" and
> "copy" is not a useful one.
Alright, this part is mutually understood. (Though I could still consider this the 'original,' in the sense that if consciousness were, say, the electric current (unlikely, but possible), then it has just been 'transferred,' not 'replaced' (which would imply that the original is gone).)
Remember this topic, it becomes important later.
>
The original assumption was that it *was* literally a
part of the cells themselves.
> > If the consciousness arises because of the
> > actions of the brain's cells, and is then
> > contained within, there is no reason why it
> > should live in a computer. A computer is, after
> > all, just a processor of information, at least
> > until some drastic design change is made.
>
> You say, "the consciousness arises because of the
> actions of the brain cells." With that in mind, a
> computer that simulates only these "actions" (and
> not even the cells themselves) would produce
> consciousness.
>
> A common statement made on this list is that we
> cannot know if other people are conscious. This is
> then used to assume that if it walks, talks, and
> sounds like you - it is you. Our only option is to
> assume that consciousness *is* simply produced by
> our brains. There is absolutely no data to prove
> otherwise, and it's disputable that there ever
> will be.
>
I won't disagree here, but I have to say that the universe doesn't have to operate only on what humans, with whatever happens to be their current knowledge level, assume. Relativity doesn't work for everything.
Again, this isn't a large complaint, just a follow-up to my points about whether what I call the 'original consciousness' should be examined.
Sometimes one might benefit from assuming without reason, and later find reason to fill in the gap (not built on any points found as a result of things *after* the gap, of course). So if it isn't threatening something greater (which this case wouldn't be), we shouldn't limit ourselves to examining only that which we can assume.
> > [...] Perhaps we could change around some x and
> > y chromosomes and create an opposite gender
> > version of oneself... I don't know what kind of
> > laws written or unwritten will be around for
> > this type of thing, but you know someone is
> > going to try it. The o.c. controlling both a
> > female and male version of an individual would
> > be.. odd, to say the least. How would any
> > offspring be set up? I would assume there would
> > be the problems that inbreeding would cause, or
> > would it 'transcend' even that?
>
> I'm not sure but I get the feeling that you
> believe there will be some sort of paranormal
> "connection" between the two cloned minds. There
> is no reason to think this.
Here is also where I might not have been totally clear in the original post. It isn't "paranormal." It is logical.
It is logical to assume that consciousness would pass into replacement nanotech machines that resembled the original cells closely enough. By the same reasoning, consciousness travels into other such additions to the brain.
Now, as I said in the original post, suppose that instead of one grouping of nanotech machines replacing the entire brain, two groupings were introduced, each replacing only half (and connecting to each other at the end). We still have a whole, functioning brain, just made up of two groupings of nanotech devices. Next, the two halves are separated spatially but remain connected through wires (which transmit the normal brain signals from one half to the other, and vice versa, as would normally happen). One consciousness is still controlling both halves, correct?
A crude diagram of the two separated halves:

E-----------------------------------------------3

(the wire still allows normal sending of messages, just over a larger distance)
Then each side has the machines it is missing *copied* (in whatever way is required) from the other side and added to it, so the left (or top) becomes the left plus an exact copy of the right (or the top plus a copy of the bottom), while the right becomes the right plus an exact copy of the left (or the bottom plus a copy of the top). Would the consciousness not expand to become a part of these portions too? (Say, for technical sake, that as layer after layer of cells was added, the wire was physically altered (parts removed and reconnected, or such) so that it still allowed the same 'normal' connection, i.e. each three-quarters is connected to the opposing quarter (assuming it could be placed correctly inside these portions without disturbance).) Also, instead of the wire, the signals could be transmitted via waves of some kind (I'm not sure of the plausibility of this), or even just calculated accurately in a computer using personality traits and the like.
Either way, we now have two brains:

   \ copy of /    \ original /
   |  bottom |    |  bottom  |
    --------       ---------

They are no longer connected, but if you followed my description of the process (consciousness filling in for both still-connected sides as each was appended until it became an individual brain), you should see how I see a possibility that the same consciousness could control both.
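For clarity, here is a minimal sketch of the bookkeeping in that procedure (a toy Python illustration of my own, not anyone's actual proposal; it only tracks which units are originals and which are copies, and says nothing about consciousness itself):

    # Illustrative only: models the split-and-duplicate bookkeeping, not consciousness.
    def split_and_duplicate(brain_units):
        # Step 1: partition the replaced brain into two still-connected groupings.
        half = len(brain_units) // 2
        top, bottom = brain_units[:half], brain_units[half:]

        # Step 2: append to each grouping an exact copy of the other,
        # so each becomes a structurally complete brain on its own.
        brain_a = top + [u + "-copy" for u in bottom]   # original top + copy of bottom
        brain_b = [u + "-copy" for u in top] + bottom   # copy of top + original bottom

        # Step 3: sever the connection; two complete brains remain.
        return brain_a, brain_b

    a, b = split_and_duplicate(["u1", "u2", "u3", "u4"])
    print(a)  # ['u1', 'u2', 'u3-copy', 'u4-copy']
    print(b)  # ['u1-copy', 'u2-copy', 'u3', 'u4']

The question I'm raising is, of course, about the gap between steps 2 and 3: whether one consciousness spans both groupings while they are still connected and being appended.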
Any questions, still? I'd seriously like to hear your
thoughts on this.
>
> Inbreeding causes problems because of hereditary
> disease, so if we were uploading with advanced
> nanotechnology it wouldn't be too much of a
> stretch to correct any problems. You could always
> give the two people completely separate genomes
> but correctly engineered brains.
>
Alright, that makes sense.
> > That's a topic for thought. Either way,
> > something tells me that experiencing the act
> > from both sides at once would be ostracized
> > socially, as masturbation is now, though.
> > Besides that, it's also adding the taboo of
> > incest.
>
> To experience it from both sides, both minds would
> need to share the thoughts. If that were possible
See above.
> (perhaps through memory transplants) then it would
Taking your previous statement that we are forced (at least logically) to assume that consciousness is formed by the brain, then technically, anything with my brain's makeup could become conscious? I assume that would also mean that a variety of other non-living things (depending on what *exactly* makes consciousness) are conscious, but just cannot act on desires physically.
> make for a truly interesting experience. But I
> don't think this would require a mind cloned
> transgender version of your self.
>
> > Off of that subject though, is a serious topic
> > of debate. If genetic rearrangement, ala the
> > previous example, is possible, would it be
> > permissible to augment one's appearance in an
> > upload? Most people here would probably say
> > yes. Some would probably say that is required,
> > at least if one wanted a certain level of
> > advancement (Two legs? You must be kidding!)
>
> You can augment yourself without uploading, so I
> don't see a problem here. Saying that the mind is
These were actually just a segue into the uses, mainly bad, of the single consciousness/multiple brain thing which apparently I didn't explain well enough.
>
> > Consider also, though, that if this type of
> > multiple consciousness is possible, then someone
> > could also have multiple personalities.
>
> Multiple personalities (or more rightly
> Dissociative Identity Disorder) is a mental
> illness. It would no doubt be a mean feat to
> upload one of those "personalities" (the reason we
> now call it an "identity disorder" is simply
> because there are not actually multiple
> personalities). Likewise it would be *impossible*
> to get two people into one brain without extra
> hardware (as you said yourself, they are defined
> by the hardware).
I didn't say that to put forth my belief; I used it only as a starting point for this analysis. It seems (to me) that the only real opposition to multiple consciousnesses in one brain is the physical question of electrical charge in the neurons and such. The fact that people don't yet have true "multiple personalities" isn't proof that it will never happen.
>
> > The older an individual is, the less any memory
> > changes what they have believed since age 10 or
> > so. I.e. between birth and age 10, every day
> > could possibly trigger a totally new way of
> > thinking of the world, but after age 15 or so,
> > almost nothing will make you believe that a
> > certain race really *isn't* inferior to you, or
> > that, if you think about it, there's really
> > nothing *wrong* with wanting to live forever.
>
> I don't know the figures behind this, but from
> personal experience all my life-shattering
> realisations have happened from 16 onwards.
>
I suppose that this changes from person to person. My intended point was just that it would be a more drastic change to the personality for someone with a 'networked' consciousness to have a body with no memories and receive every memory from scratch, than for someone with a lot of memories who is just slowly splitting from the point in time where the memories would be identical (i.e. the point just before multiplicity occurred).
> > However, there is the possibility. Many people
> > here probably had such an experience with
> > Newtonian physics when relativity was
> > introduced.
>
> You don't literally mean the date when relativity
> was first introduced do you?
Heh, yes, once again, I should not be writing such 'presentation-important' things on a tired mind. I doubt many here were around (and aware of the mechanics of physics) when relativity was first introduced to the world. :)
>
> > It probably caused you to literally think
> > totally differently about your world and how you
> > perceive it. So, if one of my bodies was on a
> > planet around Proxima Centauri, and the other on
> > good old earth, and the one on P.C. talked to a
> > real life extraterrestrial, wouldn't that cause
> > a severe clash in my mind's rendering of my
> > bodies?
>
> Brothers often move apart, family shattering maybe
> but hardly great science. I still get the awful
> feeling you believe there might be some psychic
> connection between the cloned minds.
>
Call it 'psychic,' but I just did not convey properly that it is a logical product of consciousness 'spilling' into receivable areas.
> > So, if these nanomachine cells can emulate the
> > original cells, then shouldn't that mean that
> > consciousness isn't technically in the
> > chemicals? Yes, this is pure speculation here,
> > but so is the second of our assumptions, that
> > the cells are the cause/holder of consciousness,
> > actually all four are.
>
> More precisely you originally said consciousness
> is cause by the "actions" of the cells, not
> necessarily the cells structure. We've already had
This was me just making points in logical steps. Whether I worded it properly or not, I started with the assumption that the cells themselves created and maintained consciousness (and, upon rereading, I do not expect you to have known that was what I meant). From there I attempted to prove as best I could that this original assumption was false, which I eventually did (whether validly or not is why I sent it to the list, for others' opinions).
> success in recreating certain brain functions with
> artificial replacements for synapses, etc. Since
> there is no reason to suspect consciousness comes
> from anywhere but the brain, it stands to reason
> that a simulated brain would have all the
> functions of consciousness. The fact that we
> cannot yet prove this is more food for
> philosophers than lost sleep for scientists.
Indeed so.
>
> BM
Hope to hear a reply to my clarifications,

Dave Bradley