Re: Immortality

From: Jason Joel Thompson (jasonjthompson@home.com)
Date: Mon Dec 11 2000 - 02:30:19 MST


----- Original Message -----
From: "Emlyn" <emlyn@one.net.au>
To: <extropians@extropy.org>

> ... and as we add new stuff, we can slowly shut off old stuff. Eventually,
> the transition is complete. Like the axe handed down through generations,
> head replaced 3 times, handle replaced 7 times, the components have
> changed, but identity remains. I think we can fully upload, and retain conscious
> integrity.

Yes.

Now -here's- someone on my wavelength. (Thanks for picking up the ball on
these threads, Emlyn-- your excellent "which-of-the-copies-is-most-me"
response saved me a long email. :)

> Not by a copy mechanism, however; only by gradual modification
> (how gradual? I don't know).

Me either, but I expect we'll find out pretty rapidly once direct-brain
peripherals become available. I, for instance, am -very- interested in just
adding general neural substrate and seeing what happens. We already know
that the current brain architecture is remarkably plastic. I suspect that
we would take advantage of new hardware surprisingly quickly. I can already
imagine the rapid reinforcement of the neural pathways attaching us to our
new 1ns-access-time-short-term-memory.

(Would your brain quickly figure out that if it remembers something 'in a
particular way' it consistently gets better results?)

I really do think you could 'coax' the whole consciousness into a new
substrate by this means.
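
(If it helps to make the 'moving boundary' idea concrete, here's a purely
toy Python sketch -- obviously not a model of neurons or of consciousness,
and every name in it is invented for the sketch -- where the old and new
substrate share one workload and the split shifts a little at a time:)

# Toy analogy only -- not a model of the brain. The point is just that the
# boundary between "old" and "new" substrate can move in small steps while
# outward behavior stays continuous the whole way through.

def respond(stimulus, old_fraction, new_fraction):
    """One response, with the work split between old and new substrate."""
    return old_fraction * stimulus + new_fraction * stimulus

old_fraction, new_fraction = 1.0, 0.0
steps = 10                                   # "how gradual? I don't know"
for step in range(steps):
    old_fraction -= 1.0 / steps              # old tissue slowly shuts off
    new_fraction += 1.0 / steps              # new hardware slowly takes over
    # Behavior never jumps; there is no single moment of "copying".
    assert abs(respond(1.0, old_fraction, new_fraction) - 1.0) < 1e-9

print("migration complete; old fraction:", round(old_fraction, 6))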

> Unfortunately, this is not an information scenario, and means there is no
> possibility of backups. Of course, backups can be made, and run, and they
> will be very grateful for it, but the original's case is not helped at
> all.

I actually think the back-ups of the future will be tremendously useful--
even for those difficult 'consciousness sentimentalists' like you and me.
Here's why:

Yes, if the whole system suffers catastrophe, we've got a big problem. But
back-ups across an individual upload's distributed network of intelligence
will provide lots of protection from damage. I tend to hold that the
consciousness is an emergent awareness resulting from the ongoing operations
of a plethora of parallel processes (Minsky's society of mind?). It appears
redundant and robust enough to survive the critical loss of functionality
across a significant percentage of these processes. Given this state of
affairs, it will be very nice to have a timely copy lying around for
on-the-fly re-implantation of corrupted processes.
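
(A loose software analogy, nothing more -- the process names and repair
policy below are invented for illustration -- but it shows the flavor of
on-the-fly re-implantation of damaged parts from a recent copy while the
aggregate keeps running:)

import copy
import random

# Toy analogy only -- not a claim about real neural architecture. Eight
# "parallel processes" stand in for the society-of-mind picture; the
# snapshot stands in for the "timely copy" kept around for repair.
processes = {f"proc_{i}": {"state": float(i), "healthy": True} for i in range(8)}
snapshot = copy.deepcopy(processes)

def run_cycle(procs):
    """One cycle of operation in which a few processes may get corrupted."""
    for proc in procs.values():
        if random.random() < 0.1:            # occasional local damage
            proc["healthy"] = False

def repair(procs, snap):
    """Re-implant only the corrupted processes from the backup copy."""
    for name, proc in list(procs.items()):
        if not proc["healthy"]:
            procs[name] = copy.deepcopy(snap[name])

for cycle in range(100):
    run_cycle(processes)
    repair(processes, snapshot)
    if cycle % 25 == 0:
        snapshot = copy.deepcopy(processes)  # refresh the "timely copy"

# At no point does the whole system go down: individual parts fail and are
# restored while the aggregate keeps running.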

I'm kind of tired of the purposeless side of this debate (yes the copy is
you, no it's not, repeat) and I wouldn't mind more 'collaborative-type'
discussion to explore the possibilities. Here's something I find
interesting:

You make an exact copy of yourself. You send him off on his own for 5
years. He comes back, and the two of you hook your brains together.
What's -that- sort of subjective experience going to feel like? Will the
two consciousnesses be able to 'sense' each other, or will they be invisible
to each other? Will they, like drops of water on glass, merge upon
touching? Obviously this is highly speculative, but I am very
curious/excited about the prospect of consciousness sharing. (I would feel
safest doing so with my copy at first.) (Clearly the means by which the
connections are drawn is incredibly important in attempting to make any
determination of the nature of the subjective experience for the two parties
in question.)

--

::jason.joel.thompson:: ::founder::

www.wildghost.com


