--> Miriam English
> >If it's a separate backup, then it isn't you -- it's a copy of you. Back
> >to suicide again.
> >
> > > I recommend consulting the archives (if they be working); this is a
> > > recurring thread.
> >
> >Such as: http://www.lucifer.com/exi-lists/extropians.4Q00/3644.html
>
> Thanks for the link to some of the relevant discussion in the archives. I
> appreciate it. Wow! There sure is a lot of it. I still haven't read all
> of it (already used up a lot of my day with it :)
>
> An odd thing though... all the stuff I read kept making what seemed to me
> a surprisingly simple mistake. You mention it again above: the idea that
> the backup is a copy and is somehow distinct from a continuous "genuine
> you". The mistake is in thinking that we are continuous. We aren't. We
> exist in short, daily bursts. We lose consciousness each night -- more
> often if we happen to drink to excess or have surgical operations or
> receive traumatic blows to the head. There is no continuous me -- just
> the illusion of it maintained by my memories of what I did and felt on
> the preceding day.
There is a difference: continuity is provided by association with the same
body that you went to sleep in. Given the ephemeral nature of intelligence,
you have to place the burden of continuity on the corpus.
> If my biological self died in her sleep one night and next day my
> downloaded self awoke, then in what way have I not continued? I would
> still remember what went before in exactly the same way I do each
> morning of my biological life.
But it wouldn't be you -- it would be a copy of you; a separate person
associated with a different corpus who just happens to be very similar to
the (now extinct) you. [Corpus in this case can just as easily refer to the
specific memory hardware that an uploaded self is running on -- don't get me
started on how to kill an uploaded intelligence through use of the mv
command :)].
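To make the corpus point concrete, here's a minimal Python sketch (the
mind-state filename is hypothetical, and POSIX inode semantics are
assumed): an mv within one filesystem is a rename that keeps the same
inode -- the same physical corpus -- while a copy-and-delete produces a
new inode, i.e. a copy, with the original gone.

    import os
    import shutil
    import tempfile

    # A hypothetical mind-state file standing in for the "corpus".
    workdir = tempfile.mkdtemp()
    corpus = os.path.join(workdir, "mind.img")
    with open(corpus, "wb") as f:
        f.write(b"memories, habits, the lot")

    original_inode = os.stat(corpus).st_ino  # identity of the storage

    # mv within one filesystem is a rename: same inode, same corpus.
    renamed = os.path.join(workdir, "mind-renamed.img")
    os.rename(corpus, renamed)
    assert os.stat(renamed).st_ino == original_inode  # still "you"

    # cp then rm (or mv across filesystems) allocates a new inode:
    # a copy survives, but the original corpus is destroyed.
    copied = os.path.join(workdir, "mind-copied.img")
    shutil.copy(renamed, copied)
    os.remove(renamed)
    assert os.stat(copied).st_ino != original_inode  # a different corpus

By the corpus criterion, an mv across filesystem boundaries is exactly the
radical cut-and-paste operation discussed here: a copy plus a deletion.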
I find belief in the identity of two copies to be a risky hypothesis. It's
like a belief in an afterlife -- there's no verifiable way to prove that
the essential you will continue if you perform some sort of radical
cut-and-paste operation like the one described above. The only way to see
is to try it -- which will either kill you or not. I'd much rather stick
with continuity through association with a given chunk of (slowly changing)
matter -- at least I know I'm not doing anything there that will wind up
causing me to cease to exist.
--> J. R. Molloy
>> Inloading (and similar gently-gently techniques) means that the patterns
>> of data that are you never actually stop being physically identified
>> with one distinct set of matter -- you're just extending that set. You
>> will change, but you won't cease to exist.
>
>Yes, that's the idea, and it closely parallels the process of ordinary,
>normal human cognitive development, except that it extends that expansion
>to include augmentation with machine intelligence, which is an entirely
>new semiotic system. The first research group that successfully
>demonstrates such a learning technique can initiate the heralded
>evolutionary phase transition. Brain/machine interface still presents the
>greatest problem.
So what do we all think about management of the expansion process? What
limits are there to throwing a bunch of extra material at a grown human
brain and seeing what results?
Reason
http://www.exratio.com/