The Prospect of immor(t)ality

John K Clark (johnkc@well.com)
Wed, 25 Sep 1996 09:51:02 -0700 (PDT)


-----BEGIN PGP SIGNED MESSAGE-----

On Tue, 24 Sep 1996, Mark Crosby <CrosbyM@po1.cpi.bls.gov> wrote:

>how will the Omega Point KNOW which of these infinite
>combinations of phase trajectories represents the actualized
>past and is, thus, a 'true' resurrection.

I don't understand why that's important. There might be a huge number of
"phony" John Clarks running around with vivid memories of things that never
happened, but I don't care as long as I'm there too.

>while my life experience IN GENERAL will certainly be
>emulated by the Omega Point, it is impossible for the Omega
>Point to emulate my life experiences IN PARTICULAR by any
>'brute-force' simulation.

As long as the Omega Point could perform an infinite number of computations
and not just an astronomically large number (two very different concepts),
I don't see why it couldn't emulate you and all your experiences. Even finite
brute force can accomplish a lot if you give it enough time; after all, it
produced life. If the brute-force process has access to a computer that can
perform an infinite number of computations, then coming up with you and me,
and every conceivable variation of you and me, would be trivially easy.

>I submit that most real-world processes do not evolve as
>Markov chains because state Xi+1 is NOT dependent ONLY on
>the state the system is in at state Xi. Random inputs from
>the environment during the transition to state Xi+1 can
>cause the next state to be quite discontinuous from state Xi.

Why isn't that a Markov chain? If no knowledge other than knowledge of the
system's present state will help you make a better prediction of what state
the system will be in next, then it's a Markov chain. The system's history,
how it got to its present state, is not important. Yes, you can have random
inputs, but that's just another way of saying that not every effect has a
cause, and Markov chains can certainly handle that. Except for the trivial
case where all probabilities are equal to one, Markov chains are not
deterministic but stochastic.
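The point that random inputs and Markov chains are perfectly compatible can be
sketched with a toy example (the two-state "weather" chain and its
probabilities are mine, just for illustration):

```python
import random

# A Markov chain is defined entirely by its present state and a
# transition rule; the "random inputs" live in the probabilities.
# Each row gives P(next state | current state) and sums to 1.
TRANSITIONS = {
    "sunny": [("sunny", 0.8), ("rainy", 0.2)],
    "rainy": [("sunny", 0.4), ("rainy", 0.6)],
}

def step(state):
    """Pick the next state stochastically -- no history is consulted."""
    r = random.random()
    cumulative = 0.0
    for nxt, p in TRANSITIONS[state]:
        cumulative += p
        if r < cumulative:
            return nxt
    return nxt  # guard against floating-point shortfall

state = "sunny"
for _ in range(10):
    # Only the current value of `state` is ever looked at.
    state = step(state)
```

The next state is genuinely random, yet the process is still Markov: the
distribution over successors depends on nothing but where the system is now.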

>We are not Markov chains because we have memory

If I have a memory, then the present physical state of my brain is different
than it would be if I did not have that memory, and it could be important in
determining the probability of what physical state it will be in next, in
other words what my next thought will be. If I once had a memory but have now
totally forgotten it, both consciously and unconsciously, then that
information will not help you figure out what I will do next; you can only
look at my present state and play the odds.
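The idea that a remembered past is just part of the present state can be made
concrete with a toy example (the random walker and its rules are a hypothetical
of mine, not anything from the original exchange):

```python
import random

# A walker that "remembers" its previous position looks non-Markov if
# the state is just the current position, but becomes an ordinary
# Markov chain once the memory is folded into the state itself.
def step(state):
    position, last_position = state          # the memory lives IN the state
    if position == last_position:            # "remembers" it stood still...
        move = random.choice([-1, 1])        # ...so it is forced to move
    else:
        move = random.choice([-1, 0, 1])
    return (position + move, position)

state = (0, 0)   # (current position, remembered previous position)
for _ in range(100):
    state = step(state)
# The next state depends only on `state`; the fully forgotten deeper
# past (where the walker was three steps ago) buys no extra
# predictive power, which is exactly the Markov property.
```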

>AND we can plan ahead.

We can invent theories about what the most probable future will be, but there
is no reason a Markov chain can't do that.

>Even if A SINGLE entity can be IN only a finite number of
>states, this does not mean that there are ONLY a finite
>number of TOTAL states that an entity can choose from.

If I can only be in N states, then whatever state I'm going to be in next
must be one of those N states. A 10 bit computer memory chip can be in 2^10
states; no matter what happens, when all is said and done that chip will
always be in one of those 2^10 states. I think using a word like "choose"
just muddies the waters.
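The chip arithmetic checks out directly (a minimal sketch; the XOR update is
an arbitrary stand-in for "whatever happens to the chip"):

```python
import random

BITS = 10
N_STATES = 2 ** BITS              # 1024 possible states, no more

# However the chip's contents are scrambled, every successor state is
# still drawn from the same fixed set of 1024 values.
state = 0
for _ in range(10_000):
    state ^= random.getrandbits(BITS)   # arbitrary random update
    assert 0 <= state < N_STATES        # always inside the finite set
```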

John K Clark johnkc@well.com

-----BEGIN PGP SIGNATURE-----
Version: 2.6.i

iQCzAgUBMkld9303wfSpid95AQGOiQTwtsKDQguIy+8svTlmvWen/ZRLcbPbGMpA
3DgufUkqZ3Bs5/Bo4Wi4n2GaGeC1pLUlCb6iax++3V18zVeVSRD4yuIvSJqLLUGZ
VXcFScISS7D11eQr7UpNg1D3HR3t8jwAmd5roUlL5eJiHmJJ6dumdtqFxuXUFUD/
ncvRBZh/s0+hSd9J+B50hB2cOuOL2pcjkWiwb5L0YFr5gEUW1BA=
=Ybgu
-----END PGP SIGNATURE-----