The Meaning of Life

John K Clark (johnkc@well.com)
Fri, 14 Feb 1997 09:27:39 -0800 (PST)



On Thu, 13 Feb 1997 Hal Finney <hal@rain.org> wrote:

>>Me:
>>I ask the computer program: "Suppose you decided to search for the
>>smallest even number greater than 4 that is not the sum of two
>>primes (ignoring 1 and 2) and then stop. Would you ever stop?"

>Hal:
>It says, "You bet I would. I'd get bored after only a few numbers.
>I've got better things to do with my time!"

That was just one small example. The point is that there is no general
method for deciding whether ANY given computer program will ever stop.
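
To make this concrete, the program I asked about is just a loop like the
following (a rough Python sketch):

    def is_prime(n):
        # Trial division: slow, but correct, and good enough for a sketch.
        if n < 2:
            return False
        d = 2
        while d * d <= n:
            if n % d == 0:
                return False
            d += 1
        return True

    def find_counterexample():
        # Look for the smallest even number greater than 4 that is NOT
        # the sum of two odd primes, and stop as soon as one is found.
        n = 6
        while True:
            has_pair = False
            for p in range(3, n // 2 + 1):
                if is_prime(p) and is_prime(n - p):
                    has_pair = True
                    break
            if not has_pair:
                return n    # a counterexample: the program stops here
            n += 2          # no counterexample yet, keep searching

This loop halts if and only if such a number exists, and nobody knows
whether it does. Turing showed that no general procedure can settle
questions like that for an arbitrary program.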

>>Me:
>>The electronic computer is much faster than my old steam powered
>>biological brain, so it figures out in 10 seconds what I'm going to
>>do in 10 minutes, but if the computer then tells me of its
>>prediction about me, and my personality is such that out of pure
>>meanness I always do the opposite of what somebody tells me to do,
>>then the prediction is wrong.

>Hal:
>No, the prediction is right.

How is it right? The box did not predict that I would look at its
prediction, but I did, and every prediction it made about me after that
point was wrong.

>The prediction was of what you would do given situation X. When you
>do something different, you are not in situation X, but situation Y,
>where Y = X plus the knowledge of what the prediction was.

Exactly. My situation X includes the magic box's prediction of what I will
do, so for the box to predict what I will do it must also predict what it
itself will do. We would need a super magic box for that.

>You can make a trivial computer program which reads a number and
>outputs the next number. Now we will try to predict what it will
>type next. We input our prediction and see what it types. Our
>prediction is always wrong! No matter what we input, it outputs
>something different. This is not an example of free will by any
>reasonable criterion.
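
I take it Hal has in mind something no more complicated than this
(a Python sketch):

    # A toy "prediction defeater": read a number, print the next one.
    # Whatever number you type in as your prediction of its output,
    # what it actually prints is different.
    while True:
        prediction = int(input("Predict my next output: "))
        print(prediction + 1)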

I think that we could both agree that a person is a being and a rock is not,
but the dividing line between those two extremes is rather blurry and
arbitrary. I never claimed to have a good definition of "a being".
My intuition tells me that such a simple program is not a being, but if you
insist it is then I would have to say it has free will.

>If you had a box which had a simulation of your mind which ran
>faster than your own, then you could always find out what you were
>going to do before you did it.

No, you could not. If you look at the box's prediction then you and the box
become part of the same system, and in general no system can predict what
it will do next. For example, just do the opposite of whatever the box
predicts.
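
The same diagonal trick can be written down directly. Here is a rough
sketch (Python; predict() is made up and stands in for the magic box, any
function you like that returns "press" or "don't press"):

    def contrarian(predict):
        # Ask the box what I am about to do, then do the opposite.
        forecast = predict(contrarian)
        if forecast == "press":
            return "don't press"    # it said press, so I won't
        else:
            return "press"          # it said I won't, so I will

No matter how predict() works, any answer it gives is handed back to the
very agent it is trying to predict and then negated, so it is wrong; and
if it tries instead to fully simulate contrarian(), the simulation contains
a call to predict() itself, which is the super magic box regress again.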

The bottom line is that regardless of how fast this box can emulate a brain,
it would be completely unable to tell you accurately what you will be doing
in 10 minutes. It could, however, accurately tell me what you will decide to
do next, and it could accurately tell you what I will decide to do next.
According to my definition, we would both have free will.

>Would the existence of such a box make you doubt your own free will?

No.

>Wouldn't it be strange to always have this box around which knew
>exactly what you were going to do ahead of time?

The correct predictions the box made would not seem strange to me because I
didn't see them. The predictions about me that I did see would not seem
strange because they were wrong. I still feel free.

John K Clark johnkc@well.com
