Re: Free Will

Hal Finney
Wed, 25 Feb 1998 10:43:46 -0800

John Clark writes:
> A man, animal or machine has free will if it can not always predict what it
> will do in the future even if the external environment is constant. A third
> party might be able to make such predictions but that's irrelevant, the
> important thing is that the person himself can not know what he will do next
> until he actually does it.

I think this definition is a good step towards understanding free will.
In particular, it helps address the question of whether the fact that a
system's behavior is predictable from outside, or simply deterministic,
means that it does not have free will.

However, by itself the definition is too simple. There has to be a
specification of some minimum predictive ability; otherwise the definition
is satisfied vacuously by systems simply because they cannot predict at
all. Cars, inclined planes, and knives all appear to have free will by
this definition, because they cannot predict what they will do in the
future.
One possibility would be simply to add that the system must be conscious
before we begin to address whether it has free will. This would still
leave the definition useful for the cases we are interested in, without
this distracting loophole.
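The loophole, and the consciousness requirement proposed above as a patch,
can be sketched as predicates. (This is purely my illustration; the
function and variable names are hypothetical and not from the post.)

```python
def has_free_will(can_self_predict: bool) -> bool:
    # Clark's criterion: a system has free will iff it cannot
    # predict its own future behavior.
    return not can_self_predict

def has_free_will_patched(is_conscious: bool, can_self_predict: bool) -> bool:
    # Patched criterion: consciousness is required before the
    # self-prediction test is even applied.
    return is_conscious and not can_self_predict

# A knife has no predictive ability at all, so it satisfies
# Clark's criterion vacuously -- the loophole.
print(has_free_will(can_self_predict=False))            # True

# The patched criterion closes the loophole for the knife...
print(has_free_will_patched(is_conscious=False,
                            can_self_predict=False))    # False

# ...while still giving the intended answer for a conscious
# system that cannot self-predict.
print(has_free_will_patched(is_conscious=True,
                            can_self_predict=False))    # True
```

The point of the sketch is only that the bare criterion is a negated
predicate, so anything incapable of prediction satisfies it trivially;
conjoining a consciousness requirement restricts the domain instead.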