Re: Personal goal system was: IA vs. AI

John Clark (Wed, 11 Aug 1999 12:19:07 -0400)


Eliezer S. Yudkowsky <> wrote on August 10, 1999:

>If you define "free will" as actions that are not caused by any external
>force or lower level of physical reduction, then there's obviously no
>such thing.

Sure there is; it's called randomness.

>If you use the same definition of "reduce" as in "The Adapted Mind"
>(thank you, Paul Hughes!) and say that a higher level does not
>reduce to a lower one unless there's an identity of pattern, not
>just causation of pattern, then there might be free will.

If you increase the speed at which gas molecules move inside a toy balloon, the temperature will increase; if the temperature increases, the pressure will become greater; if the pressure is greater, the balloon will get larger. Size does not seem to reduce to pressure, pressure does not seem to be of the same pattern as temperature, and certainly temperature seems very different from speed, almost as different as a firing neuron is from a conscious thought. Question: did the toy balloon decide to get larger? Does a toy balloon have free will? Yes, I would maintain that it does, and for that reason I would also maintain that free will is not a useful concept.
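The causal chain in the balloon example can be sketched numerically. This is a minimal illustration, not anything from the original post: it assumes ideal-gas behavior (PV = nRT) and treats the inflated balloon as holding roughly constant pressure, so that faster molecules mean a higher temperature, and a higher temperature means a larger volume. The function names and the sample numbers (air at about 0.029 kg/mol, atmospheric pressure) are my own choices for illustration.

```python
# Sketch of the balloon's causal chain under ideal-gas assumptions:
# molecular speed -> temperature -> volume (at roughly constant pressure).

R = 8.314  # molar gas constant, J/(mol*K)

def temperature_from_speed(v_rms, molar_mass):
    """Mean molecular speed fixes temperature: T = M * v^2 / (3R)."""
    return molar_mass * v_rms**2 / (3 * R)

def volume_at_constant_pressure(n, T, P):
    """At fixed pressure, higher temperature means larger volume: V = nRT / P."""
    return n * R * T / P

# One mole of air (M ~ 0.029 kg/mol) at atmospheric pressure.
M, n, P = 0.029, 1.0, 101_325.0

v_slow, v_fast = 500.0, 550.0          # m/s
T_slow = temperature_from_speed(v_slow, M)
T_fast = temperature_from_speed(v_fast, M)
V_slow = volume_at_constant_pressure(n, T_slow, P)
V_fast = volume_at_constant_pressure(n, T_fast, P)

print(T_fast > T_slow)  # faster molecules -> higher temperature: True
print(V_fast > V_slow)  # higher temperature -> bigger balloon: True
```

Each step is a lawful consequence of the one below it, yet temperature, pressure, and volume each look like a different kind of quantity from molecular speed, which is the point of the example.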

>free will is a cognitive abstraction used to define the basic unit of
>attributed moral responsibility.

The idea of free will is not needed for morality, or for anything else. A person or animal is responsible for an action if and only if punishing that person or animal will reduce the occurrence of the action in the future. If I see my dog tearing up my couch, I will discipline him; if he's sick and I see him vomit on the floor, I will not.

John K Clark
