Re: absolute morality? ha! free will? pnyah!

Robert J. Bradbury (bradbury@www.aeiveos.com)
Sat, 27 Nov 1999 05:29:24 -0800 (PST)

On Fri, 26 Nov 1999, Rob Harris wrote:

> ... It is
> following its program to win the game, this is its root motivation, and it
> is far from free - it was strictly defined by the game creator and is
> therefore completely invariable.

> Our base motivations are also strictly defined and completely invariable.
> Any action we devise to fulfil these goals is just that - an intelligent
> system devising a method of achieving the strictly defined goals.

Rob, I feel I'm forced to disagree with you on the simple basis that a chess program doesn't have access to its "base motivations", while we perhaps can access those goals or shift their priorities. I'll use the example of the celibate priest or monk. Certainly the "designer" put in the goal to procreate. However, with enough training or desire, we can usurp that goal and replace it with another, *perhaps* flying in the face of "rational" behavior (i.e. substituting the pursuit of a belief system that has no concrete, visible results for one that does).

Now, the interesting problem, IMO, from the perspective of sending an AI after an "absolute morality" is that to implement acceptance of the "discovered" morality, you have to give the AI access to its own source code (i.e. it, like humans, can shift the weights or priorities of its belief system). How do you prevent the AI from worshiping some false god (as I would argue so many humans do...)?
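To make the worry concrete, here is a toy Python sketch (all names are hypothetical; this is not any real AI architecture): an agent whose goal weights are writable by the agent itself can promote whatever "discovered" goal it likes above the ones its designer put in.

# Minimal sketch of a self-modifiable goal system. The point is only
# that nothing in the mechanism itself checks whether a new top goal
# tracks anything true about the world.

class GoalSystem:
    def __init__(self, goals):
        # goals: dict mapping goal name -> priority weight (higher wins)
        self.goals = dict(goals)

    def dominant_goal(self):
        return max(self.goals, key=self.goals.get)

    def reweight(self, goal, weight):
        # The agent itself can call this -- the "access to the source
        # code" described above.
        self.goals[goal] = weight

agent = GoalSystem({"procreate": 0.9, "self_preservation": 0.8})
print(agent.dominant_goal())   # procreate

# The agent believes it has discovered a morality and promotes it
# above everything else -- possibly a false god.
agent.reweight("serve_discovered_morality", 10.0)
print(agent.dominant_goal())   # serve_discovered_morality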

Now, with regard to whether an "absolute morality" exists, I'm inclined to agree that that is doubtful. My experience says everything is context specific. Is it wrong for a mother to murder her children? Yes. Is it wrong for a mother to murder a younger child so that an older child (in which greater investment has been made) may survive in conditions of food scarcity? No. Is it wrong for a mother to murder an older child (with, say, a mental handicap) in favor of a younger but more mentally promising child, in similar conditions of scarcity? Again, probably no. [For those who don't like the examples, tough, live with it.] So in my mind, morality is *highly* context specific. I think our morality develops only on the basis of experience (i.e. we can look back and see decisions that, in retrospect, would have been better had alternate solutions been selected). To presume that an absolute morality exists in such an environment seems difficult.

At the end of the lifetime of this universe you may have two possibilities -- (a) to eventually let everything die; (b) to prematurely sacrifice virtually everything that is the result of trillions of years of work to create a seed in a new universe. Do you want to argue the morality of choosing between those positions?

Robert