> rhanson@gmu.edu said:
> > And as far as I can tell, all interesting moral/ethical/ought
> > questions are equivalent to questions about what various creatures
> > want.
>
> I find that pretty surprising, Robin. I might be willing to grant
> that you and I have read enough game theory to carry the underlying
> chain of reasoning in our heads, enough to get from our model of
> thinking agents to simple rules like tit-for-tat or the golden rule.
> But I sure don't see how the desires of Joe or Jane on the street
> lead directly to reasonable ethical rules. Care to say more?
Game theory is useless unless you know the payoffs ahead of time.
Those payoffs, which are inputs to game theory and not outputs, are
nothing more than personal desires. Game theory can only tell you how
to achieve those desires, not what they should be.

Results like Axelrod's are very suggestive, and they shed light on
how things like the golden rule may have evolved; but don't forget
that Axelrod's robots had only one goal: score points. He discovered
that the best way to do that is often to play nice. That does nothing
to establish whether "playing nice" is or isn't a worthy goal in its
own right.
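To make that concrete, here is a toy sketch in Python of the kind of
game Axelrod ran. This is my own illustration, not his tournament
code; the payoff numbers are the standard textbook prisoner's-dilemma
values, and the function names are mine. Notice that the payoff table
is typed in by hand: the game can only tell you how to score well
against it, never whether those numbers were worth wanting.

    # Toy iterated prisoner's dilemma, Axelrod-style.
    # The payoff matrix is hand-picked input, not an output of the game.
    PAYOFFS = {  # (my_move, their_move) -> my score; C = cooperate, D = defect
        ("C", "C"): 3,  # mutual cooperation
        ("C", "D"): 0,  # sucker's payoff
        ("D", "C"): 5,  # temptation to defect
        ("D", "D"): 1,  # mutual defection
    }

    def tit_for_tat(opponent_history):
        """Cooperate first, then copy the opponent's last move."""
        return "C" if not opponent_history else opponent_history[-1]

    def always_defect(opponent_history):
        return "D"

    def play(strategy_a, strategy_b, rounds=200):
        """Total scores for each strategy over repeated play."""
        score_a = score_b = 0
        hist_a, hist_b = [], []  # each side sees the other's past moves
        for _ in range(rounds):
            move_a = strategy_a(hist_b)
            move_b = strategy_b(hist_a)
            score_a += PAYOFFS[(move_a, move_b)]
            score_b += PAYOFFS[(move_b, move_a)]
            hist_a.append(move_a)
            hist_b.append(move_b)
        return score_a, score_b

    print(play(tit_for_tat, tit_for_tat))    # (600, 600): nice pays
    print(play(tit_for_tat, always_defect))  # (199, 204): defection gains little

Tit-for-tat piles up points against another nice player and loses
only a hair to a pure defector, which is Axelrod's result in
miniature; but nothing in the run says whether the points themselves
were a goal worth having.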
The simple desires of ordinary humans are a powerful and fruitful
base from which to reason about ethical questions. Contrast that
with this statement of J.R.Molloy:
"A scientific approach to making decisions together with other
people, acting in the public sphere would, I imagine, eliminate
biases which interfere with obtaining the most successful decisions."
He imagines, indeed: that statement is so utterly opposed to reality
that its consequences can exist only in his imagination.
--
Lee Daniel Crocker <lee@piclab.com> <http://www.piclab.com/lcrocker.html>
"All inventions or works of authorship original to me, herein and
past, are placed irrevocably in the public domain, and may be used or
modified for any purpose, without permission, attribution, or
notification." --LDC