Re: Ethics

Daniel Fabulich (daniel.fabulich@yale.edu)
Tue, 14 Jul 1998 20:31:17 -0400 (EDT)

On 14 Jul 1998, Felix Ungman wrote:

> Not true, if you would keep silent and I would defect, you would be
> even worse off than if we both defect. Now, would you really
> trust me to keep silent? Just tell me, and I'd be delighted to play :-)

We begin by presuming that the rational action is the one which results in optimal consequences. Both egoism and utilitarianism agree on this; they just disagree over what "optimal" means. We then observe that an ethical system is rational if and only if it is rational for both players: if it is rational for me to defect, then it is also rational for you to defect. Egoism cannot be rational for you alone; if it is rational at all, it is rational for both of us.

If both players defect, both players find themselves worse off than they would be had they chosen to cooperate. Since egoism demands that both players defect, despite the fact that its consequences are sub-optimal compared to utilitarianism, egoism is not rational.
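The argument above can be made concrete with the standard prisoner's dilemma payoff matrix. Here is a minimal sketch in Python; the specific payoff numbers are my own illustrative assumptions (the usual textbook values), not anything from this thread:

```python
# Standard prisoner's dilemma payoffs (illustrative numbers, higher is better).
PAYOFFS = {
    # (my_move, your_move): (my_payoff, your_payoff)
    ("cooperate", "cooperate"): (3, 3),
    ("cooperate", "defect"):    (0, 5),
    ("defect",    "cooperate"): (5, 0),
    ("defect",    "defect"):    (1, 1),
}

def my_payoff(me, you):
    return PAYOFFS[(me, you)][0]

# Whatever you choose, defecting pays me strictly more:
# egoism therefore tells each of us to defect.
for your_move in ("cooperate", "defect"):
    assert my_payoff("defect", your_move) > my_payoff("cooperate", your_move)

# Yet when we both follow that advice, we each get 1 instead of 3:
assert my_payoff("defect", "defect") < my_payoff("cooperate", "cooperate")
```

The two assertions capture the whole argument: defection dominates for each player individually, yet mutual defection leaves both players worse off than mutual cooperation.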

>
> >If we agree that rationality, at least in part, involves doing
> >what is necessary in order to get the optimal consequences (where the
> >"optimal" consequences is determined by one's value system) then egoism
> >dictates that the way to fulfill the ends of egoism is to reject egoism;
> >in other words, it is *not* rational to be an egoist, because it leaves
> >the players worse off than they would have been had they been utilitarians.
>
> You're using the word "egoism" in a very simplistic manner here.
> In the real world, the payoff function is a very complex relationship,
> that includes most of the people that you know and even many that you
> don't (in most cases indistinguishable from a utilitarian one).
> On the other hand, if you're a Utilitarian, you're forcing someone
> else's payoff function on me. It's a nice offer, but I think I'll pass.

<sigh> As I've already said many times before, this is true when the game is iterated, which it is most of the time. I'd say utilitarianism and egoism coincide at least 80% of the time, if not more. However, egoism leaves us worse off in the remaining 20%: games where you don't know who your opponent is, or in which one or more players will never play again. Put simply, in a small number of cases, egoism says it is good to hurt others for profit. Since utilitarianism and egoism agree the rest of the time, and since we are all made worse off when we hurt each other in this way, we should reject egoism and adopt utilitarianism instead.
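The iterated case can also be sketched concretely. In the toy simulation below the opponent plays tit-for-tat (cooperate first, then copy my last move); the payoff numbers and the choice of opponent are my own illustrative assumptions, not from the thread. Defecting wins the one-shot game but loses badly once the game repeats:

```python
# Illustrative iterated prisoner's dilemma against a tit-for-tat opponent.
PAYOFF = {  # (my_move, your_move) -> my payoff; higher is better
    ("C", "C"): 3, ("C", "D"): 0,
    ("D", "C"): 5, ("D", "D"): 1,
}

def play(my_strategy, rounds=10):
    """Total payoff against tit-for-tat over the given number of rounds."""
    total, opponent_move = 0, "C"  # tit-for-tat opens by cooperating
    for _ in range(rounds):
        mine = my_strategy(opponent_move)
        total += PAYOFF[(mine, opponent_move)]
        opponent_move = mine  # tit-for-tat copies whatever I just did
    return total

def always_defect(opponent_last):
    return "D"

def always_cooperate(opponent_last):
    return "C"

print(play(always_defect, rounds=1))    # 5: in a one-shot game, defection wins
print(play(always_cooperate, rounds=10))  # 30: mutual cooperation every round
print(play(always_defect, rounds=10))     # 14: one exploit, then 1 per round
```

This is the 80/20 split in miniature: when the game repeats, the egoist's own payoff function pushes toward the cooperative (utilitarian-looking) play, and the two positions only come apart in the one-shot case.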