> [...] there is no specific extropian answer to
> the problem of ethics, though we could easily
> suggest that one extreme answer might be: "The
> right action is that which maximizes extropy over
> all other possible actions." However, I think
> many extropians would not agree with this answer
> to the question; while extropy is good, it may
> not be the only important thing in the universe.
> Following this rule would demand that we kill,
> steal, torture, and even commit suicide if doing
> so would even marginally increase universal
> extropy compared to that which would be lost by
> doing so.
In the above paragraph you are judging one answer,
"the right action is that which maximises extropy
over all other possible actions", by another answer,
Utilitarianism. Clearly this is not logical.
> I tend to reject the third answer because simple
> egoism tends to promote actions which seriously
> hurt others, if the agent can pull them off
> without getting more punishment than pleasure
> from doing the action.
But you're judging Egoism by Utilitarianism, which
does not seem to be a rational way of finding the
correct answer. You cannot say one answer is wrong
simply because it conflicts with another answer when
you do not know that other answer to be correct.
> Ultimately I believe utilitarianism is correct
> because even the modified version of egoism may
> prevent us from doing something obviously
> rational under a variety of situations. A
> common example is one in which you might steal
> from someone in order to save lives, possibly
> your own. An objectivist would quickly point
> out that stealing is wrong, and therefore one
> should never do so.
I'm not sure this is obviously rational; I fail to
see why saving lives is a rational act.
> However, under situations of sufficient urgency,
> no other choice may be available (besides
> death), and within that context we again seem to
> be forced into the position of accepting bad
> consequences when we do the right thing; a
> situation which I believe to be paradoxical.
> Utilitarianism, on the other hand, allows us to
> get the consequences we strive for without
> sacrificing practical rationality.
But doesn't Utilitarianism promote self-sacrifice?
If your death saves lives (or simply makes others
happy), then, by that standard, your death is a
good thing.
Thanks for the detailed reply,
Bryan Moss