From: Lee Corbin (lcorbin@tsoft.com)
Date: Fri Jul 18 2003 - 23:30:07 MDT
Robert writes
> [Eliezer wrote]
> > Historical villains have killed millions of people in
> > terrible causes, but the idea that it's too inconvenient
> > to think about the subject, and that dropping nukes would
> > save time and aggravation, may well represent a new low
> > for the human species.
>
> Ah, but the debate must change if the "killing of millions of people"
> is in the name of a "good" cause. I do not note in your message
> a schema for the valuation of "lives". Say even an AI life vs.
> a human life. This is not a new debate -- it goes way back to
> the question of whether one has the right to kill (shut off, erase)
> one's copies -- (even *if* they have given you "well informed"
> permission to do so in advance).
I would say that you might try to avoid thinking of "rights" in
the abstract---as you did in that last sentence---and instead
carefully distinguish between
(i) what the legal rights of entities ought to be in
    society, in your opinion, and
(ii) how you value various kinds of lives under varying
     circumstances, and what prescriptions you are prepared
     to give,
and realize that these are not at all the same thing.
There are indeed cases, which SF authors often delight us with,
where one must decide the fate of millions. In Ender's Game,
for example, one has to decide whether killing millions of
aliens, even to the point of complete genocide (because they
might kill you), is what you really approve of or not. (It's really not
too different from deciding whether or not to invade a foreign
country because they might fabricate weapons that could destroy
you.)
> I do agree that villains have abused their power and that millions of
> innocent people have died as a result. I would also probably agree
> that my suggestion would result in similar negentropic casualties.
> But the point I am trying to get at is: *when* are the negentropic
> losses acceptable? Is the saving of a single human life worth a sacrifice
> by humanity? In medicine this is known as "triage" -- and it involves
> some very difficult decisions as to how one optimizes who one saves.
You are quite right, of course. I do not know if your suggestions
about Afghanistan and North Korea would be entropic or not, and
neither does anyone else. Still, we are all entitled to our
opinions, and mine---for what it is worth---is that extreme acts
must be approached with the greatest caution. Our intuitions and
traditions *usually* counsel exactly such caution.
> I was trying to determine whether or not there is a moral
> framework for the net worth of human lives and whether that
> justifies a "way of being"? For example, the Buddhist
> perspective on "lives" provides a "way of being" -- the
> extropic principles may not (at least in some aspects).
> And perhaps more importantly the extropic perspective
> may *never* generate a schema that trumps the Buddhist
> perspective. That is why I raised the question of how
> one achieves the shortest path to one's goals.
I am not following you here at all. For me, there are no
rationally arrived at "moral frameworks"---basically, one
consults his or her own intuitions on the matter, and
attempts to make them as rationally consistent as possible.
It also helps one's intuition to consult the voices of
others, including those long dead whose wisdom is still
widely respected.
Gimme any scenario, and I'll tell you my answer.
No, I would not nuke North Korea off the face of the world
at this time, although I would retain the capacity to do so,
and perhaps even make it clear that under *some* circumstances
I could be driven to do so. But the present circumstances are
far from that, for me, and at this point the wisest thing to do
seems to be simply to ride it out.
Lee