From: Barbara Lamar (blamar@satx.rr.com)
Date: Sun Jul 27 2003 - 12:24:08 MDT
Robert wrote:
> Point taken. I might offer that perhaps there are 3 codes --
> a legal code, a moral code and an instinctual code.
I agree, if what you're saying is that humans are genetically predisposed to
perceive, think, and act in certain ways. This "instinctual" way of
perceiving, thinking, and acting may or may not serve us well in any given
situation. For example, some of the research I've read indicates that people
are not genetically predisposed to approach decision making in a rational
manner. (I've read a lot of research on decision making, because my PhD
dissertation [not completed] was on the topic of how people make business
decisions). In an emergency where complex decision making is required, even
a mathematical genius might (would probably?) tend to fall back on
irrational decision making techniques. This is one reason people need a set
of behavior-guiding rules that have been carefully thought through before an
emergency arises.
As you point out, we will soon be in a position to alter the instinctual
code, but from what I've read and from interviews I've heard and
conversations with people working in the field, it will apparently be quite
difficult to precisely control such complex behavior as decision making (the
behavior may well be *altered* in various individuals, but I suspect such
alterations will at first be random byproducts of genetic alterations made
for completely different purposes).
> The moral
> code seems highly society dependent (witness "eye for an eye"
> behaviors in Iraq or Turkey which have been the discussion
> of recent news articles).
Sure. First, successful rules (successful = encouraging the survival of the
culture) will be different depending on the environment, so you'd expect
that some rules which made a lot of sense in one time and place, with a
certain population size and so forth, wouldn't work well in other
situations. Second, there is almost always more than one way to satisfy any
given need. Third, the rules for any given culture are to some extent
dependent on each other.
The "eye for an eye" rule might work quite well in a situation where people
live in relatively small, self-governing tribal groups. I would not expect
it to work well with large groups of people ruled by a small elite group.
>The legal code seems quite
> bendable [at least over time] (witness the recent PBS
> specials documenting how the wives of Henry VIII were
> manipulated using varieties of court/church/political
> "law" all the way up to the the Japanese internment
> during WWII -- "The Fred Korematsu Story" and how
> the Supreme Court was manipulated by the Justice Dept.
> to produce a verdict that still seems to hang over our
> heads).
I didn't see the PBS specials (wish I had; sounds interesting), but I'm well
aware of the bendability of the legal code.
> Against those 3 codes -- I would still offer up the questions of:
> -- (a) Are the rules of the code(s) worth the sacrifice of those
> who would uphold it (or believe in it) [or benefit from it]?
> -- (b) If the answer to (a) is yes, then the question would seem
> to become "How many current lives -- and how many future lives --
> are you willing to sacrifice for such principle(s)?"
With all due respect, Robert (and I do mean this -- I respect you, or I
would not be spending my time engaged in this discussion), I think you're
asking the wrong questions and that a better set of questions would be
something like this:
1. For any given legal code, is there a coherently expressed moral code upon
which it's based?
2. If there is such a moral code, is it valid for the circumstances under
which the legal code will be administered?
3. If the answer to 2 is "no" then what changes are needed to bring the
moral code into line with present-day reality?
Sometimes your approach seems to be all or nothing. Either the moral/legal
code functions perfectly or it's completely worthless. But other things
you've written lead me to believe that this is not really what you mean to
say (for example, your comment that to act on the premise "the hell with the
moral code" would lead to chaos and would thus be entropic rather than
extropic). Incidentally, there was some research done around 25 years ago to
find out what motivates taxpayers to be honest on their tax returns (sorry,
I don't have a copy of the paper, and I can't even recall the authors'
names). Among the motivators tested were fear of getting caught and
subjected to fines and imprisonment, fear of being publicly exposed, and the
perception of a moral duty. Perception of a moral duty to pay one's taxes
was a significantly more powerful motivator than any of the others.
> Note that I'm attempting to force one into a position of doing
> triage (what lives do I allow to be lost now?) as well as performing
> a discount value analysis of the net worth of current vs. future
> human lives.
Triage is generally associated with emergency situations. More broadly, it
can mean "the assigning of priority order to projects on the basis of where
funds and resources can be best used or are most needed"
http://www.m-w.com/cgi-bin/dictionary?book=Dictionary&va=triage
Although assigning priorities is certainly necessary when one is deciding
between conflicting goals, I don't see the concept of triage as particularly
useful in evaluating and revising a moral code. Rather, it's a process you
would use after the moral code is already in place.
> > Rather than saying that it's okay to disregard one's moral code, you now
> > seem to be arguing that a practical moral code should be in part based on
> > the assumption that the needs of many outweigh the needs of the few, or
> > the one.
>
> Perhaps I am more interested in precisely *when* one code grants
> precedence or priority to another. Or I may be interested in how
> one structures the codes for the most extropic outcome. It is
> probably reasonable to assume that I desire the most extropic
> future. This would seem in line with my actions over the last
> decade, my position on the ExI board, and at least some of my
> postings to the list.
>
> Now, if an extropic future path is in conflict with current
> legal, moral or instinctual codes then I think we need to
> seriously look at those codes.
I agree with the last paragraph above. Note that the legal code is
relatively easy to change, the moral code a bit more difficult, and the
instinctual *very* difficult. Although the instinctual code can be
overridden by the moral code in day-to-day life, people often fall back on
it in a bind, so the best moral code is one that meshes well with the
instinctual code, and a major instinct is to Stay Alive. One thing I never
liked about Christianity (at least as it was presented at the churches my
parents forced me to attend when I was a kid) was that if one practiced it
conscientiously and consistently, one would soon be dead.
Now, just to make sure we're all on the same sheet of music here, I believe
the main topic under discussion is whether or not it would be appropriate to
nuke North Korea tomorrow, or some other time before they've actually taken
any action beyond developing weapons; or more generally speaking, whether or
not it would be advisable to kill a group of people whose continued
existence might shorten the lives and/or prevent the births of a larger
group of people.
You have offered two lines of thought: on the one hand, you said that all
people are pretty much alike and therefore one should look only at numbers:
thus, it would be permissible to kill, say, 100 people, in order to allow
105 people to live. Or perhaps you would argue that the ratio of lives saved
to lives lost must meet some minimum, say 200%. Thus, it would not be okay to
kill 100 people in order to let 105 people live, but it would be okay to kill
100 people to let 200 people live.
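Just to make the arithmetic explicit, a rule like that might be sketched as
follows (a purely hypothetical illustration in Python; the function name and
the 200% threshold are my own placeholders, not anything you've actually
committed to):

def killing_permissible(lives_lost, lives_saved, min_ratio=2.0):
    # Returns True if the ratio of lives saved to lives lost meets
    # the assumed minimum threshold (2.0 = 200%).
    if lives_lost <= 0:
        return True  # nothing is sacrificed, so nothing needs justifying
    return lives_saved / lives_lost >= min_ratio

# killing_permissible(100, 105) -> False (105/100 = 1.05, below 200%)
# killing_permissible(100, 200) -> True  (200/100 = 2.00, exactly 200%)

Of course, writing the rule down this way only makes its arbitrariness more
obvious; nothing in the rule itself tells you why the threshold should be
200% rather than 105%.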
On the other hand, you have argued that the reason the North Koreans should
be killed is that they have an unextropic culture. You added that because of
the nature of the culture, their continued existence could result in the
deaths of, and failure to be born of, a much greater number of people. So
maybe you would insist that this is still selection by the numbers, but I
don't think it's quite the same thing.
> > Apparently in your scenario, the decision maker is whoever has
> > the most potent weapons.
>
> This would appear to be true. However the decision maker can
> choose *not* to exercise that power. In which case the power
> migrates to those with less potent weapons.
While this is true in certain circumstances, I don't think it is necessarily
true. Clearly it's not true with respect to interactions between
individuals. If I have a shotgun, which I could use to threaten you and
steal the $15 you have in your wallet, but instead I give you my surplus
corn for your $15, my power hasn't migrated to you. We're now both better
off. I've freed up the storage space the surplus corn was taking up, you've
got the corn, and I've got the $15. And I haven't used up any of my shotgun
shells.
However, this is not really a fair example. What if you have a knife,
and I know for a fact that you've threatened other people with the knife and
stolen their corn? What if you've just traded in your knife for a machete?
And I hear that you're on your way to rob me? Or my neighbor? Should I go
over to your house and blow you away with my shotgun?
This is still not a fair example. How about this one: what if you have a .22
caliber handgun, and I have reason to believe you intend to use it to take
over the community well that supplies all the water for irrigation of crops?
Should I go over to your house and blow you away, just in case? Should your
philosophy make a difference? Should it make a difference if your reason for
taking over the well is to redesign the irrigation system so that it
functions more efficiently? (Note: I do not mean to imply that North Korea
is technologically superior to the U.S.; the preceding is a purely
hypothetical line of thought.)
> Let me reformulate it -- one can sacrifice
> 100 million (10^8) human lives (one time) *or* one can
> sacrifice 10,000+ potential humanities (i.e. ~10 billion
> people) *every second* that you choose not to sacrifice
> those 100 million lives.
How about a less complex situation: Suppose there is a bus carrying 45
innocent passengers. You have reason to believe that the bus driver intends
to drive the bus through a crowd of pregnant women at a La Leche League
event. To allow the bus to continue on its present course may result in the
deaths of 50 women, 53 fetuses, and 2 bus passengers; and at least 120
potential lives will be lost (some of the women killed would have gotten
pregnant and given birth again, after the births of the fetuses they are
presently carrying, and several of the bus passengers are females of child
bearing age). Do you blow up a freeway overpass as the bus approaches,
causing the bus to crash and burn?
Does it make a difference if one of the passengers on the bus is the brains
behind a research project that is on the brink of a major medical
breakthrough that could potentially save millions of human lives over the
coming centuries?
Does it make a difference if the pregnant women are taking part in an
anti-abortion demonstration and if 30% of them are carrying Down's syndrome
fetuses?
> And I have not seen one discounted present value analysis
> of the lives lost [because we wish to maintain current
> legal or moral codes] (and this leaves out what the value of
> those lives from an extropic perspective which may be quite
> different).
If you wish to value people purely for the amount of money their lives could
have generated, the value of a human life will vary, depending on the
individual whose life you are valuing. A mentally "challenged" person, who
could hope to contribute no more than his physical labor over the course of
his life, or a bright person burdened with an inability to focus, would be
worth a lot less than someone capable of focused, thoughtful action. In
general, males would be worth less than females (for an interesting analysis
of the costs and benefits of the two sexes, see MEN ARE NOT COST EFFECTIVE
by June Stephenson). Following your argument that most people are pretty
much alike, the most cost effective way to proceed might be to kill all the
males whose IQs are less than 140 (and even quite a few of the smart ones
might need to be sacrificed, since many high-IQ males don't seem to be very
productive) while allowing all women with IQs over, say, 100, to live.
Aside from the difficulty of determining the value of a life before the fact,
this does not seem to be a very satisfactory measure of the value of human
life even after the life has been lived and you can add up all the money the
individual generated. Not all valuable traits can be traded in the market,
and for those traits there simply *is* no monetary value. It's true that
monetary valuation for, say, companionship, is attempted in personal injury
and wrongful death lawsuits; but no one claims that this is anything more
than a means of settling a dispute.
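For what it's worth, here is roughly what the discounted-present-value
arithmetic looks like for a single life, valued purely by earnings (a minimal
sketch in Python; the 5% discount rate, $30,000/year, and 40 working years
are numbers I've invented for illustration, not estimates of anything):

def present_value(annual_amount, years, rate=0.05):
    # Sum the discounted value of a constant annual stream:
    # each year's amount is divided by (1 + rate)**year.
    return sum(annual_amount / (1.0 + rate) ** t for t in range(1, years + 1))

# present_value(30000, 40) -> about $514,800 today, versus $1,200,000
# of undiscounted lifetime earnings.

Note how sensitive the result is to the choice of discount rate; that
sensitivity is one more reason I don't trust such an analysis to settle
questions about whose life is worth what.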
> > Your suggestion of mass murder to save humanity is qualitatively
> > similar to spiking your drinking water with arsenic to kill bacteria.
>
> First, my thoughts were motivated to save *many* humanities.
> Second, I do not understand the comparison -- I am proposing
> that some may need to die so that many many more others may live.
You are proposing an extremely messy, expensive course of action. Among
other things, you have neglected the cost of environmental damage, disruption
of trade, breakdown of relationships with other nations, and the probable
breakdown of civil order within the U.S. In order to justify this
destructive course of action, you are proposing the adoption of a moral code
that would neglect the value of the individual, thus leaving no basis for
individual freedom. There's a book that discusses the economic value of
individual freedom: *Human Action: A Treatise on Economics* by Ludwig von
Mises:
http://www.amazon.com/exec/obidos/tg/detail/-/0945466242/103-6639845-6662251?v=glance
I would recommend it, at least as a starting point in devising an extropic
moral code.
Barbara