RE: Radical Suggestions

From: Robert J. Bradbury (bradbury@aeiveos.com)
Date: Sat Jul 26 2003 - 10:44:36 MDT


    On Fri, 25 Jul 2003, Barbara Lamar wrote:

    > Not at all. If you wish to express it in these terms, then the question I
    > was getting at is: "What is the price of the ABSENCE of a moral code?"

    Point taken. I might offer that perhaps there are 3 codes --
    a legal code, a moral code and an instinctual code.

    The instinctual code seems dependent on our genetic makeup
    (currently fixed -- but likely not for long). The moral
    code seems highly society-dependent (witness "eye for an eye"
    behaviors in Iraq or Turkey, which have been the subject
    of recent news articles). The legal code seems quite
    bendable [at least over time] (witness the recent PBS
    specials documenting how the wives of Henry VIII were
    manipulated using varieties of court/church/political
    "law" all the way up to the the Japanese internment
    during WWII -- "The Fred Korematsu Story" and how
    the Supreme Court was manipulated by the Justice Dept.
    to produce a verdict that still seems to hang over our
    heads).

    Against those 3 codes -- I would still offer up the questions of:
    -- (a) Are the rules of the code(s) worth the sacrifice of those
       who would uphold them (or believe in them) [or benefit from them]?
    -- (b) If the answer to (a) is yes, then the question would seem
       to become "How many current lives -- and how many future lives --
       are you willing to sacrifice for such principle(s)?"

    Note that I'm attempting to force one into a position of doing
    triage (what lives do I allow to be lost now?) as well as performing
    a discounted present value analysis of the net worth of current
    vs. future human lives.

    > Although there is usually quite a bit of overlap, a legal code and a moral
    > code are not identical.

    Agreed. I think I have provided examples of why this is so.

    > This seems to be a different argument from the one you were making earlier.
    > Rather than saying that it's okay to disregard one's moral code, you now
    > seem to be arguing that a practical moral code should be in part based on
    > the assumption that the needs of many outweigh the needs of the few, or the
    > one.

    Perhaps I am more interested in precisely *when* one code grants
    precedence or priority to another. Or I may be interested in how
    one structures the codes for the most extropic outcome. It is
    probably reasonable to assume that I desire the most extropic
    future. This would seem in line with my actions over the last
    decade, my position on the ExI board, and at least some of my
    postings to the list.

    Now, if an extropic future path is in conflict with current
    legal, moral or instinctual codes then I think we need to
    seriously look at those codes.

    > Apparently in your scenario, the decision maker is whoever has
    > the most potent weapons.

    This would appear to be true. However, the decision maker can
    choose *not* to exercise that power, in which case the power
    migrates to those with less potent weapons. I'd suggest a
    review of the battles between the British Navy and the German
    Navy during WWII, esp. the history of the Bismarck.

    > But while I do not agree with the moral code you are suggesting, I would
    > certainly feel more comfortable dealing with you if I thought you adhered to
    > SOME moral code rather than just acted on whatever course of action seemed
    > to you at the moment to be expedient.

    "Expedient" may be pushing it a bit. I believe that I subscribe to
    an "extropic" moral code and am seeking a path where that fits into
    other codes. In particular -- in previous suggestions I was looking
    at the distinction between a one-time cost of 10^8 human lives
    vs. 10^14 human lives *per second*. Few, if any, individuals
    seem to have picked up on the magnitude of the problem I
    was raising. Let me reformulate it -- one can sacrifice
    100 million (10^8) human lives (one time) *or* one can
    sacrifice 10,000+ potential humanities (i.e. ~10 billion
    people each) *every second* that you choose not to sacrifice
    those 100 million lives.

    Let me make it clear again -- *every* second you choose not
    to pursue the most extropic development path you are
    sacrificing 10,000 potential humanities.
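
    To make the arithmetic explicit, here is a rough back-of-the-envelope
    sketch (the figures are the ones above; only the translation into
    code is mine):

        # Figures from the argument above -- nothing here is derived.
        ONE_TIME_COST = 1e8      # 100 million human lives, paid once
        LOSS_PER_SECOND = 1e14   # ~10,000 potential humanities of
                                 # ~10 billion people each, per second

        # Delay after which the forgone lives equal the one-time cost:
        break_even_seconds = ONE_TIME_COST / LOSS_PER_SECOND
        print(break_even_seconds)   # 1e-06 -- about one microsecond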

    And I have not seen one discounted present value analysis
    of the lives lost [because we wish to maintain current
    legal or moral codes] (and this leaves aside what the value
    of those lives would be from an extropic perspective, which
    may be quite different).
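
    If someone did want to start such an analysis, a minimal sketch might
    look like the following. The discount rate and the length of the delay
    are placeholders I am assuming purely for illustration -- they are not
    figures anyone has proposed:

        import math

        r = 0.05              # assumed annual discount rate (placeholder)
        delay_years = 1.0     # assumed length of the delay (placeholder)
        seconds_per_year = 365.25 * 24 * 3600
        loss_per_year = 1e14 * seconds_per_year  # potential lives forgone per year

        # Present value of the lives forgone during the delay, discounted
        # continuously: PV = loss_rate * (1 - exp(-r*T)) / r
        pv_forgone = loss_per_year * (1 - math.exp(-r * delay_years)) / r

        pv_one_time = 1e8     # the one-time sacrifice, incurred now
        print(pv_forgone, pv_one_time)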

    And Barbara -- you cannot escape from the "value of human life"
    analysis as our legal "code" regularly deals with it.

    > Your suggestion of mass murder to save humanity is qualitatively
    > similar to spiking your drinking water with arsenic to kill bacteria.

    First, my thoughts were motivated to save *many* humanities.
    Second, I do not understand the comparison -- I am proposing
    that some may need to die so that many, many more may live.

    > Yes, the fate of the Donner Party is a fascinating tale, but if you're using
    > it to somehow justify your position, I think you need to explain further.

    It was a point to show how the "moral code" (i.e. thou shalt not eat
    other humans) was on a slippery slope.

    > Is your answer to this question therefore that you
    > would be repulsed by situations in which the moral code gets discarded?

    I personally would not be repulsed by situations in which the legal
    or moral codes get discarded for the instinctual (survival) code.
    I might not like them, particularly if I were on the down side
    of the decisions (say serving as food for the Donner party).
    But I would understand the necessity of the actions.

    Robert


