Re: ExI principles: people left behind?

From: Samantha Atkins (samantha@objectent.com)
Date: Sun Jul 20 2003 - 02:22:09 MDT

    On Friday 18 July 2003 19:55, Robert J. Bradbury wrote:

    > > Historical villains have killed millions of people in
    > > terrible causes, but the idea that it's too inconvenient to think about
    > > the subject, and that dropping nukes would save time and aggravation, may
    > > well represent a new low for the human species.
    >
    > Ah, but the debate must change if the "killing of millions of people"
    > is in the name of a "good" cause. I do not note in your message
    > a schema for the valuation of "lives". Say even an AI life vs.
    > a human life. This is not a new debate -- it goes way back to
    > the question of whether one has the right to kill (shut off, erase)
    > one's copies -- (even *if* they have given you "well informed"
    > permission to do so in advance).
    >

    Some things, imho, go beyond just logic and reasoning. Ethics is one of those
    things. The value of human life is not something you can simply plug into a
    mathematical formula. The question of whether it is ok to sacrifice tens of
    millions of people to our supposed brighter future without them cannot be
    settled at the level of the reasoning intellect alone. Copies have nothing
    to do with it.

    > And the "valuation of lives" goes to the crux of the matter. Nick
    > in his paper suggested there were some alternate perspectives
    > of utilitarian evaluations, Anders expanded on this quite a bit in
    > his comments (much to my education).
    >
    > But the problem is not simple and it doesn't go away (just because we
    > find the discussion repulsive).
    >

    The problem originally stated was that some cultures do not understand or want
    some things we consider to be of extreme value and distrust those who do not
    value them utterly. I hardly think it is a solution to simply exterminate
    all of these inconvenient people. Do you? Really? It goes far beyond
    repulsive. It crosses the line into that which will not be considered or
    allowed if we are talking about extropic values for humankind. Last time I
    checked our supposed shared principles don't say extropic values for "us" at
    the expense of however many of "them" seem inconvenient or possibly a threat.
    If you believe differently, then it is up to you to justify that stand. A
    specious argument about hypothetical astronomical numbers of future beings is
    not at all sufficient, as has been pointed out.

    > I do agree that villains have abused their power and that millions of
    > innocent people have died as a result. I would also probably agree
    > that my suggestion would also result in similar negentropic casualties.
    > But the point I am trying to get at is *when* the negentropic losses
    > are acceptable? Is the saving of a single human life worth a sacrifice
    > by humanity? In medicine this is known as "triage" -- and it involves
    > some very difficult decisions as to how one optimizes who one saves.
    >

    Negentropic losses are not acceptable when they can be avoided. We are not
    talking about the saving of a single human life but the cold-blooded
    slaughter of tens of millions of real human beings on the basis of some
    specious hypothetical argument. There is quite a huge difference.

    > I was trying to go beyond that. I was trying to determine whether
    > or not there is a moral framework for the net worth of human lives
    > and whether that justifies a "way of being"? For example, the
    > Buddhist perspective on "lives" provides a "way of being" -- the
    > extropic principles may not (at least in some aspects). And perhaps
    > more importantly the extropic perspective may *never* generate a
    > schema that trumps the Buddhist perspective. That is why I raised
    > the question of how one achieves the shortest path to one's goals.
    >

    You are correct that the extropic principles do not provide a sufficient
    ethical basis or "way of being". My goals are oriented toward a maximally
    extropic future for all human beings, not just for those who happen to think
    more or less like I do. Murdering tens of millions of people is not compatible
    with my goals; it expressly contradicts them.

    > > And how easy it is for people who can't distinguish word games from
    > > reality to arrange a few thoughts in the right order and decide to
    > > commit genocide. The human mind has no safety catch.
    >
    > I am not playing word games. My comment was very serious (though
    > I may currently regret posting it). It was an effort to question
    > "at what rate" and "how" do you want humanity to evolve?
    >

    The "evolving" I want is not compatible with the regression suggested in the
    least.

    > > Because you genuinely seem to be serious. I wish I could say I don't
    > > understand it, but I do, and I'm sad, and frightened, because you were
    > > someone I used to respect. Even if you don't understand what you're
    > > saying, even if it has no connection to reality for you, you said it, and
    > > I can't make it unreal to myself.
    >
    > If it is of any help, reframe it in terms of "can you erase your copies"?
    > It seems to be a reasonable proposal that an evolving technological
    > civilization that allows the erasing of copies would advance faster
    > than one that does not (simply due to the expense of the memory
    > requirements of preserving inactive copies -- ignoring the question
    > of whether copies must be allowed some slice of the global CPU time).
    >

    Specious, dry intellectual games are abhorrent when you have suggested
    megadeaths as somehow possibly beneficial to our goals.

    > So making the great "leap" that one human is pretty much like another
    > human (I mean really -- if a 1 cm^3 nanocomputer can support 100,000+
    > human minds our "individuality" is probably overrated) one begins to
    > get into the question of the "survival of humanity". This isn't a
    > new topic -- it has been discussed by Robin in his "If Uploads Come
    > First" paper (http://hanson.gmu.edu/uploads.html).
    >

    This leads straight to the Dark Side. You can choose whether or not to go
    there. It is a very fundamental choice. You can't intellectualize your
    choice. In large part, it comes from a deeper, or at least a different,
    level than the intellect.

    > All I am saying, and I am sad that it makes you "sad, and frightened"
    > but someone has to face what I perceive as the spectre of the Pied Piper,
    > is that the philosophy, belief system, what we promote, etc. may be
    > very incomplete unless we deal with the fact that a society that
    > allows the deletion of copies may out-evolve a society that does not.
    >

    We are not talking about the deletion of copies! And no, human beings are not
    interchangeable units with zero cost to erase a few if they appear to you to be
    superfluous! And no, it is not at all interesting to talk about how the copy
    question is actually the same thing. You have a fundamental choice to make.
    What will it be? That choice will have consequences.

    - samantha


