Re: Radical Suggestions

From: Brett Paatsch (bpaatsch@bigpond.net.au)
Date: Tue Jul 29 2003 - 11:56:10 MDT


    Robert J. Bradbury wrote:

    > I have a core belief that "more complexity" is
    > better than "less complexity". This may be
    > modified by "more complexity sooner" is better
    > than "less complexity sooner". That is because
    > the Universe as we know it and as currently
    > structured might end and if one doesn't
    > get sufficient complexity to potentially intervene
    > in that process (sooner) then all bets are off.
    > (Viewed another way, I'm concerned that we
    > not only have to create an extropic reality now,
    > but we have to create an even more extropic
    > reality in the future, and we potentially have to
    > warp reality such that increasing extropicness
    > is feasible and can survive the current reality.)
    >
    > This is driven in large part by Dyson's "Time
    > without end" perspective [1]. Alternate "realities"
    > such as Tipler's hold little attraction for me.
    > The entire state of our comprehension of our
    > universe *is* in a significant state of flux at this
    > time (due to work in everything from dark energy
    > to dark matter to string theory) -- so exactly
    > *how* the physics of reality impacts on
    > maximization of extropic perspectives I do not
    > now know. But I *do* think we should be
    > thinking about it.

    For some reason I'm reminded of the movie
    Butch Cassidy and the Sundance Kid, where they
    are about to jump down from a very high cliff into
    a river in a desperate attempt to escape their
    pursuers. Sundance, I think, says to Butch that he
    is worried because he can't swim (= the universe
    is going to end someday), and Butch laughs and
    says "Why, you crazy? The fall" (= the next forty
    years) "will probably kill you". ;-)

    Hell, Robert, if we get through the next 100 years
    and are still having this sort of conversation,
    we may be glad that the universe isn't going to
    make things too boringly easy for us.

    > > To me, a potential life, a never-would-have-
    > > been-born-otherwise life gets a coefficient of
    > > actuality--and corresponding coefficient of value
    > > --of zero.
    >
    > Here is where we may part in our opinions.
    > I view a potential future human life as an "it might
    > save my lazy old ass" probability. [This is well
    > documented in terms of the productivity of
    > scientists younger than 30, not yet married, etc.]
    > So one should not discount the value of future
    > human lives to zero (irrespective of the discount
    > rate one is otherwise using). [For as I've discussed
    > if one loses humanity totally (leaving aside oneself
    > as an individual) then the discount rate one selects
    > doesn't really friggin matter -- so one had best
    > factor in some way to figure out how current vs.
    > future human lives may relate to one's long-term
    > survival.]

    Surely net complexity, total available ingenuity, is not
    *necessarily* going to correlate too closely with
    such a crude thing as the actual *number* of sentients
    in the distant future. I don't need to believe in AI to
    feel confident that cybernetics will be feasible, that
    there will be massive increases in the intelligence of
    the individual sentients we come in contact with going
    forward, and that there will be a far greater community
    of mind, bringing with it both faster conveyance of
    ideas and a greater sense of community to the majority
    of sentients in the future.

    > > Once you're born and living, you *count*.
    >
    > No argument. But from an extropic perspective I
    > have to raise the question "How much do you count"?

    This is a very good and very contemporary question.
    This genuinely good question was raised in Australia
    and elsewhere in recent times around stem cell
    legislation, where the real moral question that had
    to be considered, once all the sound and fury was
    removed, was this: given that an embryo (or perhaps
    in future any living cell) has the *potential* to be
    transformed into a person, how do we today, tomorrow
    and next year, in real time, with real medical problems
    and the need to make hard triaging calls on where we
    as communities and societies spend our resources,
    determine the relative moral weight of potential persons
    in various stages of actualisation against actual persons?

    > Can you climb out of bed on a day to day basis and
    > answer the question "Yesterday, did I increase or
    > decrease the entropic trend in the universe?".
    >
    > Most members of humanity do not really have the
    > opportunity to ask that question unfortunately. Most
    > members of this list and certain people in power
    > positions do.
    >
    > I *still* have not seen a clear and concise argument
    > that "the needs of the many living" outweigh "the
    > needs of the many more future living".

    What part of your quandary is not addressed if we state
    things in terms of the needs of those persons who actually
    exist outweighing the needs of the many more potential
    persons (who by definition don't yet exist as full persons)?

    The reason I ask is that if the argument or question
    is couched in terms of embryos or cells or forms of life,
    then we can perhaps come to grips with it, to varying
    degrees of precision, both now and on later review,
    as the practical necessity to make decisions requires.

    But if you were, for instance, to hold the view that the
    death of a man must be weighed as a tragedy not just
    because he is himself dead but because each sperm in
    his testes was somehow special too, then we really are
    going somewhere strange, and my question would be:
    why bother going there? How is it useful to pursue such
    a line?

    > I've seen arguments that those living have a right of
    > "being". Sorry, Nature doesn't care about said "right"
    > -- be it an earthquake, a volcano or an asteroid they
    > do not discriminate and they are ruthless. So the
    > question becomes *what* one is doing to reduce
    > those risks (for either humanity or oneself) or whether
    > one has adopted a perspective of "Live for now -- for
    > it may all be over soon" (which of course isn't a very
    > extropic perspective).
    >
    > > Woulda-coulda-shoulda been alive is nothing, zero,
    > > a fantasy.
    >
    > Not in my book -- those people may save my ass.

    I can well understand how having an increased number
    of highly intelligent knowledge workers working in
    concert on life extension etc. could indeed save your
    arse in the near term (and mine), and I can well
    understand how such a perception can motivate and
    focus one. But I don't really follow the additional point,
    if indeed you are making it, that numbers of lives will
    matter in the timeframe relevant to altering, say,
    the big crunch or the dilution of the universe.

    I think it would be useful to the discussion to identify
    how it is that you think larger numbers of sentients,
    as opposed to fewer, might save your ass. (If you're
    right, there are a few other contemporary asses I'm
    quite fond of as well, including one that I am very
    attached to.)

    I *certainly* agree with this in the short term. But
    beyond that 100-year horizon I don't think increasing
    numbers will matter, as each individual repository of
    sentience can still increase its complexity and therefore
    net complexity can still increase.

    I reckon the imperative to triage and do the tough moral
    analysis is greatest in the *near* term. We are going to
    have to come to terms with the fact that we *cannot*
    save all of us, and that not all of us (humanity) are even
    currently living with the lights on. Further, some sections
    of humanity will go out of their way to oppose what is
    an extropic path because they are afraid or because
    they are mistaken.

    > I'm trying to force one into the position of possible
    > choices between "morality" or "humanity".
    >
    > I believe that it may be fundamental to extropic
    > perspectives.

    Me too.

    Brett


