RE: who cares if humanity is doomed?

From: Lee Corbin (lcorbin@tsoft.com)
Date: Tue Mar 11 2003 - 10:25:19 MST


    Ramez writes

    > I'm not trying to be callous here. I care about people, even those I
    > haven't met, and I do my small bit to help eliminate needless death.
    > But individuals die. They always have and they always will.

    Given the kinds of technology that might control our solar
    system within only a century, this may well turn out to be
    false.
    Individuals might very well *never* die. The cryonics and
    immortalist literature has dealt with this for decades.

    For example, when humans are uploaded they'll have enough
    secure backup copies that the probability of death drops
    very close to zero. We should keep this goal before us.

    > Alternately, when you [Eliezer] said "humanity" did
    > you mean the human species? If so, should we care?
    > I care about individuals.

    Yes, so do I. And this definitely includes one Ramez
    Naam, for example. Absolutely no one should look upon
    our present situation as a certain death sentence.

    > If those individuals are AIs or post-humans or such,
    > is that any worse than if those individuals are humans?
    > I don't see why. But maybe I'm unusual in having more
    > sentience-loyalty than species-loyalty.

    Don't you see the frightful logic that you are allowing
    yourself to carelessly embrace? It's as though Montezuma
    and his Indios friends had said, "It will be good if we
    continue to live, but does it really matter if the Spanish
    live instead, and we die?"

    Why isn't there room for everyone to live? As Feynman
    said, "there's plenty of room at the bottom".

    > Then [this advanced alien] gives you [the primate precursor
    > to humans] the choice. Should humanity be allowed to come
    > into being or not? I would choose yes.

    Excellent choice. But it need not come at our expense, or
    anyone's.

    > Today, I would rather see creatures with more
    > intelligence, awareness, creativity, passion, and curiosity than
    > humans come into being. I don't want to do so in a way that hurts
    > people,

    Then don't embrace sub-optimal solutions!

    > but I know that evolution goes hand in hand with strife.

    Not after we get control of it! At least, not necessarily.
    Death can become as Popper said of ideas: "It is better
    to let them die in our stead." So particular algorithms
    can either not be run at all, or be put on an extinction
    schedule (i.e., if X is an algorithm that you deem, within
    your own space, less worthy of run time, then you begin
    issuing X one second of run time only every 10^n seconds,
    for increasing integer n).
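    The schedule in that parenthesis can be sketched in a few
    lines of Python (a hypothetical illustration only; the
    function name and the `max_n` parameter are mine, not
    anything from the original post):

    ```python
    def extinction_schedule(max_n):
        """Gaps (in seconds) between successive one-second run
        slices granted to algorithm X: 10^n seconds for each
        n = 1 .. max_n."""
        return [10 ** n for n in range(1, max_n + 1)]

    # After the n-th slice, X has run for n seconds out of
    # roughly sum(10^k for k <= n) elapsed seconds, so its
    # share of wall-clock time shrinks toward zero.
    print(extinction_schedule(3))  # -> [10, 100, 1000]
    ```

    The point of the geometric spacing is that X is never
    formally terminated, yet its fraction of run time tends
    to zero.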

    > Given the choice between humanity continuing in its
    > current form for millennia vs. humanity succumbing
    > to a post-human type of life, I'd choose the latter.

    Well, as Rafal would say, nix the choice! This isn't a
    thought experiment, and there need not be a hard choice.

    Lee



    This archive was generated by hypermail 2.1.5 : Tue Mar 11 2003 - 10:26:55 MST