Re: who cares if humanity is doomed?

From: Samantha Atkins (samantha@objectent.com)
Date: Thu Mar 13 2003 - 01:53:14 MST

    Ramez Naam wrote:
    > From: Eliezer S. Yudkowsky [mailto:sentience@pobox.com]
    >
    >>I finally did work out the theory that
    >>describes what it is that goes from point A to point B when a moral
    >>human builds a moral AI, and it turns out that if you build an AI
    >>and you don't know exactly what the hell you're doing, you die.
    >>Period. No exceptions.
    >>
    >>Do you have any ideas for dealing with this besides building
    >>FAI first? Because as far as I can tell, humanity is in serious,
    >>serious trouble.
    >
    >
    > What exactly do you mean by humanity? Do you mean the individual
    > humans? If so, none of them are counting on living forever, and
    > frankly the odds are that the vast vast majority of them will die in
    > the next several decades.
    >
    > I'm not trying to be callous here. I care about people, even those I
    > haven't met, and I do my small bit to help eliminate needless death.
    > But individuals die. They always have and they always will.
    >
    > Alternately, when you said "humanity" did you mean the human species?
    > If so, should we care? I care about individuals. If those
    > individuals are AIs or post-humans or such, is that any worse than if
    > those individuals are humans? I don't see why. But maybe I'm unusual
    > in having more sentience-loyalty than species-loyalty.

    Uh, I think the worry is that our species will destroy itself
    before reaching its own potential or creating any viable
    successors. That would be a great tragedy indeed.

    - samantha
