Re: Rulers and Famine in Poor Countries (was Obesity)

From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Sun Mar 09 2003 - 10:58:39 MST


    Robert J. Bradbury wrote:
    >
    > While Eliezer's position is, I believe, partially correct -- in that the
    > sooner we get the Singularity the better, at least from some positions --
    > I think he assumes you get a friendly AI, while I'm concerned
    > that before then we might get a rogue AI.

    That was the old Eliezer. I finally did work out the theory that
    describes what it is that goes from point A to point B when a moral human
    builds a moral AI, and it turns out that if you build an AI and you don't
    know exactly what the hell you're doing, you die. Period. No exceptions.

    Do you have any ideas for dealing with this besides building FAI first?
    Because as far as I can tell, humanity is in serious, serious trouble.

    -- 
    Eliezer S. Yudkowsky                          http://singinst.org/
    Research Fellow, Singularity Institute for Artificial Intelligence
    
