Re: Rulers and Famine in Poor Countries (was Obesity)

From: Robert J. Bradbury (bradbury@aeiveos.com)
Date: Sun Mar 09 2003 - 09:39:10 MST

    On Sat, 8 Mar 2003, Samantha Atkins wrote:

    > Eliezer S. Yudkowsky wrote:
    >
    > > As far as leveraged altruism goes, nothing is within the same league of
    > > the Singularity, for either efficiency or efficacy, by N orders of
    > > magnitude. I really don't see why rational discussion would tend to
    > > linger anywhere else.
    > >
    >
    > So, it is your position that the only meaningful thing that can
    > be done to deal with the real suffering around us today is to
    > support Singularity? Do you believe that all other attempts to
    > ameliorate suffering are irrational?

    While Eliezer's position is, I believe, partially correct -- in that the
    sooner we get the Singularity the better -- he assumes we get a friendly
    AI, while I'm concerned that before then we might get a rogue AI (I think
    we had a discussion about this months ago because the Kazaa network most
    likely has the CPU resources). So there is a risk that humanity will
    never get to the Singularity. In that case, by focusing only on the
    Singularity we may have lost many years of lives that people otherwise
    might have led. For example, The Hunger Project leads dozens if not
    hundreds of people around the world in solving fundamental educational
    and societal problems that are responsible for killing millions of
    people (over a number of years). I believe that we need to solve
    problems of that nature (those that lead to premature death). The people
    who die this year, next year, or in the intermediate future will never
    make it to the Singularity. On the other hand, if we invest properly in
    ending war and hunger, extending longevity, etc., then we will save more
    lives than if we focused *just* on making the Singularity arrive sooner.
    It comes down to the relative number of lives you save by allocating
    your resources to the different purposes.
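    The allocation argument above can be sketched as a toy expected-value
    calculation. Every number here is a hypothetical assumption for
    illustration, not a figure from the post: `p_singularity`, the life
    counts, and the allocation fractions are all made up.

    ```python
    # Toy expected-value model of the resource-allocation argument.
    # All parameters are hypothetical assumptions, chosen only to
    # illustrate how the comparison works.

    def expected_lives_saved(frac_to_singularity: float,
                             p_singularity: float = 0.5,
                             lives_if_singularity: float = 6e9,
                             lives_per_unit_aid: float = 1e8) -> float:
        """Expected lives saved when a fraction of resources goes to
        accelerating the Singularity and the remainder to direct aid
        (ending war and hunger, extending longevity)."""
        frac_to_aid = 1.0 - frac_to_singularity
        # Singularity payoff is huge but only arrives with probability p.
        singularity_term = (frac_to_singularity * p_singularity
                            * lives_if_singularity)
        # Direct aid saves lives with near certainty, but fewer of them.
        aid_term = frac_to_aid * lives_per_unit_aid
        return singularity_term + aid_term

    # With a high assumed probability, going all-in on the Singularity wins;
    # with a low one (the rogue-AI risk), direct aid wins instead.
    high_p = expected_lives_saved(1.0, p_singularity=0.5)   # 3e9
    low_p = expected_lives_saved(1.0, p_singularity=0.01)   # 6e7
    pure_aid = expected_lives_saved(0.0)                    # 1e8
    ```

    Which allocation comes out ahead depends entirely on the assumed
    probability of reaching a friendly Singularity -- which is exactly the
    point of the paragraph above: the risk of never getting there changes
    the arithmetic.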

    Robert



    This archive was generated by hypermail 2.1.5 : Sun Mar 09 2003 - 09:44:05 MST