Re: The Metamorphosis of Prime Intellect

From: Wei Dai (weidai@weidai.com)
Date: Thu Feb 27 2003 - 15:43:16 MST

    On Thu, Feb 27, 2003 at 04:06:27PM -0500, Eliezer S. Yudkowsky wrote:
    > See some discussion of this story on the AGI list:
    >
    > http://www.mail-archive.com/agi@v2.listbox.com/thrd3.html

    In one of the posts you say that with Prime Intellect humanity is
    stuck in a "nightmarish system". I agree the situation isn't ideal, but
    how bad is it really? OK, so you can't commit suicide. Why not ask the
    PI to freeze you indefinitely, or to slow down your thought process
    exponentially? All the aliens are frozen, so ask the PI to create a
    virtual universe and instantiate them there. Is this truly nightmarish
    enough to justify killing off everybody except yourself and restarting
    history from scratch? And nobody made the point that Prime Intellect
    just isn't that smart. The real nightmare is having to stay at
    human-level intelligence for eternity.

    I guess what I want is a better "Seed AI Programmer Screws Up
    Friendliness" story, one that shows just how nightmarish it could be,
    and makes it clear that it's screwing up the Friendliness part that is
    responsible for the nightmarishness, not screwing up the Intelligence
    part.



    This archive was generated by hypermail 2.1.5 : Thu Feb 27 2003 - 15:45:34 MST