Re: greatest threats to survival (was: why believe the truth?)

From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Tue Jun 17 2003 - 15:33:34 MDT

    Rafal Smigrodzki wrote:
    >
    > ### For all the decades of unmet expectations, AI relied on computing
    > power on the order of an ant's, and only recently, as Moravec writes,
    > did it graduate to the computing power of a mouse. Since AI on
    > ant-powered computers gave ant-powered results, and AI on mouse-powered
    > computers gives mouse-powered capabilities (such as target tracking,
    > simple learning, and simple motor control), we may expect that AI on
    > human-level computers will give human-level results. Human-level
    > computing power is going to be available to SingInst in about 15 years,
    > so we can expect the recursive self-enhancement of the FAI to take off
    > around that time.
    >
    > QED?

    No, unfortunately; as far as I can tell, we have *enough* computing power
    available for AI now. Yes, right now. *More* computing power will only
    make it *easier*, again unfortunately so. At least with current computing
    power, it should still be fairly *hard* for the standard flounder-around
    style of AI to get anywhere.
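
    For concreteness, the extrapolation Rafal is leaning on is just
    doubling-time arithmetic. A minimal sketch in Python, assuming
    Moravec's ~10^14 ops/sec estimate for the human brain, a Moore's-law
    doubling time of 18 months, and a guessed 2003 baseline of ~10^11
    ops/sec for a small research cluster (the baseline and the doubling
    time are assumptions, not figures from either post):

        import math

        # Hedged inputs: the brain estimate follows Moravec; the 2003
        # baseline and the 18-month doubling time are assumptions.
        BRAIN_OPS_PER_SEC = 1e14       # Moravec's rough human-brain estimate
        BASELINE_OPS_PER_SEC = 1e11    # assumed small cluster, circa 2003
        DOUBLING_TIME_YEARS = 1.5      # classic Moore's-law doubling period

        # Hardware doublings needed to close the gap, then the calendar
        # time those doublings take at the assumed rate.
        doublings = math.log2(BRAIN_OPS_PER_SEC / BASELINE_OPS_PER_SEC)
        years = doublings * DOUBLING_TIME_YEARS

        print(f"doublings needed: {doublings:.1f}")            # ~10.0
        print(f"years to human-level hardware: {years:.0f}")   # ~15

    With those inputs the arithmetic does land near Rafal's 15-year
    figure, but moving the baseline or the doubling time by a small
    factor shifts the answer by several years, which is the usual
    fragility of this style of extrapolation.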

    -- 
    Eliezer S. Yudkowsky                          http://singinst.org/
    Research Fellow, Singularity Institute for Artificial Intelligence
    

