Re: How Extropians Live Their Lives was: Optimism

From: Rafal Smigrodzki (rafal@smigrodzki.org)
Date: Sat Jul 19 2003 - 21:58:54 MDT


    From: "Bryan Moss" <bryan.moss@dsl.pipex.com>
    \The
    > numbers we use to predict the Rupture (Singularity) or decide how many
    > "potential lives" are lost by not colonising space or nuking North Korea
    > (for fucks sake) are rough, back of the envelope calculations by a
    > roboticist that make some massive, sweeping assumptions about the brain.
    > When someone expresses the opinion that, perhaps, the brain isn't easily
    > simulated, they're usually met with, "I doubt anything quantum mechanical
    is
    > going on," which forgets that there's no simple correlation between
    > classical physics and classical computers. An actual simulation of the
    > brain, its biological and chemical processes, is most likely impractical.
    A
    > "simulation" of thought processes is pure pseudo-science, as is any claim
    to
    > a "general intelligence." You can argue that not everything that happens
    in
    > the brain (physically speaking) plays a functional role in thought, and
    > that's fair (depending on your criteria), but if you start arguing that
    you
    > know what does and does not play functional roles, in any strong sense,
    > you're most likely being disingenuous.

    ### No, I don't think I am being disingenuous when I dismiss Hameroff's
    arguments that the mystery of cognition and intelligence lies in the
    quantum-mechanical entanglement of microtubules. We do have a pretty good
    idea about what does and what does not correlate with learning and conscious
    experience. We can safely exclude the chemical basis of life from the
    description of consciousness, as long as the synaptic strengths, the general
    rules for their adjustment, and the overall pattern of connectivity are
    accounted for. You can make dramatic changes at the level of ion channels
    and proteins, but as long as they are compensated for by appropriate
    homeostatic mechanisms, they do not interfere with synaptic activity and do
    not affect consciousness.
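
    To be concrete about the level of description I mean, here is a minimal
    sketch in Python: an overall pattern of connectivity, a set of synaptic
    strengths, and a rule for adjusting them. The Hebbian-style update rule and
    all of the numbers are chosen purely for illustration; I am not claiming
    this is the adjustment rule the brain actually uses.

    import numpy as np

    rng = np.random.default_rng(0)

    n = 100                                   # number of neurons (illustrative)
    connectivity = rng.random((n, n)) < 0.1   # overall pattern of connectivity
    weights = rng.normal(0.0, 0.1, (n, n)) * connectivity  # synaptic strengths

    def step(activity, weights):
        # propagate activity through the synapses (simple rate model)
        return np.tanh(weights @ activity)

    def adjust(weights, pre, post, lr=0.01):
        # Hebbian-style adjustment, purely illustrative: strengthen synapses
        # whose pre- and post-synaptic neurons are active together
        return weights + lr * np.outer(post, pre) * connectivity

    activity = rng.random(n)
    for _ in range(10):
        new_activity = step(activity, weights)
        weights = adjust(weights, activity, new_activity)
        activity = new_activity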

    Therefore, estimates of the equivalency of computers and brains based on
    synapse counting are fundamentally right, plus or minus a couple of orders
    of magnitude. Since such estimates do not account for the limitations of the
    architecture of brains, such as the need for self-organization, unfinished
    or obsolete evolutionary optimization, and the massive redundancy needed to
    cope with an imperfect computational substrate, the actual computational
    needs of an engineered, constructed intelligence implemented in a more
    stable substrate are, in my opinion, likely to be lower than estimated by,
    e.g., Moravec.
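
    For transparency, the back-of-the-envelope arithmetic behind such
    synapse-counting estimates looks roughly like the sketch below. The figures
    (on the order of 10^11 neurons, ~10^3 synapses per neuron, effective update
    rates of tens to hundreds of hertz) are the usual rough values, not
    measurements, and the result carries the couple-orders-of-magnitude error
    bars mentioned above.

    # Moravec-style estimate from synapse counting; all figures are rough,
    # commonly cited orders of magnitude, not measurements.
    neurons = 1e11               # ~10^11 neurons in the human brain
    synapses_per_neuron = 1e3    # ~10^3-10^4 synapses per neuron
    update_rate_hz = 1e2         # effective update rate, ~10-100 Hz

    synapses = neurons * synapses_per_neuron
    ops_per_second = synapses * update_rate_hz

    print(f"synapses:       {synapses:.0e}")         # ~1e14
    print(f"synaptic ops/s: {ops_per_second:.0e}")   # ~1e16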

    The emergence of superhuman AI is not inevitable, but still quite likely.

    Rafal


