Re: Doomsday vs Diaspora

From: Anders Sandberg (asa@nada.kth.se)
Date: Sat Apr 26 2003 - 03:04:32 MDT

  • Next message: Eliezer S. Yudkowsky: "Re: thunderbolt (was Re: Doomsday vs Diaspora)"

    On Fri, Apr 25, 2003 at 11:21:53PM -0400, Spudboy100@aol.com wrote:
    > Jef stated:
    > <<Regarding the question of whether there is a limit to complexity, I'm not
    > qualified to debate the current theories.  I am intrigued by the way systems
    > in general tend to use energy to increase local complexity, and what this
    > might mean to humans in the sense of the universe playing a non-zero sum
    > game. 
    > - Jef >>
    >
    > Neither of us is qualified; however, I believe that both of us, and everyone
    > else on this list, are bright enough to view the issue and run with it,
    > should they take an interest. I suspect it was Anders Sandberg who, on his
    > website, first promulgated in a large way the notion that increasing
    > complexity is anti-entropic in nature. Life takes energy and material,
    > utilizes it, and changes it. Coral polyps form huge colonies, and
    > eventually limestone mountains. Can humanity and later, transhumanity, take
    > the solar system and turn it into colonies and then a Dyson sphere, then a
    > Bradbury Matrioshka Brain, then a ...?

    I was hardly the first with that idea - it is inherent in the early
    extropian writings and the work of Ilya Prigogine. Open systems can use
    free energy to decrease their own entropy, and certain kinds of systems
    have an internal dynamics that also increases their complexity (which is
    still a somewhat vague term even after more than a decade of debate in
    the alife and complexity community).

    Is there an upper limit to complexity? I think the question can be
    answered if one defines complexity as stored and retrievable information
    - in that case the Bekenstein bound gives an upper limit proportional to
    MD for a spherical region of mass M and diameter D. If we define
    complexity in terms of minimal description length it is of course also
    limited by the Bekenstein bound. But complexity defined in terms of
    contingency or "information depth" is harder to pin down - the
    information depth of the famous letter dialogue of Victor Hugo (he
    wrote: "?", his publisher answered: "!") was far larger than the actual
    information content. Here the interpreting system plays an important
    role in uncompressing meaning, and if that meaning is not strict then
    there is likely some fuzziness in how much information can be stored or
    transmitted.
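    For concreteness, the bound can be evaluated numerically. A minimal
    sketch, assuming the Bekenstein bound expressed in bits as
    I <= pi*c*M*D / (hbar*ln 2), i.e. S <= 2*pi*k*R*E/(hbar*c) with
    E = M*c^2 and radius R = D/2; the function name and the example mass
    and diameter are illustrative only:

```python
import math

# Physical constants (SI units)
HBAR = 1.054_571_817e-34  # reduced Planck constant, J*s
C = 2.997_924_58e8        # speed of light, m/s

def bekenstein_bits(mass_kg: float, diameter_m: float) -> float:
    """Upper bound on stored and retrievable information, in bits,
    for a spherical region of mass M and diameter D:
        I <= pi * c * M * D / (hbar * ln 2)
    This is the Bekenstein entropy bound with E = M c^2, converted
    from nats to bits by dividing by ln 2."""
    return math.pi * C * mass_kg * diameter_m / (HBAR * math.log(2))

# Example: a 1 kg sphere, 1 m across
print(f"{bekenstein_bits(1.0, 1.0):.2e} bits")  # about 1.29e+43 bits
```

    Note the bound is linear in both mass and diameter, so "finite region,
    finite complexity" follows directly under this definition.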
     
    My preliminary answer would be that in a finite region there can only be
    a finite amount of complexity. But this could very well be an
    exponential function of the number of available states.

    -- 
    -----------------------------------------------------------------------
    Anders Sandberg                                      Towards Ascension!
    asa@nada.kth.se                            http://www.nada.kth.se/~asa/
    GCS/M/S/O d++ -p+ c++++ !l u+ e++ m++ s+/+ n--- h+/* f+ g+ w++ t+ r+ !y
    


    This archive was generated by hypermail 2.1.5 : Sat Apr 26 2003 - 03:09:39 MDT