RE: Parallel Universes

From: Rafal Smigrodzki (rms2g@virginia.edu)
Date: Wed Feb 12 2003 - 08:15:29 MST


    Eliezer wrote:
    > Rafal Smigrodzki wrote:
    >> Lee Corbin wrote:
    >>
    >>> (Speaking about "splitting of worlds" sometimes has the
    >>> drawback of giving rise to notions of increase in quantity.)
    >>
    >> ### But I thought that as the entropy of the universe increases, the
    >> amount of information needed to describe it also increases. The
    >> state of the universe at t = 10^-43 s was very simple, but with every
    >> second you need to use more bits to describe the evolving entity.
    >> Splitting adds to the amount of data contained in the whole ensemble
    >> of universes, doesn't it?
    >
    > No, it greatly decreases it, actually. Tegmark discusses this.

    ### I think I didn't express myself properly - I know that the algorithmic
    complexity of an ensemble of universes derived according to an algorithm is
    lower than the algorithmic complexity of a single universe drawn from that
    ensemble, since additional information has to be added to the original
    algorithm to pick that universe out from the others. This was beautifully
    explained by J. Schmidhuber at http://www.idsia.ch/~juergen/everything/html.html
    (previously quoted on the list a few months ago). If you have unlimited
    computing power and unlimited memory, computing everything is less work for
    the programmer than computing only a part of it.

    However, I meant something else - the total amount of memory needed to hold
    a detailed description (not the algorithm, but the states themselves) of a
    sheaf of universes derived by splitting from a single ancestor grows with
    the passage of time. In the beginning you need only very limited data to
    describe the initial condition, but later you have to store all the
    alternatives.
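
    To put rough numbers on that growth (treating each splitting as a clean
    binary branching, which is of course a simplification): the generating
    rule plus the initial condition stays a fixed size, while an explicit
    listing of the branches grows as 2^k after k splittings.

# Toy count, assuming clean binary branching and made-up sizes:
# the rule + initial condition stays fixed, while an explicit, uncompressed
# listing of all branch states grows exponentially with the number of splits.

initial_bits = 1_000        # hypothetical size of the initial condition
bits_per_branch = 1_000     # hypothetical size of one branch's description

for k in (0, 10, 40):
    branches = 2 ** k
    explicit = branches * bits_per_branch
    print(f"after {k:2d} splits: {branches:d} branches, "
          f"~{explicit:.2e} bits explicitly vs {initial_bits} bits for rule + start")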

    So, if you look at the problem as one of data compression, then the more
    universes there are, the more you can compress, but you also need to
    perform many more calculations to decompress the data and answer a
    specific question, such as how many planets there are in our galaxy. On
    the other hand, you could see the problem as an increase in storage, with
    minimal compression and minimal computing-power requirements for an answer
    (aside from the problem of addressing the memory space).
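
    A minimal sketch of that tradeoff (the branching rule below is a made-up
    toy, not any physics): the "compressed" view stores only the rule and the
    initial state and recomputes a branch on demand, while the "explicit" view
    stores every branch state up front and answers a query with a single
    lookup.

def evolve(state, choice):
    """Toy deterministic branching rule (purely illustrative)."""
    return state * 2 + choice

def answer_compressed(initial, branch_history):
    """Tiny storage (rule + initial state), but O(k) recomputation per query."""
    state = initial
    for choice in branch_history:
        state = evolve(state, choice)
    return state

def build_explicit_table(initial, k):
    """O(2**k) storage built up front; afterwards each query is one lookup."""
    table = {(): initial}
    for _ in range(k):
        table = {hist + (c,): evolve(s, c)
                 for hist, s in table.items() for c in (0, 1)}
    return table

k = 16
branch = (1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 0, 1, 0, 0, 1, 1)
table = build_explicit_table(0, k)
assert answer_compressed(0, branch) == table[branch]  # same answer, different costs
print(len(table), "branch states stored explicitly")  # 65536 entries for k = 16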

    Rafal


