Re: Minds, Machines, and the Multiverse

From: hal@finney.org
Date: Wed May 03 2000 - 11:44:49 MDT


> " [Fredkin] estimated the total amount of computation going on in the
> universe, producing, ironically, a figure that seems puzzlingly low. He
> calls this the problem of the missing workload. Essentially what he has
> done is calculate how large a cellular automation would need to be to
> simulate the entire universe in all its details. The answer, he argues,
> is that the CA that operated at the tiniest quantum scales known as the
> Plank length and Plank time would only need to be not much larger than
> a bigish star to faithfully simulate the entire macroscopic evolution
> of our universe from the Big Bang to the present in about 4 hours. The
> difference in space time volume between the universe and such a system
> is a factor of 10^63.

It's true, most of the universe is empty space. It doesn't do much
most of the time except propagate waves and particles from one side to
the other. That's a pretty low-level computation compared to what it
could be doing.
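
For a rough sense of that scale, here is a back-of-the-envelope check
of the quoted 10^63 figure. The radii, age, and run time plugged in
below are my own guesses (Fredkin's actual inputs aren't given in the
quote), and the answer is quite sensitive to them; all a sketch like
this shows is that the ratio of space-time volumes really does land in
that enormous neighborhood.

import math

PLANCK_LENGTH = 1.6e-35   # meters
PLANCK_TIME   = 5.4e-44   # seconds

UNIVERSE_RADIUS = 4.4e26  # meters (assumed comoving radius of the
                          # observable universe)
UNIVERSE_AGE    = 4.35e17 # seconds (roughly 13.8 billion years)

STAR_RADIUS = 1.0e9       # meters (a "biggish" star -- assumed)
RUN_TIME    = 4 * 3600.0  # seconds (the quoted 4 hours)

def spacetime_volume(radius_m, duration_s):
    # Space-time 4-volume in Planck units:
    # number of Planck-size cells times number of Planck-time steps.
    cells = (radius_m / PLANCK_LENGTH) ** 3
    steps = duration_s / PLANCK_TIME
    return cells * steps

ratio = (spacetime_volume(UNIVERSE_RADIUS, UNIVERSE_AGE)
         / spacetime_volume(STAR_RADIUS, RUN_TIME))
print("space-time volume ratio ~ 10^%d" % round(math.log10(ratio)))

With these particular numbers the ratio comes out closer to 10^66 than
10^63; a modestly larger star or a smaller universe radius closes the
gap. The point is only the order of magnitude of the disparity.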

I'll bet Fredkin didn't redo the estimate under the assumption that the
many-worlds theory is true and the universe is really a "multiverse".
He is calculating only what it takes to simulate our branch of the
multiverse. In that theory, the totality of the universe is enormously
larger.

Still, though, I don't think there is any a priori reason to expect the
universe to be tightly packed with computation. There isn't necessarily
anything here that has to be explained.

One of the approaches I like is to suppose that the universe is that
which minimizes the information content needed to describe it, subject to
the constraint that intelligence exist. It could well be that spreading
things out in space and time produces a simpler description than packing
them together.
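
As a toy illustration of that intuition (my own, not anything Fredkin
said), one can use zlib-compressed size as a crude stand-in for
description length: a mostly-empty configuration with a few pockets of
activity admits a far shorter description than one packed edge to edge
with activity. The sizes and densities below are arbitrary.

import random
import zlib

random.seed(0)
N = 1000000   # cells in a toy one-dimensional "universe"

# Tightly packed: every cell busy with effectively incompressible
# activity.
packed = bytes(random.getrandbits(8) for _ in range(N))

# Mostly empty: the same kind of activity confined to ten small pockets.
sparse = bytearray(N)
for start in random.sample(range(0, N - 1000, 1000), 10):
    for i in range(start, start + 1000):
        sparse[i] = random.getrandbits(8)

print("packed description:", len(zlib.compress(packed)), "bytes")
print("sparse description:", len(zlib.compress(bytes(sparse))), "bytes")

Compressed size is only a rough proxy for information content, but it
makes the direction of the effect visible: the empty space costs almost
nothing to describe.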

Hal


