Re: MATH/COMP/PHIL: "Omega Man"

From: scerir (scerir@libero.it)
Date: Tue Apr 03 2001 - 15:05:55 MDT


> Eugene Leitl wrote:
> What would be interesting to know: 1) does this have practical
> relevance to physics?

There are a couple of connections, but I'm not sure whether they are practical.

Measurements convert statistical uncertainty (statistical entropy)
into randomness (the algorithmic information content of the outcome).
Measurements decrease our ignorance about the state of the system,
but they also increase the randomness (the encoding inefficiency)
of the record that stores the acquired information.

        S = H + K        (at equilibrium, i.e. reversible computation)

where S is the physical entropy, H is the Shannon entropy, and
K measures the randomness (the algorithmic information content)
of the recorded outcome.
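
Here is a toy numerical sketch of that bookkeeping (my own example,
not Zurek's: the distribution is made up, and the zlib-compressed
length of the record is only a crude, computable stand-in for K,
since the true algorithmic information content is uncomputable):

import math
import random
import zlib

def shannon_entropy_bits(probs):
    # Shannon entropy H = -sum(p * log2(p)), in bits per outcome.
    return -sum(p * math.log2(p) for p in probs if p > 0)

def compressed_bits(record):
    # Crude upper bound on the algorithmic content K of a record:
    # its zlib-compressed size, in bits.
    return 8 * len(zlib.compress(record, 9))

random.seed(0)
n = 100_000

# Before measuring: a uniform 4-outcome system, so H = 2 bits of
# ignorance per outcome.
print(shannon_entropy_bits([0.25] * 4))    # -> 2.0

# After measuring n outcomes the ignorance H is gone, but the record
# carries roughly those 2 bits per outcome as incompressible
# randomness K (plus compressor overhead): the statistical
# uncertainty has been converted into randomness, not destroyed.
record = bytes(random.choice(b'ABCD') for _ in range(n))
print(compressed_bits(record) / n)         # -> a little over 2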

When measurements are performed far from equilibrium, H (the Shannon
entropy, our ignorance) decreases rapidly, while K (the randomness,
or algorithmic information content, of the record) increases only
slowly, so the net physical entropy S = H + K goes down.

Fortunately (as Wojciech H. Zurek said) we live in a far-from-equilibrium
universe (or multiverse, etc.). So "it pays to measure".
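
The same toy proxy also illustrates why (same caveat as above: zlib
length is only a rough, computable substitute for K):

import math
import zlib

# An ignorant observer models the system as uniform over 4 outcomes,
# so each measurement removes H = log2(4) = 2 bits of statistical
# ignorance.
print(math.log2(4))    # -> 2.0 bits per outcome

# Far from equilibrium the actual trajectory is highly ordered, so
# the measurement record is very compressible: K grows much more
# slowly than H falls, and the net S = H + K decreases.
record = b'A' * 25_000 + b'B' * 25_000 + b'C' * 25_000 + b'D' * 25_000
print(8 * len(zlib.compress(record, 9)) / len(record))    # -> ~0.01 bits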

R.P. Feynman [Simulating Physics with Computers, Int. J. Theor.
Phys. 21 (1982), p. 467] considered the possibility that "there is
to be an exact simulation, that the computer will do exactly the same
as nature", and that "everything that happens in a finite volume of
space and time would have to be exactly analyzable with a finite
number of logical operations". This brings us directly to the halting
problem, computability, the analog/digital question, quantum
computing, etc.
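
On the halting-problem side, the classical diagonal argument fits in
a few lines of (necessarily hypothetical) code; this is just the
standard textbook construction, not anything from Feynman's paper:

def halts(f, x):
    # Hypothetical total decider: True iff f(x) would eventually
    # halt. The construction below shows it cannot exist.
    raise NotImplementedError("uncomputable")

def diagonal(f):
    # Do the opposite of what the decider predicts for f applied
    # to itself: loop forever if f(f) would halt, halt otherwise.
    if halts(f, f):
        while True:
            pass

# diagonal(diagonal) halts if and only if it does not halt, a
# contradiction; so no program decides halting in general, and no
# always-terminating procedure can exactly analyze the behavior of
# an arbitrary finite computation.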


