> Dossy Shiobara:
> Even in a system of low entropy, there's still an immeasurably
> large amount of disorder even if we can't measure or observe it
In general, measurements performed on a system can also be
regarded as a way of turning (statistical) uncertainty into
(algorithmic) randomness. And this is interesting.
For "equilibrium" systems,
S = H + K = nearly constant
[sometimes this is called Zurek's "triangle"]
where S is the physical entropy,
H is the statistical entropy (Shannon entropy,
in presence of some partial information or data),
K is the algorithmic information content (of data).
When measurements are carried out on "equilibrium"
systems, the randomness in the data increases at
a rate given by the decrease of ignorance.
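To make that concrete, here is a toy sketch of my own (not Zurek's
construction): the system is modelled as 8000 fair coin flips, and the
zlib-compressed length of the measurement record is used as a crude
stand-in for its algorithmic content K. As more bits are measured, H
falls and K rises by about the same amount, so S = H + K stays roughly
flat:

import random
import zlib

# Toy model of an "equilibrium" system: 8000 maximally random binary
# degrees of freedom, packed 8 to a byte.  Measuring a bit removes one
# bit of ignorance H, but the record of outcomes is incompressible, so
# its algorithmic content K (crudely upper-bounded here by the
# zlib-compressed length) grows at essentially the same rate.
random.seed(0)
N_BYTES = 1000
system = bytes(random.getrandbits(8) for _ in range(N_BYTES))

for measured in (0, 250, 500, 750, 1000):
    record = system[:measured]
    H = 8 * (N_BYTES - measured)            # remaining ignorance, in bits
    K = 8 * len(zlib.compress(record, 9))   # compressed record length, in bits
    print(f"measured bits={8 * measured:5d}  H={H:5d}  K~{K:5d}  S=H+K~{H + K:5d}")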
But for "far from equilibrium" systems,
S = H + K = not constant
where S is the physical entropy,
H is the statistical (Shannon) entropy,
K is the algorithmic information content.
When measurements are carried out on systems
which are "far from equilibrium", the increase of
randomness (K) is _much_ smaller than the decrease
of ignorance (H), so S drops. And that drop is what
allows us to extract useful work.
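Again a sketch of my own, under the same crude zlib-for-K stand-in: now
the microstate is highly ordered (all "spins down"). Under a uniform
prior we still charge one bit of ignorance per unmeasured degree of
freedom, but the record is highly compressible, so K grows far more
slowly than H falls, and S = H + K drops as we measure:

import zlib

# Toy model of a "far from equilibrium" system: the same 8000 binary
# degrees of freedom, but in a perfectly ordered microstate.  The
# measurement record compresses to almost nothing, so S = H + K falls
# by nearly one bit for every bit measured.
N_BYTES = 1000
system = bytes(N_BYTES)                     # all zeros: an ordered microstate

for measured in (0, 250, 500, 750, 1000):
    record = system[:measured]
    H = 8 * (N_BYTES - measured)            # ignorance under a uniform prior, in bits
    K = 8 * len(zlib.compress(record, 9))   # compressed record length, in bits
    print(f"measured bits={8 * measured:5d}  H={H:5d}  K~{K:5d}  S=H+K~{H + K:5d}")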
ΔW = kT (ΔH + ΔK) = kT ΔS
The universe is precisely such a "far from
equilibrium" system. Fortunately.
- W. H. Zurek, Nature 341, 119-124 (1989).
- W. H. Zurek, in "Complexity, Entropy and the Physics of
  Information", ed. W. H. Zurek, Perseus Books, 1991.