By information we usually mean a sequence of bits, i.e. a string of 1s and
0s. A bit is an abstract thing and must be encoded somewhere, e.g. on a
magnetic tape. Information does not provide meaning until you interpret it.
(It is possible for the same piece of information to admit multiple
interpretations, and thus to mean different things to different observers.)
So maybe one should regard the interpretation as more fundamental than the
information. I like that because it gives you a more process-oriented view
(it's not as platonic). But then, given the right interpretation, an
interpretation itself can be interpreted as information. And we're kind of
back where we started...
Gerhard:
>'Information' is an extremely esoteric concept, particularly when taken
>from an information theoretic point of view. I'm fairly certain that there are
>deep connections between information and existence - take, for example,
>the information theoretic definition of entropy and the definition of entropy
>in statistical physics.
Yes, I recall a formula like H(X) = -SUM_i ( p_i log p_i ) as a measure of
both information and entropy. What I meant by simple-minded was that it's a
very quantitative measure. Meaning, on the other hand, is very qualitative.
I have a 2G disk drive full of apps, but that figure doesn't tell me what I
can do with those apps.
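(As an aside, and not part of the original exchange: that H(X) formula is easy
to play with. Here's a minimal Python sketch of Shannon entropy in bits, where
shannon_entropy is just a name I made up for illustration:)

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H(X) = -sum(p_i * log2(p_i)), in bits.

    Terms with p_i = 0 are skipped, since p log p -> 0 as p -> 0.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally unpredictable: exactly one bit per toss.
print(shannon_entropy([0.5, 0.5]))  # 1.0

# A biased coin is more predictable, so it carries less information.
print(shannon_entropy([0.9, 0.1]))
```

(Which nicely illustrates the point: the number tells you *how much*
information there is, but nothing about what any of it means.)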
For that I need something better, perhaps the application version number.
Isn't there an omega-point theorem that says: as the amount of information
and processing power goes to infinity, so does the version number of Excel?
FELIX'98 - CITIUS . ALTIUS . FORTIUS