Re: information/ entropy
Lief M. Hendrickson (hendrick@NOSC.MIL)
Thu, 16 Feb 1995 17:21:17 PST
In response to Mike Lieber's post on Feb. 15 discussing entropy:
The topic of entropy came about due to use of the term
"anti-entropic" in a post by Thomas Rimkus. I used the term,
"pro-entropic" to describe an opposite condition. So were we
both wrong? It might help if Thomas Rimkus would tell us
what he meant by "anti-entropic". Or maybe there is some
confusion over interpretation.
Information theorists borrowed the term "entropy" from its
primary meaning in basic physics. I did use the term
"information", so I guess I'm partly to blame for the confusion.
What I meant was the term in its primary sense, which here
relates to order vs. chaos.
Entropy is a well known term in the physical sciences, and its
description can be found in any book covering thermodynamics.
A section on thermodynamics is included in most general physics
and chemistry textbooks. There are books devoted solely to
entropy. As I stated in my previous message, it is a measure of
the amount of disorder. In a closed system, entropy always
increases. All actual processes are irreversible (i.e. until we
find a perpetual motion machine!) and therefore occur with a
decrease in the amount of energy available for doing something.
Entropy changes in the opposite direction from available energy.
A decrease in available energy coincides with an increase in
entropy. Any physicist will tell you the entropy of the universe
is always increasing (assuming no outside intervention!).
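To put that relationship in the usual textbook shorthand (a rough
sketch on my part, taking the Helmholtz free energy as one
conventional measure of "available energy"):

```latex
% Clausius inequality: for any real process the entropy change satisfies
dS \ge \frac{\delta Q}{T}
% For a thermally isolated ("closed") system no heat enters or leaves,
% so \delta Q = 0 and the entropy can only grow:
dS \ge 0
% One conventional measure of available energy is the Helmholtz free energy
F = U - TS
% At a given internal energy U and temperature T, a rise in entropy S
% means a fall in the energy F available for doing work.
```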
Mike referred to the works of Shannon and Wiener in the 1950's.
Their works on the statistical theory of communication were
within the broader field of information theory which was earlier
established as a "discipline". The First International Symposium
on Information Theory was held in London in 1950. The meeting of
some 120 people from 8 countries included mathematicians,
physicists, engineers, linguists, physiologists, geneticists, and
others. These studies have far-reaching applications. A
linguist, for example, might want to measure how much information
is contained in different speech sounds. Besides the many
applications in electronic communication, the methodologies also
provide useful concepts that could be applied to the collection
and processing of information from the field.
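To make the linguist's example concrete, here is a rough sketch of
the kind of calculation involved (the sound frequencies are made up
purely for illustration):

```python
import math

# Hypothetical relative frequencies of four speech sounds in some corpus
# (invented numbers, only to show the arithmetic).
phoneme_freqs = {"a": 0.40, "t": 0.30, "k": 0.20, "sh": 0.10}

# Information carried by each sound, in bits: -log2(p).
# Rarer sounds are more "surprising" and carry more information.
for sound, p in phoneme_freqs.items():
    print(f"{sound}: {-math.log2(p):.2f} bits")

# Average information per sound, i.e. the Shannon entropy of the distribution.
entropy = -sum(p * math.log2(p) for p in phoneme_freqs.values())
print(f"average: {entropy:.2f} bits per sound")
```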
In information theory, in a certain context, the measure of the
amount of information is mathematically similar to the statistical
thermodynamic definition of entropy; hence the appropriation of
the term. However, mathematical similarity does not mean
equivalence. To complicate matters, there is a difference between
information and information content, and such things as the
meaning and relevance of information are not treated as they are
in ordinary speech (in fact, they are generally not covered at
all).
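The similarity is easy to see when the two definitions are written
side by side in their usual textbook forms:

```latex
% Shannon's measure of average information (bits per symbol) for a source
% whose symbols occur with probabilities p_i:
H = -\sum_i p_i \log_2 p_i
% Gibbs' statistical-mechanical entropy for a system whose microstates
% occur with probabilities p_i (k_B is Boltzmann's constant):
S = -k_B \sum_i p_i \ln p_i
% The forms agree up to a constant factor and the base of the logarithm,
% but the p_i refer to entirely different things, so the resemblance is
% formal rather than an equivalence.
```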
Colloquial usage of "entropy" is thoroughly rooted in the
physicist's definition as the amount of disorder. Applied that
way, something that is not useful has high entropy, including with
respect to the relevance of any particular information it may
contain; redundant information is then just noise. I recommend we
stick to that meaning unless we are working within the context of
information theory. Even within information theory there are
perils in using the term, because the coincidence of its meaning
with common usage holds only in a special sense.
Problems in commingling usage of the term "entropy" were noted
by John R. Pierce in his book, An Introduction to Information
Theory (Dover Publications, page 80). In it he states: "Once we
understand entropy as it is used in communication theory
thoroughly, there is no harm in trying to relate it to the
entropy of physics, but the literature indicates that some
workers have never recovered from the confusion engendered by an
earlier admixture of ideas concerning the entropies of physics
and communication theory."
Hopefully, those who have stuck it out thus far in reading this
message will recover from the confusion over the unfortunate dual
usage of "entropy". It did, however, bring up application of
information theory in many types of research. I'd be very
interested in how Mike has applied the concepts as well as in
hearing from others who have an application or potential
applications- perhaps it would be better as off-line messages due
to the specialized nature of the topic.