information
Mike Lieber (U28550@UICVM.BITNET)
Wed, 15 Feb 1995 12:27:09 CST
This is not to be taken as a put-down of Mr. Hendrickson, but FYI the
formula for information measurement is the entropy formula with a minus
sign in front of it. This comes from the work of Shannon and Weaver in the
late 1940s. This is why system theorists refer to both order and
information as negentropy (the negative of entropy). Information is defined
in terms of the probability of occurrence of a thing or event. The less
probable the event, the more information is conveyed by its occurrence. The
more probable an event, the less information an observer gets from its occurrence.
In the context of English text, for example, seeing a "q" leads us to
expect that a "u" will follow. The occurrence of u after q, then, gives us
practically no information. In a well-shuffled deck of cards, the probability
of picking the ace of spades on the first draw is 1 in 52. If the
first card is that ace, then its occurrence gives the observer a lot of
information. If it is not, and there have been 19 more draws without that ace,
then when the ace appears on the next draw its probability is 1 in 32 (20
cards are already gone), a more probable event and therefore a lot less
information. Audio engineers use information theory to manipulate the
signal-to-noise ratio, trying to make random noise like static in a circuit
as improbable as money and equipment permit. There are some very interesting
implications of information theory/systemic order for anthropology and
history, and I've taken advantage of these in my own research.
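For anyone who wants to put numbers on these examples, here is a short
Python sketch (my own illustration, not anything taken from Shannon and
Weaver) of the self-information measure I(x) = -log2 p(x); the 0.99
probability for "u" following "q" is just an assumed figure for English text.

import math

def self_information(p):
    # Bits of information carried by an event of probability p.
    return -math.log2(p)

# "u" after "q": nearly certain, so nearly zero bits of information.
print(self_information(0.99))    # about 0.01 bits

# Ace of spades on the first draw from a 52-card deck.
print(self_information(1 / 52))  # about 5.7 bits

# Ace of spades on the next draw after 20 draws without it (32 cards left).
print(self_information(1 / 32))  # 5.0 bits, i.e. less information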
Mike Lieber