
entropy and chaos
Mike Lieber (U28550@UICVM.BITNET)
Fri, 28 Oct 1994 16:50:31 CDT
The theories are not the same. No matter. We've had 2 rounds on chaos on
the Net, but this is the first time someone has had the guts to bring it up
admitting to knowing nothing about either. That means we can start from
scratch. Entropy and chaos are both about order and disorder. Entropy is
a measure of disorder, computed as H = -(sum over i) p_i log2 p_i, where
p_i is the probability of occurrence of each event in a set of possible
events. (Information is measured by the same formula with the sign
reversed; thus information = order = negentropy). Chaos is, as
I understand it, in the most general terms about underlying order in at
least some phases of seemingly disordered phenomena. The math for chaos is
quite different from that for entropy. That's it for me. Now someone else
come and clean up the mess I started with chaos and make it real clear.
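A quick sketch of both ideas, in Python (my own illustration, not anything
from the thread). The first half computes Shannon entropy in bits for a
probability distribution; the second uses the logistic map, a standard toy
example of deterministic chaos, to show how a fully ordered rule can
produce seemingly disordered output.

```python
import math

def entropy_bits(probs):
    """Shannon entropy H = -sum(p * log2 p), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally disordered for two outcomes: 1 bit.
print(entropy_bits([0.5, 0.5]))   # 1.0
# A loaded coin is more ordered (more predictable): less than 1 bit.
print(entropy_bits([0.9, 0.1]))

# The logistic map x -> r*x*(1-x) is completely deterministic, yet at
# r = 4 two nearly identical starting points diverge rapidly --
# underlying order beneath apparently random behavior.
def logistic(x, r=4.0, steps=30):
    for _ in range(steps):
        x = r * x * (1 - x)
    return x

print(logistic(0.300000))
print(logistic(0.300001))  # same rule, tiny nudge, very different result
```

Note the contrast the post is drawing: entropy is a single statistical
number summarizing a distribution, while chaos is about the trajectory of
a deterministic rule, so the two deal in quite different mathematics.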
Mike Lieber
