Re: information/ entropy
Thomas W. Rimkus (trimkus@COMP.UARK.EDU)
Fri, 17 Feb 1995 00:47:07 -0600
On Thu, 16 Feb 1995, Lief M. Hendrickson wrote:
> In response to Mike Lieber's post on Feb. 15 discussing entropy:
>
> The topic of entropy came about due to use of the term
> "anti-entropic" in a post by Thomas Rimkus. I used the term,
> "pro-entropic" to describe an opposite condition. So were we
> both wrong? It might help if Thomas Rimkus would tell us
> what he meant by "anti-entropic". Or maybe there is some
> confusion over interpretation.
> Colloquial usage of "entropy" is thoroughly rooted in the
> physicist's definition as the amount of disorder. As thus
> applied, something that is not useful then has high entropy,
> including reference to the relevance of particular information it
> may contain. Redundant information is then just noise. I
> recommend we stick to that meaning unless we are within the
> context of applying information theory.
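[For readers unfamiliar with the information-theory sense being contrasted above: Shannon defined entropy as H = -sum(p * log2 p) over symbol probabilities, so a redundant, predictable message has *low* entropy per symbol while a varied one has high entropy. A minimal sketch (the function name and example strings are illustrative, not from the original post):]

```python
import math
from collections import Counter

def shannon_entropy(message: str) -> float:
    """Shannon entropy in bits per symbol: H = -sum(p * log2(p))."""
    counts = Counter(message)
    n = len(message)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A fully redundant string is completely predictable...
print(shannon_entropy("aaaaaaaa"))  # 0.0 bits per symbol
# ...while eight equally likely symbols need log2(8) = 3 bits each.
print(shannon_entropy("abcdefgh"))  # 3.0 bits per symbol
```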
I, like Mike Lieber, also see no need for disagreement.
I was, in fact, using the colloquial sense, and by "anti-entropic" I was
referring to the credible and rapid focus that a tool like the listservers
on the net can bring. Traditional media material gets filtered and
merged to the point that it is unreliable and therefore, to me, contains
little "information". Professional journals are held to a higher
standard by peer review and are thus more reliable than the common media,
but their information is slow to appear. Net lists can and should be held to
those same standards by immediate review. That is not to say that a
little gets on your shoes occasionally, but if the system is bounded and
stabilized by an active membership, it will remain healthy and valid. I
have this little feeling, however, that if the ugly things about the
powers that be were to fall on the wrong ears, certain funding which
helps run this technological wonder would dry up faster than the "oil" on an
original Mapplethorpe.