Information theory, entropy, and evolution

Steve Mizrach (SEEKER1@NERVM.NERDC.UFL.EDU)
Mon, 20 Feb 1995 02:00:13 +0000

How gloriously synchronistic that the IT/entropy and
dimorphism/evolution/intelligence threads all seem to be converging! (At
least in my mind they are.)

I recently wrote a little piece for Crash Collusion (not a peer-reviewed
journal, mind you, but this little 'zine was rated #3 on FactSheet Five's
top ten list... ) dealing with these very matters, entitled "Flesh Made
Word," pondering the curious entanglements between language, information,
evolution, entropy, and order.

It basically derived from memes that invaded my head with some fury after
reading Jeremy Campbell's _Grammatical Man_: an excellent book that I
recommend to everyone.

Campbell basically looks at the primary principle behind meaning in
language and communication systems (i.e., that which makes one meaning more
probable than others) and suggests that it is redundancy. Some letters are
more common than others ("e," for example), a fact cryptographers have
exploited since time immemorial. The same principle guards against errors
in telecommunications - the parity bit your modem adds to check the other 7
bits, for example, or the CRC (cyclic redundancy check) computed over whole
blocks of data.
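Redundancy in this information-theoretic sense is just the gap between a
message's actual entropy and the maximum entropy it could have if every
symbol were equally likely. Here is a minimal sketch of that idea (the
sample string and function names are mine, not Campbell's):

```python
from math import log2
from collections import Counter

def entropy(text):
    """Shannon entropy, in bits per symbol, of a sample of text."""
    counts = Counter(text)
    total = sum(counts.values())
    return -sum((c / total) * log2(c / total) for c in counts.values())

sample = "redundancy makes some messages more probable than others"
h = entropy(sample)
# Maximum entropy: what we'd get if every symbol in the sample's
# alphabet were equally probable.
h_max = log2(len(set(sample)))
# Redundancy is the fraction of channel capacity "wasted" on
# predictability -- the very slack that lets errors be caught.
redundancy = 1 - h / h_max
```

Because letters like "e" (and spaces) dominate, the measured entropy comes
out well below the maximum, and the redundancy is strictly positive.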

Anyway, not to go into a lot of detail here, but Campbell suggests the
rather interesting hypothesis that the main problem with the neo-Darwinian
evolutionary synthesis is that it considers all possible "sentences" formed
from the DNA "alphabet" to be equally probable.

(Hmmm. It seems to me that earlier on the list, we were told to choose
either the Way of Darwin, or the Way of Creationism. I guess this puts me
in league with the creationists. Onward, then.)

Anyway, Campbell suggests that there may be a certain redundancy built into
DNA (evidence for which comes out of Barbara McClintock's "jumping genes"
work and the Human Genome Project) - a "grammar," if you will, that
constrains DNA 'messages' and guards against errors. All possible mutations
(reshufflings of genetic information) are therefore not equally probable.
Some sentences are more likely than others.
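One concrete way some "sentences" end up likelier than others: point
mutations in real genomes are biased - transitions (A<->G, C<->T) occur
more often than transversions. A toy sketch of such a biased mutation
step (the 0.7 weight is illustrative, not a measured rate):

```python
import random

# Transition partners: purine<->purine, pyrimidine<->pyrimidine.
TRANSITION = {"A": "G", "G": "A", "C": "T", "T": "C"}
BASES = "ACGT"

def mutate(base, p_transition=0.7, rng=random):
    """Return a mutated base, biased toward transitions.

    A uniform model would give each of the three alternatives
    probability 1/3; here the transition partner is favored.
    """
    if rng.random() < p_transition:
        return TRANSITION[base]
    # Transversion: pick one of the two remaining bases at random.
    choices = [b for b in BASES if b != base and b != TRANSITION[base]]
    return rng.choice(choices)
```

Run it many times on "A" and you get far more "G" than "C" or "T" - a
crude stand-in for the non-uniform "grammar" Campbell has in mind.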

Evolution, assumed to be ateleological, may in fact be negentropic after
all. It was formerly assumed that mutations were purely random, and thus
natural selection was merely the bullet killing off every monkey that
failed to produce a perfect Shakespeare from its typewriter. It turns out
the monkeys working the DNA typewriters may know more than they're letting
on.

Many computer scientists in the A-life field are parting with the
neo-Darwinian synthesis. They are daring to suggest the heretical
possibility that mutation may be *algorithmic*, and not just purely random.
(The next blows to the neo-Darwinian synthesis will not be from Lamarckians
or Creationists; they will be from computer scientists.)

The significance of this possibility for the emergence of human
intelligence in our particular hominid line I leave to the reader.

Yours,




Seeker1 [@Nervm.Nerdc.Ufl.Edu] (real info available on request)
CyberAnthropologist, TechnoCulturalist, AnthroFuturist, Topothesian
Home Page URL: http://www.clas.ufl.edu/anthro/Seeker1_s_CyberAnthro_Page.html
"One measures a circle, beginning anywhere." -- Charles Fort