I expect that Ratzsch correctly describes the arguments of *some*
creationists, but not others. It is possible that he is right in that
some critics of creationism have actually responded to the latter when
they thought they were responding to the former. I doubt this
anti-evolutionary argument from the 2nd law pertaining to the whole
universe is as universal among creationists as Ratzsch has led
you to believe.
>Are the evolutionists that respond generally missing the point
>of the creationists?
Maybe, sometimes. However, even if they are missing the point,
that point (of the 2nd law forbidding evolution based on an application
of that law to the whole universe) is, I think, *still* incorrect.
>In his book "Information theory and molecular biology" Hubert
>Yockey responds to the challenge given by the creationists.
>But he deals with the problem on the molecular level arguing
>that the entropy in question is the Shannon entropy from
>information theory and not the Maxwell-Gibbs-Boltzmann entropy
>from thermodynamics. He argues that it is important to differentiate
>between order and complexity and that it is complexity that is the
>important parameter (among those two) in genetic evolution.
>He also argues that the two entropies have nothing to do with
>each other, and therefore that "thermodynamics has nothing to do
>with Darwin's theory of evolution" (page 313).
I think Yockey is correct in his conclusion here if not necessarily in
his premises.
First of all, the supposed Shannon entropy -SUM_i{p_i*log(p_i)} is not
original with Shannon. It was first worked out by Gibbs in
statistical mechanics, where he extended the initial work of
Boltzmann. What Shannon did was use this measure in the area of
communication theory rather than in the area of statistical mechanics
and thermodynamics where it had previously been confined. Since that
time the formula for the Gibbs-Shannon entropy has been widely
applied to many disparate fields, essentially wherever (usually
discrete) probability measures are to be found. It is firmly entrenched
in Bayesian probability theory as a preferred intrinsic measure of the
uncertainty associated with a given probability measure on a given
probability space. What this entropy function actually measures for some
given probability distribution is the average minimal amount of further
information required to determine, with certainty, which outcome occurs
from a process described by that probability distribution, given that the
only prior information known about the process is in the specification of
the probabilities of that distribution itself. This concept is of
great generality and has applications in many fields, among which is
the thermodynamic entropy of physics and the entropy of communication
theory.
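As a small concrete illustration (my own sketch, not taken from any of the works under discussion), the entropy of a discrete distribution can be computed directly from its probabilities:

```python
import math

def shannon_entropy(probs):
    """-SUM_i{p_i*log2(p_i)} in bits; outcomes with p_i = 0 contribute nothing."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin leaves one full bit of uncertainty about the outcome;
# a biased coin leaves less; a certain outcome leaves none.
print(shannon_entropy([0.5, 0.5]))  # 1.0 bit
print(shannon_entropy([0.9, 0.1]))  # ~0.469 bits
print(shannon_entropy([1.0]))       # 0.0 bits
```

The value is exactly the mean minimal number of further bits needed to pin down the outcome, given only the distribution itself.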
So, contrary to what Yockey implies, it is not the case that the Shannon
entropy has *nothing* to do with thermodynamic entropy, because in fact,
the thermodynamic entropy is a *special case* of the general Shannon
(actually, the Gibbs-Shannon) entropy concept which happens to
specifically apply when the probability distribution in question is the
distribution of microscopic states accessible to a macroscopically
described thermodynamic system defined solely by its macroscopic
properties. In Shannon's communication theory the relevant probability
distribution is the set of probabilities for each of the possible
messages to be transmitted down the communication channel to the
receiver. In other contexts the relevant probability distribution is
something else. For instance, in Yockey's case the relevant probability
distribution is for the probabilities of the various amino acid sequences
in the specification of a protein or the probabilities of the various
nucleotide sequences in a strand of nucleic acid. These probability
distributions have precious little to do with the thermodynamic entropy
for a living organism that contains such macromolecules.
All these different applications are just different special cases of
the use of the general entropy concept where each one is applied to the
appropriate probability distribution at hand. Essentially, each discrete
probability distribution, *no matter where it comes from*, has its own
unique (up to the chosen scale of units) Shannon entropy value.
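To make that point concrete (a toy sketch of my own; the uniform frequencies below are illustrative assumptions, not real data), the very same formula applies unchanged to a distribution of channel messages and to a distribution of amino acids:

```python
import math

def entropy_bits(probs):
    # The same -SUM_i{p_i*log2(p_i)} formula, whatever the distribution describes.
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Communication theory: four equally likely messages on a channel.
messages = [0.25] * 4
# Sequence statistics: 20 amino acids, here *assumed* equiprobable purely
# for illustration (real residue frequencies are not uniform).
residues = [1 / 20] * 20

print(entropy_bits(messages))  # 2.0 bits per message
print(entropy_bits(residues))  # log2(20), about 4.32 bits per residue
```

Only the interpretation of the probabilities differs from one field to the next; the entropy functional itself is identical.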
That said, this does not mean that the entropy that Yockey
considers is constrained by naive applications of the 2nd law of
thermodynamics, since that law only applies to thermodynamic entropy and
not to other kinds of entropy defined for other contexts. So Yockey is
correct that the questions of the viability or otherwise of Darwinian
evolution are decided by considerations other than by the typical (for
many creationists) applications of 2nd law arguments to evolution.
Yockey is also correct that entropy (defined a la Gibbs & Shannon) should
not be confused with complexity, which is a very different concept and
property. The main similarity between these two concepts is that both
entropy and complexity are denominated in terms of amounts of
information. In the case of complexity, it is defined for each given
realization of some system and represents the least amount of information
required to exactly specify the full reconstruction or production of that
system. In the case of entropy, it is not defined on the individual
exact realizations of a system, but is defined as a statistic or
functional of a probability distribution which is defined on some
probability space (even though those probabilities themselves may refer
to the individual realizations of the system). In the case of entropy,
the mean minimal information that it describes is just the amount of
information needed to uniquely distinguish one outcome (of the
distribution) from another, in the sense of just the amount of information
required to uniquely *label* or enumerate the various outcomes, *not* the
full information required to *reconstruct* each realization or outcome--
as is the case with the complexity.
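One crude way to see the distinction (again my own sketch; compressed size is only a rough stand-in for descriptive complexity, which cannot be computed exactly) is that the entropy of the empirical symbol distribution ignores arrangement, while the information needed to reconstruct a particular realization does not:

```python
import math
import random
import zlib
from collections import Counter

def empirical_entropy(s):
    # Entropy of the symbol frequencies alone; any rearrangement of s
    # yields exactly the same value.
    n = len(s)
    return -sum(c / n * math.log2(c / n) for c in Counter(s).values())

def rough_complexity(s):
    # Compressed size in bytes: a crude proxy for the least information
    # needed to reconstruct this particular string.
    return len(zlib.compress(s.encode()))

periodic = "AB" * 500                 # highly ordered realization
chars = list(periodic)
random.seed(0)
random.shuffle(chars)
scrambled = "".join(chars)            # same letter counts, irregular arrangement

print(empirical_entropy(periodic) == empirical_entropy(scrambled))  # True: same distribution
print(rough_complexity(periodic) < rough_complexity(scrambled))     # True: periodic string is far easier to specify
```

Both strings have the same entropy of symbol frequencies (1 bit per character), but the scrambled realization takes much more information to reconstruct exactly, which is the complexity-like quantity.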
>He even charges Prigogine with missing the point. On page 281
>Yockey writes
>"Shannon entropy does not enter in the formalism of Prigogine et al.,
>and thus they attempt to force non-equilibrium thermodynamics to
>play the role that should be assigned to information theory and coding
>theory in describing complexity and order. This again results in a
>confusion between order and complexity."
>
>Is Yockey right here ?
Not quite, but it is possible that Yockey is just being a little sloppy
in his wording. Since the non-equilibrium thermodynamic entropy that
Prigogine uses *is* the usual Gibbsian entropy (albeit for a
non-equilibrium rather than an equilibrium distribution of the
microscopic states of the macroscopic system), it is also an example of
the -SUM_i{p_i*log(p_i)} Shannon-esque entropy formula, and can be
thought of as a special example of the Shannon entropy. But, I believe,
the "information theory and coding theory in describing complexity and
order" to which Yockey is referring, involves the entropy and the
complexity associated with specifying various monomer sequences in
biologically relevant macromolecules, and this application is of *very*
little overlap with the non-equilibrium thermodynamic entropy that is
produced in the dissipative structures with which Prigogine is concerned.
>Are his claims generally accepted among
>other scientists ?
I don't know for sure because I haven't checked. But I doubt it (as
they are stated in the above quote).
>Is the whole discussion just a pseudo problem,
>because one has mixed order and complexity and mixed the
>entropy of information theory and the entropy of thermodynamics ?
I doubt this as well. Both Yockey and Prigogine are too smart in
their fields to inadvertently conflate these concepts. My guess (since
I haven't studied the relevant work of either of these people) is that
Prigogine is inclined to make some overstated claims for the relevance
of his dissipative structures to the process of biological evolution,
and Yockey is, properly, calling him on it--but Yockey does so in a
manner which is sloppy enough to be misunderstood. I suspect that
what Yockey most objects to in Prigogine's claims is that they probably
ignore or skip over some real problems associated with the very low
probabilities that Yockey calculates for some key macromolecules using
his specific application of information theory.
>Prigogine is Prigogine, and I have some problems understanding that
>an expert in thermodynamics like him could fail here. On the other hand
>Yockey is also an expert on the topic...
>Answers or comments are welcome !
You have mine above.
Regarding David C's comment:
>
> ...I believe that informational
>entropy is linked to thermodynamics by a minimum amount of energy being
>necessary to store information.
That's one link. It takes at least k*T*ln(2) of energy per
operation to store, retrieve, and/or manipulate a bit of information in
a classically operating computer so that the background thermal noise in
the computer doesn't accidentally erase or change the value of the bit
during the operation of the computer. In the case of the nucleic
acids it means that for them to work as an information memory the binding
energy in the bonds holding the bases in place must be significantly
greater than k*T so that thermal fluctuations do not inadvertently
dissociate the bases or substitute other bases into the strand. Needless
to say, this condition is indeed satisfied, since the mutation rate due to
thermal fluctuations is very low.
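For a sense of scale (my own back-of-the-envelope numbers; T = 300 K is just an assumed room temperature), the k*T*ln(2) threshold works out to a few zeptojoules per bit:

```python
import math

k_B = 1.380649e-23   # Boltzmann constant in J/K (exact in the 2019 SI)
T = 300.0            # assumed room temperature in kelvin

E_min = k_B * T * math.log(2)   # minimum energy per reliable bit operation
print(E_min)                    # ~2.87e-21 J
```

Any stored bit whose energy barrier is not large compared to this figure is at the mercy of thermal noise.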
The other link of an informational theoretic entropy to thermodynamics is
in just the information theoretic meaning of the thermodynamic entropy
itself. The *meaning* of the thermodynamic entropy in a thermodynamic
system is that it is the mean minimal amount of further information
necessary to determine, with certainty, the exact microscopic state that
macroscopic thermodynamic system is actually in, given that the only
information known about the system is the macroscopic description which
defines that system's macrostate. In order to convert this information
measure in bits to the conventional thermodynamic entropy as measured in
energy/temperature units, one needs the conversion factor: 1 J/K of
thermodynamic entropy equals 1.044933 x 10^23 bits (or
1 J/K = 1.21646 x 10^13 gigabytes) of needed microscopic information.
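That conversion is easy to check (using the current exact SI value of Boltzmann's constant, which shifts the last digits slightly from the figure quoted above, and taking 1 gigabyte = 2^30 bytes = 2^33 bits, the convention that reproduces the gigabyte figure):

```python
import math

k_B = 1.380649e-23                     # Boltzmann constant, J/K

# 1 J/K of thermodynamic entropy, expressed as missing microscopic information:
bits_per_JK = 1.0 / (k_B * math.log(2))
print(bits_per_JK)                     # ~1.0449e23 bits

gigabytes_per_JK = bits_per_JK / 2**33   # 1 GB = 2^30 bytes = 2^33 bits
print(gigabytes_per_JK)                # ~1.2165e13 GB
```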
David Bowman
David_Bowman@georgetowncollege.edu