Inge Frette wrote:
> Is Ratzsch correct when he argues that creationists GENERALLY have the
> entire universe (as a system) in mind in their argumentation?
> Are the evolutionists who respond generally missing the point
> of the creationists?
Not in my experience. Most are not thinking so universally.
> In his book "Information theory and molecular biology" Hubert
> Yockey responds to the challenge given by the creationists.
> But he deals with the problem on the molecular level arguing
> that the entropy in question is the Shannon entropy from
> information theory and not the Maxwell-Gibbs-Boltzmann entropy
> from thermodynamics. He argues that it is important to differentiate
> between order and complexity and that it is complexity that is the
> important parameter (among those two) in genetic evolution.
> He also argues that the two entropies have nothing to do with
> each other, and therefore that "thermodynamics has nothing to do
> with Darwin's theory of evolution" (page 313).
> He even charges Prigogine with missing the point. On page 281
> Yockey writes
> "Shannon entropy does not enter in the formalism of Prigogine et al.,
> and thus they attempt to force non-equilibrium thermodynamics to
> play the role that should be assigned to information theory and coding
> theory in describing complexity and order. This again results in a
> confusion between order and complexity."
> Is Yockey right here? Are his claims generally accepted among
> other scientists? Is the whole discussion just a pseudo-problem,
> because one has conflated order with complexity, and the
> entropy of information theory with the entropy of thermodynamics?
> Prigogine is Prigogine, and I have some trouble understanding how
> an expert in thermodynamics like him could fail here. On the other hand,
> Yockey is also an expert on the topic...
As I see it, the 2nd law is fundamental: it involves heat (energy) transfer
and therefore must apply to biological systems, as Prigogine maintains. This
is why we need to eat. Information entropy has a mathematical structure
similar to Boltzmann's later probabilistic founding of thermal entropy, but
the former deals with an epiphenomenon loosely called information, not
directly with physical forces (dissipative or generative). Complexity and
information are correlated, but the relationship is still being worked out,
allowing for differences; so one really does have to be careful to keep the
two notions distinct.
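
To make the formal similarity concrete: a minimal Python sketch (my own
illustration, not from Yockey's book). Shannon entropy H = -sum p_i log2 p_i
and the Gibbs form S = -k_B sum p_i ln p_i differ only by the constant k_B
and the logarithm base; the example sequences below are hypothetical, chosen
to show that an "ordered" string has low Shannon entropy while a "complex"
(aperiodic) one of the same length has high entropy.

```python
import math

def shannon_entropy(probs):
    # H = -sum p_i * log2(p_i), in bits; zero-probability terms contribute 0
    return -sum(p * math.log2(p) for p in probs if p > 0)

K_B = 1.380649e-23  # Boltzmann constant, J/K

def gibbs_entropy(probs):
    # Same functional form as Shannon's H, rescaled by k_B and using ln
    return -K_B * sum(p * math.log(p) for p in probs if p > 0)

def seq_entropy(seq):
    # Per-symbol Shannon entropy from the empirical symbol frequencies
    n = len(seq)
    counts = {c: seq.count(c) for c in set(seq)}
    return shannon_entropy([v / n for v in counts.values()])

ordered = "AAAAAAAA"      # highly ordered: one symbol repeated
complex_seq = "ACGTTGCA"  # all four symbols, equally frequent

print(seq_entropy(ordered))      # 0.0 bits per symbol
print(seq_entropy(complex_seq))  # 2.0 bits per symbol
```

The point of the sketch is only that the two formulas share a shape; the
probabilities in the thermodynamic case range over physical microstates,
while Shannon's range over symbols of a message, which is exactly the
distinction Yockey insists on.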
George Andrews Jr.
Graduate Student, Applied Sciences
College of William & Mary