At 11:53 PM 4/8/99 -0700, Ami wrote:
>Pim: Indeed, or that evolution requires such an increase in complexity.
>Evolution, the observed fact, may not require complexity, but doesn't the
>Theory? Isn't increased complexity in organisms through the history of life
>an observed fact, that must be explained by the theory?
Yes, it must be explained by the theory but it would not be a part
of the statement of the theory since it is not inevitable that complexity
increases. That complexity *has* increased in many instances is part
of the "facts about evolution" that need to be explained by the theory.
Let me give an analogy. Nonlinear dynamics sometimes leads to chaotic
behavior, but not always. One would not expect chaos to be part of the
definition of a nonlinear dynamical system, but you would expect
nonlinear dynamics to be able to explain chaos.
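To make the analogy concrete (a minimal sketch of my own; the logistic
map x -> r*x*(1-x) is the standard textbook example, and the particular
parameter values below are just my choices):

```python
# The logistic map x -> r*x*(1-x) is nonlinear for every r, but only
# some values of r produce chaos.  Chaos is something the theory of
# nonlinear dynamics explains; it is not part of the definition.

def max_divergence(r, x0=0.2, eps=1e-8, n=100):
    """Largest gap that opens between two trajectories started a
    distance eps apart -- a crude test for sensitive dependence."""
    x, y = x0, x0 + eps
    gap = 0.0
    for _ in range(n):
        x = r * x * (1 - x)
        y = r * y * (1 - y)
        gap = max(gap, abs(x - y))
    return gap

print(max_divergence(2.5))  # tiny: both trajectories settle to the same fixed point
print(max_divergence(3.9))  # order 1: nearby trajectories fly apart (chaos)
```

Same equation in both calls; only the parameter differs, yet one case
is tame and the other chaotic.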
>Also, tell me if I'm wrong, but isn't a bacterium which gains the ability to
>break down polyurethane(? is that the one?) more complex than its
>predecessor which can't? Let me elaborate.
>Let's imagine a gene for a protein very similar to the one which is the
>enzyme which can break down the polyurethane. At some point, this gene gets
>duplicated. There are now two of them in the genome of the bacteria. By
>chance, one of them mutates into this beneficial protein which opens up a
>whole new food source to the bacteria.
>Two steps: 1. duplication, 2. mutation. Now, either of these steps taken in
>isolation may not be an increase in complexity. However, both of them taken
>together cause the bacteria, which before could only produce the one useful
>protein, to now be able to produce two different useful proteins. Isn't
>that an increase in complexity?
Pim has already indicated that one needs a definition of complexity
to answer this. Let me take it further by saying that if one adopts one
of the standard (and IMHO best) definitions of complexity (Kolmogorov
Complexity) then there is no question that the situation you
describe above is an increase in complexity. Kolmogorov Complexity
comes from Algorithmic Information Theory. In this theory,
information content and complexity are the same thing; hopefully
this will be apparent from the following description.
The easiest way to think of Kolmogorov Complexity is in terms of
the closely related Descriptive Complexity. Here, the complexity
of some object A is the length of the "shortest" description of
A. The main difference between Descriptive and Kolmogorov Complexity
is that in Descriptive Complexity the descriptions are in English
(or some other language) whereas in Kolmogorov Complexity the
descriptions are computer algorithms.
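A crude way to see the idea (a sketch of my own, using zlib's
compressed length as a computable stand-in for Kolmogorov complexity,
which is not itself computable; the strings are made up):

```python
import random
import zlib

def approx_complexity(s: str) -> int:
    """Compressed length in bytes -- an upper bound on, and proxy for,
    the length of the shortest description of s."""
    return len(zlib.compress(s.encode(), 9))

patterned = "AB" * 500                # short description: "repeat AB 500 times"
random.seed(0)
scrambled = "".join(random.choice("ACGT") for _ in range(1000))

print(approx_complexity(patterned))   # small: the pattern is compressible
print(approx_complexity(scrambled))   # much larger: no short description exists
```

The patterned string has low complexity because a short rule generates
it; the random string can only be "described" by writing it out.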
Ok, let's consider your two steps above and let A refer to the
original gene and A' to the copy. The duplication process itself
results in an increase in complexity (descriptive length) which
is probably relatively minor. Once A has been fully described,
the case with the duplicated gene can be described compactly
by "repeat A". You don't actually have to describe A again,
you just have to indicate that there's another one now. If
"repeat A" is very short in comparison with the original
description of A, then the increase in complexity is very minor.
The real increases in complexity come when one starts modifying
A', but note that the first modification would not result in a
sudden doubling of the complexity. At first you would still
not require a complete description of A' since it would be
more compact to say "repeat A with these changes: .....".
As the number of changes increases, the descriptive length
also increases, but gradually, not as an immediate doubling.
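Both steps can be checked numerically (again a sketch under the same
assumption, with zlib's compressed length standing in for descriptive
length; the "gene" is just a random string I made up):

```python
import random
import zlib

def approx_complexity(s: str) -> int:
    """Compressed length in bytes as a proxy for descriptive length."""
    return len(zlib.compress(s.encode(), 9))

random.seed(1)
gene = "".join(random.choice("ACGT") for _ in range(2000))   # the original A

single = approx_complexity(gene)
dup = approx_complexity(gene + gene)   # step 1: duplication
# The compressor encodes the second copy as back-references -- the
# analogue of "repeat A" -- so dup is only slightly larger than single.
print(single, dup)

# Step 2: point mutations accumulate in the copy.  Each one forces the
# description to record a difference ("repeat A with these changes"),
# so complexity rises gradually rather than doubling at once.
mutant = list(gene)
for total in (10, 100, 500):
    while sum(a != b for a, b in zip(gene, mutant)) < total:
        i = random.randrange(len(mutant))
        mutant[i] = random.choice("ACGT")
    print(total, approx_complexity(gene + "".join(mutant)))
```

The duplication step adds only a few bytes of "repeat A" overhead,
while the mutated copy's complexity creeps up with the number of
differences, just as the descriptive argument predicts.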
Thanks again for the example. It's been discussed before, so
I'm kicking myself for not thinking of it. It seems to me
the simplest and easiest-to-understand mechanism for
increasing complexity in which "complexity" has a very
precise meaning attached to it.
The Ohio State University
"All kinds of private metaphysics and theology have
grown like weeds in the garden of thermodynamics"
-- E. H. Hiebert