# Randomness-chance-accidents

From: mortongr@flash.net
Date: Sat Sep 23 2000 - 15:42:33 EDT


Hi James,

James Stark wrote:

>>Can somebody paint a clearer picture of how chance, randomness, and accident
fit together in a consistent story that separates order events from disorder
events for Darwinian evolution? <<

I think Brian and I would probably follow Yockey in the definitions (Brian of course can disagree but I bet he won't). Life is NOT ordered. Life is organized. Order is what a crystal has. It can be represented by a sequence like:

01010101010101010101010101...

There is a pattern of repeating segments. The problem with saying life is ordered is that there is very little information in an ordered sequence. Yockey would define a random sequence as one in which the probabilities of each letter are the same: if you have 26 letters, as in English, you would expect a long random sequence to use each letter about 1/26 of the time. That would be random. Random sequences have a high informational content under the definitions of information theory.

Most anti-evolutionists use 'information' to mean 'semantic meaning'. That is not the definition used in information theory; a sequence can have lots of information regardless of whether it has any meaning. Now comes the problem that most anti-evolutionists don't quite grasp: organized sequences are quite similar to random sequences. Yockey writes:

"Organisms are often characterized as being 'highly ordered' and in the same paragraph as being 'highly organized'. Clearly these terms have opposite meanings in the context of this chapter. The first message discussed in section 2.4.1 is highly ordered and has a low entropy. [the first message is '0101010101010101010101'; the second, highly organized message is '0110110011011110001000' --GRM] Being 'highly organized' means that a long algorithm is needed to describe the sequence and therefore highly organized systems have a large entropy. Therefore highly ordered systems and highly organized ones occupy opposite ends of the entropy scale and must not be confused. Since highly organized systems have a high entropy, they are found embedded among the random sequences that occupy the high end of the entropy scale."
"Kolmogorov (1965, 1968) and Chaitin (1966, 1969) have called the entropy of the shortest algorithm needed to compute a sequence its complexity. Chaitin (1975b) proposed a definition of complexity that has the formal properties of the entropy concept in information theory. Chaitin (1970, 1979) and Yockey (1974, 1977c) pointed out the applicability of this concept in establishing a measure of the complexity or the information content of the genome.
...
"Thus both random sequences and highly organized sequences are complex because a long algorithm is needed to describe each one. Information theory shows that it is fundamentally undecidable whether a given sequence has been generated by a stochastic process or by a highly organized process. This is in contrast with the classical law of the excluded middle (tertium non datur), that is, the doctrine that a statement or theorem must be either true or false. Algorithmic information theory shows that truth or validity may also be indeterminate or fundamentally undecidable."~Hubert Yockey, Information Theory and Molecular Biology, (Cambridge: Cambridge University Press, 1992), pp. 81-82.
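Yockey's "shortest algorithm" measure can be illustrated with a rough sketch of my own (not Yockey's calculation): a general-purpose compressor gives an upper bound on how short a description of a sequence can be. True Kolmogorov complexity is uncomputable, and zlib is only a crude stand-in, but the pattern he describes still shows up: an ordered sequence like 0101... compresses to almost nothing, while a random sequence and an organized (English) sequence of the same length both stay large.

```python
import random
import zlib

def description_length(s: str) -> int:
    """Length in bytes of a zlib-compressed copy of s -- a crude upper
    bound on the length of an algorithm that reproduces s."""
    return len(zlib.compress(s.encode("ascii"), 9))

N = 100
ordered = "01" * (N // 2)  # highly ordered, like Yockey's first message
random.seed(1)
rand = "".join(random.choice("abcdefghijklmnopqrstuvwxyz") for _ in range(N))
organized = ("it is fundamentally undecidable whether a given sequence "
             "has been generated by a stochastic process or not")[:N]

for name, s in [("ordered", ordered), ("random", rand), ("organized", organized)]:
    print(name, len(s), description_length(s))
```

On a run of this sketch the ordered string compresses to a handful of bytes, while the random and organized strings need descriptions comparable to their own length -- the two ends of the entropy scale Yockey describes.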

The key point in the above is that it is IMPOSSIBLE to tell an organized (designed) sequence from one which is merely random. I can't stress the importance of that enough. If you can't tell an organized sequence with high informational content from a random sequence, then you can't tell whether the sequence arose through random processes or through an intelligence who designed it. That is really why the random sequence of letters I 'decoded' can give rise to organized sentences. If it gave rise to ordered sequences, you would get a string like abababababab..., which of course is highly ordered but has no meaning and a low information content.
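One hedged way to see this indistinguishability at the level of simple statistics: a meaningful sentence and a random shuffle of its letters have exactly the same letter frequencies, and therefore exactly the same Shannon entropy estimate. A measurement of that kind cannot tell the organized arrangement from the random one (a sketch of the idea, not a proof of Yockey's undecidability result).

```python
import math
import random
from collections import Counter

def entropy_bits_per_letter(s: str) -> float:
    """Shannon entropy estimated from letter frequencies, in bits per letter."""
    n = len(s)
    return -sum((c / n) * math.log2(c / n) for c in Counter(s).values())

sentence = "randomness can give rise to meaning"
random.seed(0)
shuffled = "".join(random.sample(sentence, len(sentence)))

# Same letters, same counts, so the entropy estimates agree exactly --
# letter-frequency statistics cannot separate the organized sequence
# from the shuffled one.
print(entropy_bits_per_letter(sentence))
print(entropy_bits_per_letter(shuffled))
```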

When Brian was talking about the high-probability set, he was pointing out that meaningful, high-information-content sequences are rare compared to meaningless sequences with equally high information content. However, that was not what I was pointing out with my first post. I was pointing out that randomness can give rise to meaning. And that is something that anti-evolutionists fail to grasp.
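Brian's rarity point can be sketched with a toy count. Among all strings of a given length over 26 letters, the meaningful ones are a vanishing fraction; the tiny word list below is a hypothetical stand-in for "meaningful," chosen just for illustration.

```python
from itertools import product
from string import ascii_lowercase

# Hypothetical stand-in for the "meaningful" length-3 sequences.
WORDS = {"cat", "dog", "run", "sun", "map", "ear", "ice", "oak"}

total = 26 ** 3  # every possible length-3 letter string
meaningful = sum(1 for t in product(ascii_lowercase, repeat=3)
                 if "".join(t) in WORDS)

# 17576 total strings, 8 meaningful ones -- a fraction under 0.0005.
print(total, meaningful, meaningful / total)
```

Even with a realistic English word list the fraction stays tiny, and it shrinks rapidly as the sequences get longer, which is the sense in which meaningful high-entropy sequences are rare.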

This archive was generated by hypermail 2b29 : Sat Sep 23 2000 - 15:42:38 EDT