RE: Emergence of information out of nothing?

From: Glenn Morton
Date: Sun May 05 2002 - 00:35:03 EDT


    Hi Peter,

    >-----Original Message-----
    >From: Peter Ruest []
    >Sent: Saturday, May 04, 2002 6:47 AM

    >I call this a fundamental distinction between the following two
    >(I) Maximum information carrying capacity;
    >(II) Functional information relevant for biological systems.

    I am delighted to see this split, because too many people who get into this
    area totally confuse the two. I think I would prefer not to call (II)
    'information' but would call it a functional probability between two
    molecules. Consider an isolated protein in a universe of its own,
    with no other object. No matter how long its sequence is, it does nothing
    useful. It just sits there letting quantum fluctuations wiggle it. It has
    no 'information relevant for biological systems.'

    Now bring a second protein into this universe and let it bounce around,
    bumping into the original one. At some of the collisions the two molecules
    stick together for a bit; at others, one of the proteins splits--i.e., it has
    been catalyzed by the other one. What we have is not information but a
    probability of a particular functional interaction. This can only happen as
    two molecules interact. Lacking a second protein in the universe,
    there is no function and thus no biologically relevant information in your
    isolated protein.

    In some sense this is identical to language, where meaning is a private
    agreement among various people that certain sounds 'mean' certain things.
    Biological functionality is only possible when two molecules combine to
    perform some chemical action.

    >For at least 20 years, I have drawn this distinction between (I) and
    >(II) clearly, in both talks and articles. Glenn, thank you for your fine
    >summary of Shannon information, which I knew in principle, but could not
    >have formulated as well as you did.

    Thank you for the kind words.

    >It corresponds to (I), and to what
    >C.J. Hogan discussed in the paper I cited. I agree that, in itself, this
    >has nothing to do with semantics, meaning or function denoted by concept
    >(II). But it denotes an absolute upper limit of the amount of semantic
    >information that can be transmitted or stored in a given system. And
    >this is one of the points I wanted to make in my post.

    I cannot agree that (I) places a limit on the transmission of semantic
    information. Semantic information simply isn't related to (I). Suppose you are
    a commander in my army and I give you a book with detailed plans on what to
    do and where to go under three different conditions: plan A, plan B, and plan C.
    I also tell you that three days from now I will send to your computer a % for
    plan A, a * for plan B, and an R for plan C. Each symbol stands for, or 'means,'
    the entire relevant plan. When I send you that signal, I have sent you lots
    and lots of semantic information but very little Shannon information.
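The plan example above can be made concrete with a short sketch (the codebook below is hypothetical, invented just to illustrate the point): the Shannon information of the transmitted signal is fixed by the number of possible symbols, not by the size of the plan each symbol privately 'means.'

```python
import math

# Hypothetical private codebook: one symbol stands for an entire
# pre-agreed plan document.
codebook = {"%": "plan A", "*": "plan B", "R": "plan C"}

# Shannon information of the signal itself: one of three equally
# likely symbols carries only log2(3) = ~1.585 bits.
signal_bits = math.log2(len(codebook))
print(round(signal_bits, 3))  # -> 1.585

# The semantic payload -- the full plan -- can be arbitrarily large;
# its content is fixed by the prior agreement, not by the channel.
```

However detailed the plans are, the channel only ever needs to carry about a bit and a half.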

    Because of this private agreement about meaning, one can't quantify it. And
    unless one can quantify it, one can't quantify your 'biologically relevant
    information' (II).

    It is the same problem as trying to determine which of the following
    sequences has meaning.

    ni ru gua wo shou bu de bu dui jiao wo hao hao?
    ru wo de bu dui ni jiao shou wo gua hao bu hao?
    gua wo ru shou de bu wo hao bu dui ni jiao hao?
    ru gua wo shou de bu hao dui ni jiao wo hao bu?
    gua wo shou de bu dui ni jiao ru wo hao bu
    wo shou de bu dui ni hao jiao wo ru gua bu hao?
    dui ni jiao ru gua wo hao shou de bu wo hao bu?
    ru gua wo shou ni jiao hao wo hao bu de bu dui?
    dui ni jiao wo hao bu hao ru gua wo shou de bu?
    shou de bu wo hao bu hao ru gua wo dui ni jiao?
    ru dui ni gua wo de bu hao wo bu shou jiao hao?
    bu dui ni ru wo shou de gua bu hao jiao wo hao?

    If you can tell which has meaning, then you can determine biologically
    relevant information.
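The point can be checked numerically: any two shufflings of the same words have identical word frequencies, so their Shannon entropy, measure (I), is identical and cannot pick out the meaningful one. A minimal sketch, using two of the scrambles above (question marks dropped):

```python
from collections import Counter
import math

def entropy_per_symbol(words):
    """Shannon entropy in bits/word from the empirical word frequencies."""
    counts = Counter(words)
    n = len(words)
    # Sort the counts so the floating-point sum is order-independent.
    return -sum((c / n) * math.log2(c / n) for c in sorted(counts.values()))

# Two permutations of the same 13 words:
s1 = "ru gua wo shou de bu dui ni jiao wo hao bu hao".split()
s2 = "gua wo ru shou de bu wo hao bu dui ni jiao hao".split()

print(entropy_per_symbol(s1) == entropy_per_symbol(s2))  # -> True
```

Shannon's measure sees only symbol statistics; it is blind to which ordering carries meaning.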

    >The semantic, meaningful, or functional information (II) is extremely
    >difficult to define properly for natural systems, as Howard correctly
    >points out.
    >-- The term "semantic" indicates that it is coded in DNA, in analogy to
    >a language.
    >-- The term "functional" indicates that it provides a specification for
    >a function, or what a biological macromolecule, complex, or other system
    >part will _DO_, as Howard emphasizes.
    >-- The term "meaningful" indicates a teleological view which designates
    >the effect of this function in the context of the whole organism.

    Function exists only between two or more molecules, not necessarily within a
    whole system. A protein catalyzes a certain reaction regardless of whether
    it is in an organism or not.

    >While it is easy to compute the amount of "information" (I), as Glenn
    >has shown, different factors make it difficult to estimate an amount of
    >"information" (II).
    >(1) Synonymy: different molecular structures or molecules may have the
    >same effect, such that it doesn't matter which one is used.
    >(2) Redundancy: different operational pathways may salvage a system in
    >case one of them is damaged.
    >(3) Ecology: depending on the current environment, a given function may
    >or may not be needed, or may have different selective values.
    >(4) Population dynamics: population size and time may determine the
    >survival of a given feature.
    >(5) Microevolutionary accessibility: different sequence configurations
    >may be more or less easily reached by a mutational random walk.
    >(6) Robustness: depending on its location in sequence space, a
    >macromolecule may be more or less apt to survive during evolution.
    >These factors happen to come to mind at present ... there may be more.
    >I'm sure biologists will be able point out others.
    >Now, is this information (II) perhaps equal to zero, such that it can be
    >neglected entirely? Then a virtual infinity of viable evolutionary paths
    >would be possible, and it would be certain that life evolved wherever
    >the conditions are not extremely inimical. In this case, E.L. Shock,
    >whom I also quoted, probably wouldn't consider it to be "a major
    >challenge" to find feasible ways leading from the ubiquitous small
    >organic "building-blocks of life" to living systems. This is the second
    >point I wanted to make in my last post.

    Information is related to entropy and must have a p log(p) form. Can you
    derive such a thing for what you define as information (II)? If you can't,
    then it isn't any form of information. It may be something else, but it just
    isn't information.

    As to multiple pathways: I recall that back in the early 1960s, the argument
    against the chance formation of a peptide was based upon the chance of
    finding a single sequence out of all of sequence space. So for oxytocin, a
    nine-amino-acid peptide, the chance of assembling human oxytocin at random
    was 1 in 20^9, or about 1 in 5 x 10^11. Of course they would then use a
    100-unit-long peptide, getting 20^100 or 1 in 10^130, and give an indignant
    conclusion as to how anyone could believe such odds. But as we have learned
    things over the past 30 years, we have brought those numbers down to around
    1 in 10^40, because we now know that more than one sequence can perform the
    same task. And experiments by Gerald Joyce, Jack Szostak and others show
    that functionality in a test tube full of RNA is found at a rate of 1 in
    10^14 to 1 in 10^18. That is observation. So the problem isn't nearly as bad
    as apologists have been saying for years and years.
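The arithmetic behind those odds is easy to check, assuming the standard 20-letter amino-acid alphabet:

```python
import math

# Odds of hitting one exact sequence of a given length at random,
# expressed as a power of ten.
def log10_odds(length, alphabet=20):
    """log10 of alphabet^length, i.e. the exponent in '1 in 10^x'."""
    return length * math.log10(alphabet)

print(round(log10_odds(9), 1))    # -> 11.7  (oxytocin: ~1 in 5 x 10^11)
print(round(log10_odds(100), 1))  # -> 130.1 (the '1 in 10^130' figure)

# The observed in-vitro rate of ~1 in 10^14 to 1 in 10^18 for functional
# RNA is over a hundred orders of magnitude more favorable than 10^130.
```

The single-exact-sequence calculation and the observed functional rate differ so wildly because the former counts one target, while experiment shows a vast number of sequences can perform a given task.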



    This archive was generated by hypermail 2b29 : Sat May 04 2002 - 17:04:24 EDT