Re: Evolutionary rate

From: Iain Strachan (iain.strachan.asa@ntlworld.com)
Date: Sat May 10 2003 - 06:50:10 EDT


      ----- Original Message -----
      From: D. F. Siemens, Jr.
      To: iain.strachan.asa@ntlworld.com
      Cc: jarmstro@qwest.net ; joela@umn.edu ; asa@calvin.edu
      Sent: Friday, May 09, 2003 8:56 PM
      Subject: Re: Evolutionary rate

      Iain and Jim,
      It strikes me that Iain is emphasizing the need for multiple changes because engineers demand answers /now/, not eventually. Jim approaches matters from the viewpoint of eventual improvement, with no reason to label anything "rush!" Both outlooks are functional in their place, though it must be granted that comprehensive alterations would radically speed up evolution.
      Dave
    It is indeed a fair point that you don't have to rush, but the problem is that "irreducible complexity" (under my definition, requiring multiple simultaneous changes) causes an exponential increase in the length of time to find a solution (exponential in the number of required simultaneous changes). This is simply a result of the fact that probabilities multiply up for coincident mutations. There is no need to rush, but it still has to take place within the lifetime of the universe :-)
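    To make the scaling concrete, here is a minimal Python sketch of my own (nothing in it comes from Dave's post; the population size and mutation rate are purely illustrative) of the expected wait for k specific mutations to coincide in a single individual, assuming independent mutations:

        import math

        mutation_rate = 1e-3       # assumed per-site, per-generation rate
        population = 1000          # illustrative population size

        for k in range(1, 11):
            p_individual = mutation_rate ** k    # probabilities multiply up
            # P(at least one individual gets all k mutations this generation);
            # log1p/expm1 keep the arithmetic accurate for tiny probabilities
            p_per_generation = -math.expm1(population * math.log1p(-p_individual))
            expected_generations = 1.0 / p_per_generation
            print(f"{k:2d} simultaneous changes: ~{expected_generations:.2g} generations")

    Under these assumptions each extra required change multiplies the expected wait by roughly a factor of 1000 (i.e. 1/mutation_rate), which is the exponential blow-up I mean.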

    This problem is in fact recognized by all designers of Genetic Algorithms, and a workaround is put in. It is the well-recognized problem of "Hamming Cliffs", which arises when GA bitstrings are used to encode numeric parameters (non-biological, of course, but the principle is just the same). For example, the binary representation of the number 511 is 0111111111 and that of 512 is 1000000000. In order to progress from 511 to 512 you need 10 coincident mutations, because every single bit must change. Suppose the mutation rate in your program is 1e-3 (not unrealistic for a GA problem, though several orders of magnitude higher than in biological systems). The chance of 10 coincident mutations occurring is therefore 10^-30, so even if one evaluation takes place every millisecond, it will not happen in the lifetime of the universe. Now, this is extreme of course, and assumes that 511 to 512 is the only way to improve. But it is still a major problem in GAs that optimize numeric functions; they invariably get stuck at the "Hamming Cliff".

    The workaround is not to use binary coding for the numeric values, but instead to use "Gray coding", a scheme in which successive integers always differ by a single bit. For example, a 3-bit Gray code could be 000, 100, 110, 010, 011, 111, 101, 001 - a point-to-point traversal of the corners of a cube.
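    As a quick illustration (my own sketch, not part of the original exchange), the standard reflected Gray code - one of several valid orderings, and slightly different from the 3-bit example above - turns the 511-to-512 step into a single-bit change:

        def to_bits(x, width=10):
            return format(x, f"0{width}b")

        def gray_encode(x):
            # standard reflected binary Gray code
            return x ^ (x >> 1)

        def hamming(a, b):
            return sum(c1 != c2 for c1, c2 in zip(a, b))

        b511, b512 = to_bits(511), to_bits(512)
        g511, g512 = to_bits(gray_encode(511)), to_bits(gray_encode(512))

        print("binary:", b511, "->", b512, "distance", hamming(b511, b512))   # 10 bits differ
        print("gray:  ", g511, "->", g512, "distance", hamming(g511, g512))   # 1 bit differs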

    But the BIG problem for GAs is that although there is a workaround for a single numeric parameter, there is no such workaround for multiple coupled numeric parameters via a clever coding scheme from genotype to phenotype. To get one, the designer of the algorithm has to formulate the problem in such a way that the variables are decoupled. And if the variables are decoupled, then there isn't much point in using a GA in the first place, because other methods would undoubtedly work much better.
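    To illustrate why the decoupling matters (a toy sketch of my own, with made-up fitness functions rather than any real GA library), compare a hill climber that accepts single improving mutations on a decoupled fitness and on one where a whole block of bits must be right together before any credit is given:

        import random

        random.seed(0)

        def decoupled_fitness(bits):
            # each bit contributes on its own, so every single-bit step can help
            return sum(bits)

        def coupled_fitness(bits):
            # credit only for a complete 4-bit block: bits must be right together
            a, b = bits[:4], bits[4:]
            return (4 if all(a) else 0) + (4 if all(b) else 0)

        def hill_climb(fitness, n_bits=8, steps=500):
            bits = [0] * n_bits
            for _ in range(steps):
                trial = bits[:]
                trial[random.randrange(n_bits)] ^= 1    # one random mutation
                if fitness(trial) > fitness(bits):      # keep strict improvements only
                    bits = trial
            return fitness(bits)

        print("decoupled:", hill_climb(decoupled_fitness))   # climbs to 8
        print("coupled:  ", hill_climb(coupled_fitness))     # never leaves 0

    It is a caricature, of course, but the point stands: a search driven by single improving steps does fine when the variables are decoupled and gets nowhere when they are not.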

    The real argument is whether the variables of life are decoupled, so that evolution can work. If they are indeed decoupled, then there is no such thing as Irreducible Complexity & evolution is a perfectly satisfying and logical explanation as to how everything came about. But if IC systems really do exist, then the law of multiplying probabilities still applies; the expected time to a solution blows up exponentially & 15 billion years begins to look like a drop in the ocean.

    There is one other subtlety, and that is the case of "neutral mutations", i.e. mutations that cause no harm and don't by themselves do any good, but which, when several of them accumulate, put your irreducibly complex system in place. I won't go into the maths of this in detail, but the same exponential scaling of the waiting time applies. The problem is that any one mutation is just as likely to be UN-done as it is to occur while it waits for all the others to happen. Imagine you were collecting coupon cards (like you used to get in packets of tea or cigarettes), but with the rule that if you drew a duplicate of one you already had, you had to destroy both of them and wait for another. Then it is relatively straightforward to show that the expected time to complete the whole collection rises exponentially with the size of the collection.
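    For anyone who would rather see the scaling than derive it, here is a quick Monte Carlo sketch of my own of the "destroy both" coupon rule (the numbers are purely illustrative):

        import random

        random.seed(1)

        def draws_to_complete(n):
            # draw random cards; a duplicate destroys the copy you already hold
            have = set()
            draws = 0
            while len(have) < n:
                card = random.randrange(n)
                draws += 1
                if card in have:
                    have.remove(card)   # like a neutral mutation being un-done
                else:
                    have.add(card)
            return draws

        for n in range(2, 13, 2):
            trials = [draws_to_complete(n) for _ in range(100)]
            print(f"collection of {n:2d}: mean draws ~ {sum(trials) / len(trials):.0f}")

    The mean number of draws blows up roughly exponentially with the size of the collection, which is exactly the scaling of the waiting time I am describing.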

    Best,
    Iain.


