On his blog, Sandwalk, University of Toronto biochemistry professor Laurence Moran expressed uncertainty concerning the basic argument of The Edge of Evolution: The Search for the Limits of Darwinism. So for anyone who wants a quick reprise of the book’s reasoning, below is a list of annotated bullet points plus some commentary summarizing it.
- If the development of some particular adaptive biochemical feature requires more than one specific mutation to an organism’s genome, and if the intermediate mutations are deleterious (and to a lesser extent even if they are neutral), then the probability of the multiple mutations randomly arising in a population and co-existing in a single individual so as to confer the adaptation will be many orders of magnitude less than for cases in which a single mutation is required.
- The decreased probability means either that, on average, a much larger population of organisms would be required to produce the multiple mutations in the same amount of time as a single mutation, or that, for the same population size, a multiple-mutation feature would be expected to require many more generations to appear than a single-mutation one.
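As a rough back-of-the-envelope sketch of why the two cases differ so sharply, consider a toy sequential model (the parameter values here are assumed round numbers for illustration, not figures from the book):

```python
# Toy illustration: expected waiting time for an adaptive variant.
# Parameters are hypothetical round numbers, chosen only for illustration.

u = 1e-9  # per-site, per-replication point mutation rate (assumed)
N = 1e6   # population size (assumed)

# Single required mutation: new carriers appear at rate ~ N * u per generation.
t_single = 1 / (N * u)

# Two required mutations with a deleterious intermediate: roughly, both must
# arise in the same lineage before selection removes the intermediate, so
# new double mutants appear at rate ~ N * u**2 per generation.
t_double = 1 / (N * u**2)

print(f"~{t_single:.0e} generations for one mutation")
print(f"~{t_double:.0e} generations for two (deleterious intermediate)")
print(f"ratio: {t_double / t_single:.0e}")  # ~1/u: many orders of magnitude
```

Under these toy numbers the double-mutation feature takes on the order of a billion times longer to appear, which is the qualitative point of the two bullets above.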
As a matter of simple population genetics theory, the two points above should be uncontroversial. Now let’s look at some empirical data.
- In The Edge of Evolution I cited the development of chloroquine resistance in the malaria parasite Plasmodium falciparum as a very likely real-life example of this phenomenon. The recent paper by Summers et al. confirms that two specific mutations are required to confer upon the protein PfCRT the ability to pump chloroquine, which is necessary but may not be sufficient for resistance in the wild.
- The best estimate of the per-parasite occurrence of de novo resistance is Nicholas White’s value of 1 in 10^20. This number is surely made up of several components, including: 1) the probability of the two required mutations identified by Summers et al. coexisting in a single pfcrt gene; 2) the value of the selection coefficient (which can be thought of as the likelihood that the de novo mutant will successfully recrudesce in a person treated by chloroquine and be transmitted to another person); and 3) the probability of any possible further PfCRT mutation needed to confer chloroquine resistance in the wild coexisting in the same gene with the other mutations.
- The known point mutation rate of P. falciparum, combined with the apparent deleterious effect of the required mutations occurring singly, suggests that component 1 from the previous bullet point will account for the lion’s share of White’s estimate, probably at least a factor of 1 in 10^15-10^16 of it. The other factors would then account for 1 in 10^4-10^5. These values are somewhat flexible, accommodating the uncertainty in our knowledge of the exact values in the wild. In other words, a decrease in our best estimate of one factor can be offset, without affecting the argument, by supposing another factor is correspondingly larger, so that the product still comes to 1 in 10^20.
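The bookkeeping in the bullet above is just a product of the two components (these are the post’s order-of-magnitude round figures, not measured values):

```python
# Order-of-magnitude decomposition of White's 1-in-10^20 estimate,
# using the post's round figures (assumed, not measured).
p_two_mutations = 1e-16  # both required pfcrt mutations in one gene (~10^-15 to 10^-16)
p_other_factors = 1e-4   # selection coefficient plus any further mutation (~10^-4 to 10^-5)

p_de_novo = p_two_mutations * p_other_factors
print(f"combined: 1 in {1 / p_de_novo:.0e}")  # → "combined: 1 in 1e+20"
```

This is also why the factors are flexible: shrinking one exponent and growing another by the same amount leaves the product, and hence the argument, unchanged.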
These last three points, although based on inferences from empirical data rather than just pure theory, should also be pretty uncontroversial. Now let’s pass on to the dicier stuff.
- Any particular adaptive biochemical feature requiring the same mutational complexity as that needed for chloroquine resistance in malaria is forbiddingly unlikely to have arisen by Darwinian processes and fixed in the population of any class of large animals (such as, say, mammals), because of their much lower population sizes and longer generation times compared to those of malaria. (By "the same mutational complexity" I mean requiring 2-3 point mutations where at least one step consists of intermediates that are deleterious, plus a modest selection coefficient of, say, 1 in 10^3 to 1 in 10^4. Those factors will get you in the neighborhood of 1 in 10^20.)
- Any adaptive biological feature requiring a mutational pathway of twice that complexity (that is, 4-6 mutations with the intermediate steps being deleterious) is unlikely to have arisen by Darwinian processes during the history of life on Earth.
In the book I then go on to make a general argument that Darwinian processes could not have constructed the molecular foundation of life, but let’s leave that aside for now. Let’s just concentrate on the last two bullet points here.
Considered in the calmer context of the development of resistance to particular drugs (such as, say, a combination of chloroquine plus a second drug that is as difficult to evolve resistance to and works by an independent mechanism) — rather than in the highly charged context of intelligent design — even these two statements should seem reasonable to critics of ID. After all, many medical professionals searching for treatments for malaria are trying to do exactly that — to combine two very improbable mutational steps into an insuperable mutational pathway. If there were a second drug with the efficacy of chloroquine which had always been administered in combination with it (but worked by a different mechanism), resistance to the combination would be expected to arise with a frequency in the neighborhood of 1 in 10^40 — a medical triumph.
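The combination-therapy arithmetic is simply the product of two independent resistance probabilities (a sketch; the second drug here is hypothetical, as in the scenario above):

```python
# Combination-therapy arithmetic (illustrative; the second drug is hypothetical).
p_drug_a = 1e-20  # de novo chloroquine resistance, per parasite (White's estimate)
p_drug_b = 1e-20  # hypothetical second drug, equally hard to evolve resistance to

# Independent mechanisms: a single parasite lineage must defeat both at once.
p_combination = p_drug_a * p_drug_b
print(f"1 in {1 / p_combination:.0e}")  # → "1 in 1e+40"
```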
Where a critic might demur is on the question of how many ways exist to solve an evolutionary problem of that mutational complexity. I think that’s due to a confusion about the need for particular mutations versus nonspecific mutations. While comparing the math of chloroquine resistance to mutations that have occurred in the primate line leading to humans, Professor Moran wrote, "Does he really mean that there can’t be any examples of two mutations occurring in the same gene since humans and chimps diverged?" No, of course not. That overlooks the requirement for the great specificity needed to build biochemical systems. For example, to achieve chloroquine resistance malaria must at least acquire the mutations K76T plus either N75E or N326D in PfCRT — two very particular amino acid positions in a very particular gene — not just any two amino acids in any gene. That of course makes a huge difference to the probability.
Moran also writes, "He seems to think that whenever we see such mutations they must have been the only possible way to evolve some new function or feature." Well, no, not the "only possible" way, but, yes, one of a very limited number of possibilities. (I wrote about this in my last article, too.)
In fact the number is limited enough that we can conclude with confidence that it won’t affect my argument summarized above. For example, suppose there were ten, or a hundred different ways to address a particular biochemical challenge. That would barely move the dial on a log scale that’s pointing at 1 in 10^20.
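To make the log-scale point concrete (a sketch; the route counts are hypothetical), multiplying the number of available routes by 10 or 100 shifts the exponent by only one or two:

```python
import math

p_one_route = 1e-20  # probability via a single route (the post's figure)

# Even 10 or 100 independent alternative routes barely move the exponent.
for n_routes in (1, 10, 100):
    p_any_route = n_routes * p_one_route  # union bound over independent routes
    print(f"{n_routes:>3} routes -> about 1 in 10^{round(-math.log10(p_any_route))}")
```

Going from 1 in 10^20 to 1 in 10^18 leaves the conclusion for large, slowly reproducing animals essentially untouched.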
What’s more, Nicholas White’s factor of 1 in 10^20 already has built into it all the ways to evolve chloroquine resistance in P. falciparum. In the many malarial cells exposed to chloroquine there have surely occurred all possible single mutations and probably all possible double mutations — in every malarial gene — yet only a few mutational combinations in pfcrt are effective. In other words, mutation and selection have already searched all possible solutions of the entire genome whose probability is greater than 1 in 10^20, including mutations to other genes. The observational evidence demonstrates that only a handful are effective. There is no justification for arbitrarily inflating probabilistic resources by citing imaginary alternative evolutionary routes.
To summarize, my argument concerns the evolutionary construction of biochemical features of specificity similar to malarial chloroquine resistance. The little-appreciated point I wanted to emphasize is that the likelihood of success decreases enormously if even a single mutational step of a pathway is disfavored. With more such steps, its improbability becomes prohibitive.