Jason Rosenhouse, a mathematician who teaches at James Madison University, is the author of the recent book The Failures of Mathematical Anti-Evolutionism. The purpose of the book is to discredit the mathematical and algorithmic arguments presented by ID proponents against the plausibility of undirected evolution crafting complex novelties. Rosenhouse focuses much of his critique on William Dembski’s design-detection formalism based on specified complexity. Dembski responded in detail to Rosenhouse’s arguments, highlighting Rosenhouse’s confusion over Dembski’s theoretical framework and its application to biological systems (here, here). Rosenhouse in turn responded to Dembski’s critique. His counter-response, published at Panda’s Thumb, reveals that his opposition to Dembski is not based on any flaws in the substance of Dembski’s work but instead on Rosenhouse’s unassailable faith in the limitless creative power of natural selection.
Dembski’s Framework for Design Detection
Dembski’s approach to detecting design centers on identifying the presence of specified complexity (SC). A pattern or object demonstrates SC if it has high probabilistic complexity but low descriptive complexity. Dembski explains:
To appreciate how probabilistic and descriptive complexity play off each other in specified complexity, consider the following example from poker. Take the hands corresponding to “royal flush” and “any hand.” These descriptions are roughly the same length and very short. Yet “royal flush” refers to 4 hands among 2,598,960 total number of poker hands and thus describes an event of probability 4/2,598,960 = 1/649,740. “Any hand,” by contrast, allows for any of the total number of 2,598,960 poker hands, and thus describes an event of probability 1. Clearly, if we witnessed a royal flush, we’d be inclined, on the basis of its short description and the low probability event to which it corresponds, to refuse to attribute it to chance.
Only a mind can generate significant quantities of SC (e.g., dealing four royal flushes in a row), so its presence reliably points to design.
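The arithmetic in Dembski's poker example is easy to verify directly. The following is a minimal sketch, assuming standard 5-card poker hands and independent deals from freshly shuffled decks; it is an illustration of the numbers in the quote, not part of Dembski's formalism:

```python
from math import comb
from fractions import Fraction

# Total 5-card hands from a 52-card deck.
total_hands = comb(52, 5)           # 2,598,960

# One royal flush per suit, so 4 qualifying hands.
royal_flushes = 4

# Probability of a royal flush, as in the quoted example.
p_royal = Fraction(royal_flushes, total_hands)
print(p_royal)                      # 1/649740

# Four royal flushes in a row (independent deals).
p_four_in_a_row = p_royal ** 4
print(float(p_four_in_a_row))       # roughly 5.6e-24
```

The four-in-a-row figure shows why short-description, low-probability events resist a chance explanation: the probability collapses multiplicatively with each repetition while the description ("four royal flushes in a row") stays short.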
Dembski has also collaborated with computer scientists Robert J. Marks and Winston Ewert to prove conservation of information theorems. These theorems demonstrate that search algorithms cannot find targets corresponding to high levels of specified complexity unless they are provided with information about the target (here, here, here, here). Consequently, no undirected evolutionary process, known or unknown, could achieve any innovation or transformation that requires a significant quantity of generated SC.
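The core intuition behind these theorems can be illustrated with a toy blind search: when a sampler has no information about the target, its per-query success probability over a uniform space is just |T|/|Ω|, the target's measure. The bit widths, target value, and query count below are illustrative placeholders chosen for this sketch, not figures from the theorems themselves:

```python
import random

def blind_search_hits(target, n_bits, queries, seed=0):
    """Count hits when sampling n-bit strings uniformly, with no target info."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(queries):
        if rng.getrandbits(n_bits) == target:
            hits += 1
    return hits

n_bits = 20                      # search space of 2**20 (about a million) strings
target = 0xABCDE                 # one arbitrary singleton target
queries = 2_000_000

hits = blind_search_hits(target, n_bits, queries)
expected = queries / 2 ** n_bits # ~1.9 expected hits for a singleton target
print(hits, expected)
```

Doubling the bit width squares the space, so for targets as small as those the conservation-of-information papers consider, blind sampling is hopeless without information that shrinks the effective search space.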
Rosenhouse responds to Dembski and his colleagues by asserting that their research has no relevance to biological evolution. In his view, this is true for several key reasons. First, he claims that probabilities cannot be reliably assessed for the origin of biological structures:
Anti-evolutionists routinely present spurious probability calculations meant to refute evolution. In a lengthy chapter on probability, I explain that a proper calculation must take place in the context of what mathematicians refer to as a “probability space”. For our purposes, this means that you must have a good grasp on the range of possible outcomes, as well as an understanding of the probability distribution appropriate to those outcomes. In the context of the evolution of complex adaptations, we never have what we need to do this. As Harvard biologist Martin Nowak put it, “You cannot calculate the probability that an eye came about. We don’t have the information to make this calculation.”
Second, biological structures do not conclusively represent specified patterns:
But there is an obvious problem with applying this notion to evolutionary questions: How do we distinguish design-suggesting patterns from those we impose on nature through excessive imagination? Dembski claims that we can use the function of a complex adaptation as its specification, but biologists claim that natural selection crafts functional structures as a matter of course. Given this, does the function of a system suggest design or just natural selection? How can we be confident that in using function as a specification we are not doing the equivalent of looking at a fluffy, cumulus cloud and seeing a dragon?
And third, the conservation of information theorems do not challenge evolutionary theory since nature provides the required information to discover complex novelties:
As applied to biology, his argument is nothing more than the claim that nature has to be a certain way for evolution to work. Most of us did not need difficult mathematical theorems to understand this. The fitness landscapes confronted by evolving organisms arise ultimately from the laws of physics, and Dembski and his collaborators are really just asking why the universe is as it is. It’s a perfectly good question, but hardly one within biology’s domain.
Imagination Superseding Evidence
Rosenhouse’s response to Dembski ultimately fails since it is based on what he imagines to be true about biology instead of what has been empirically demonstrated. The assertion that probabilities cannot be evaluated for biological systems is highly misleading. Exact probabilities are typically impossible to compute, but calculating upper bounds on those probabilities is often tractable.
Douglas Axe demonstrated that, for the larger domain of the beta-lactamase enzyme, an upper bound on the prevalence of functional sequences is 1 in every 10^77 randomly selected ones. Rosenhouse attempts to discredit this estimate by citing Arthur Hunt’s critique, but he fails to acknowledge that Axe and others showed that such negative assessments reflect misunderstandings of his research and the technical literature (here, here, here, here).
The beta-lactamase enzyme performs a relatively simple function. Consequently, the sequence rarity for a significant proportion of proteins in life almost certainly corresponds to a quantity of specified complexity that could only have arisen through intelligent agency. This conclusion is reinforced by a 2017 article by Tian and Best that estimated the rarity of functional protein domains using a method completely different from Axe’s. Their results confirm that most domains larger than 80 amino acids are too rare to have emerged through an undirected search.
In addition, Ola Hössjer, Günter Bechly, and Ann Gauger published a mathematical model for the time required for coordinated mutations to appear in a population. Their model demonstrates that for most animals the time available for major transitions is insufficient for even a few new regulatory sequences to emerge. Yet the evolution of a structure as simple as a lens for a vertebrate eye requires dozens if not hundreds of such specified sequences (here, here). An upper bound on the probability that such a large quantity of specified complexity could arise is minuscule.
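The upper-bound reasoning used here is a simple union bound (Boole's inequality): if a single trial succeeds with probability at most p, then N trials succeed with probability at most N·p. The per-trial rarity and trial count below are illustrative placeholders for this sketch, not figures taken from the cited studies:

```python
from fractions import Fraction

def union_bound(p_single, n_trials):
    """P(at least one success in n trials) <= n * p, capped at 1 (Boole's inequality)."""
    return min(Fraction(1), n_trials * p_single)

# Illustrative numbers only: a target hit with probability 10^-77 per trial,
# allowed a generous 10^43 independent trials.
p_single = Fraction(1, 10 ** 77)
n_trials = 10 ** 43

bound = union_bound(p_single, n_trials)
print(bound)   # 1/10**34
```

The point of the bound is that it is generous: it assumes every trial is an independent attempt at the target, and even so the aggregate probability remains negligible whenever the per-trial rarity dwarfs the number of available trials.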
Parallels with Human Engineering
In addition, engineers working with biologists have concluded that living systems demonstrate the same specified patterns as are seen in human engineering. For instance, leading experts in bacterial flagella have not simply concluded that these molecular machines resemble rotary motors. Instead, they concluded that they are rotary motors (here, here, here). And flagellar navigation systems perform robust perfect adaptation, which is only achievable by two classes of control modules (here, here). The conclusion that such biological systems display specified complexity is indisputable.
Finally, the view that nature provides the information for evolutionary searches conflicts with a torrent of recent literature demonstrating that evolutionary/adaptive processes almost exclusively tweak preexistent structures or choose from a preexistent library of traits (here, here, here). Genetic information is never gained in significant quantities; at best it is maintained, and it is often lost (here, here, here). In short, Rosenhouse’s belief in the creative power of evolutionary processes is based not on hard data but on his faith in the philosophy of scientific materialism and on circular reasoning.