A peer-reviewed paper, “Information and Entropy — Top-Down or Bottom-Up Development in Living Systems?,” by University of Leeds professor Andy McIntosh in the International Journal of Design & Nature and Ecodynamics expressly endorses intelligent design (ID) via an exploration of a key question in ID thinking:
The ultimate question in origins must be: Can information increase in a purely materialistic or naturalistic way? It is not satisfactory to simply assume that information has to have arisen in this way. The alternative of original design must be allowed and all options examined carefully.
A professor of thermodynamics and combustion theory, McIntosh is well acquainted with the workings of machinery. His argument is essentially twofold:
(1) First, he defines the term “machine” (a device which locally raises the free energy) and observes that the cell is full of machines. Such machines pose a challenge to neo-Darwinian evolution due to their irreducibly complex nature.
(2) Second, he argues that the information in living systems (similar to computer software) uses such machines and in fact requires machines to operate (what good is a program without a computer to run it?). An example is the genome sitting on the DNA molecule. From a thermodynamics perspective, the only way to make sense of this situation is to understand that the information is non-material and constrains the thermodynamics so that the local matter and energy are in a non-equilibrium state.
McIntosh addresses the objection that, thermodynamically speaking, highly organized low entropy structures can be formed at the expense of an increase in entropy elsewhere in the universe. However, he notes that this argument fails when applied to the origin of biological information:
whilst this argument works for structures such as snowflakes that are formed by natural forces, it does not work for genetic information because the information system is composed of machinery which requires precise and non-spontaneous raised free energy levels – and crystals like snowflakes have zero free energy as the phase transition occurs.
McIntosh then tackles the predominant reductionist view of biological information which “regards the coding and language of DNA as essentially a phenomenon of the physics and chemistry of the nucleotides themselves.” He argues that this classical view is wrong, for “biological structures contain coded instructions which … are not defined by the matter and energy of the molecules carrying this information.”
According to McIntosh, Shannon information is not a good measure of biological information since it is “largely not relevant to functional information at the phenotype level.” In his view, “[t]o consider biological information as simply a ‘by product’ of natural selective forces operating on random mutations is not only counter-intuitive, but scientifically wrong.” According to McIntosh, one major reason for this is “the irreducibly complex nature of the machinery involved in creating the DNA/mRNA/ribosome/amino acid/protein/DNA-polymerase connections.” He continues:
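The distinction McIntosh draws can be illustrated with a small sketch (not from the paper; the sequences and function below are illustrative only): Shannon entropy depends solely on symbol frequencies, so a sequence and a shuffled copy of it score identically, even though any functional reading frame survives in at most one of them.

```python
from collections import Counter
from math import log2

def shannon_entropy(seq: str) -> float:
    """Shannon entropy in bits per symbol, computed from symbol frequencies alone."""
    counts = Counter(seq)
    n = len(seq)
    return -sum((c / n) * log2(c / n) for c in counts.values())

# A fragment beginning with a start codon and a shuffled copy of it have
# the same letter frequencies, hence identical Shannon entropy -- the
# measure is blind to whether the sequence could function as a code.
coding = "ATGGCTAGCAAA"
shuffled = "AAGCTAAGGTCA"  # same letters, rearranged

print(shannon_entropy(coding))
print(shannon_entropy(shuffled))
```

Both calls print the same value, which is the point of the objection as quoted: a frequency-based measure cannot distinguish functional from non-functional arrangements.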
All of these functioning parts are needed to make the basic forms of living cells to work. … This, it may be argued, is a repeat of the irreducible complexity argument of Behe, and many think that that debate has been settled by the work of Pallen and Matzke, where an attempt to explain the origin of the bacterial flagellum rotary motor as a development of the Type 3 secretory system has been made. However, this argument is not robust simply because it is evident that there are features of both mechanisms which are clearly not within the genetic framework of the other. That is, the evidence, far from pointing to one being the ancestor of the other, actually points to them both being irreducibly complex. In the view of the author this argument is still a very powerful one.
Further citing Signature in the Cell, McIntosh states:
What is evident is that the initial information content in DNA and living proteins rather than being small must in fact be large, and is in fact vital for any process to work to begin with. The issue of functional complexity and information is considered exhaustively by Meyer [93, 94] who argues that the neo-Darwinist model cannot explain all the appearances of design in biology.
So how do biological systems achieve their highly ordered, low-entropy states? McIntosh’s argument is complementary to Stephen Meyer’s, but it takes a more thermodynamic approach. According to McIntosh, information is what allows biological systems to attain their high degrees of order:
the presence of information is the cause of lowered logical entropy in a given system, rather than the consequence. In living systems the principle is always that the information is transcendent to, but using raised free energy chemical bonding sites
McIntosh solves the problem of the origin of information by arguing that it must arise in a “top-down” fashion which requires the input of intelligence:
[T]here is a perfectly consistent view which is a top-down approach where biological information already present in the phenotypic creature (and not emergent as claimed in the traditional bottom-up approach) constrains the system of matter and energy constituting the living entity to follow intricate non-equilibrium chemical pathways. These pathways whilst obeying all the laws of thermodynamics are constantly supporting the coded software which is present within … Without the addition of outside intelligence, raw matter and energy will not produce auto organization and machinery. This latter assertion is actually repeatedly borne out by experimental observation – new machinery requires intelligence. And intelligence in biological systems is from the non-material instructions of DNA.
This thinking can be applied to DNA: since “the basic coding is the cause (and thus reflects an initial purpose) rather than the consequence, [the top-down approach] gives a much better paradigm for understanding the molecular machinery which is now consistent with known thermodynamic principles.” McIntosh explains that the low-entropy state of biological systems is the result of the workings of machines, which must be built by intelligence:
It has often been asserted that the logical entropy of a non-isolated system could reduce, and thereby new information could occur at the expense of increasing entropy elsewhere, and without the involvement of intelligence. In this paper, we have sought to refute this claim on the basis that this is not a sufficient condition to achieve a rise in local order. One always needs a machine in place to make use of an influx of new energy and a new machine inevitably involves the systematic raising of free energies for such machines to work. Intelligence is a pre-requisite.
He concludes his paper with an express endorsement of intelligent design: “the implication of this paper is that it supports the so-called intelligent design thesis – that an intelligent designer is needed to put the information into the biological system.”
I have no doubt that the editors of the International Journal of Design & Nature and Ecodynamics will take much heat for publishing this paper. Even though they make it clear that “[t]he reader should not assume that the Journal or the reviewers agree with the conclusions of the paper,” they should be commended for their courage in publishing it and for calling it “a valuable contribution that challenges the conventional vision that systems can design and organise themselves.” They write, “The Journal hopes that the paper will promote the exchange of ideas in this important topic” — showing that there is hope for true academic freedom in the debate over ID in some corners of the scientific community.