
Life Fights Entropy with Intelligent Design

Image credit: Illustra Media.

Engineers know they can’t eliminate entropy when doing work or computation, but they are finding clever ways to minimize it, much as life does.

First, let’s remember that entropy is an unavoidable cost of doing anything useful. Living things fight this all the time. A paper in PNAS by Dominic J. Skinner and Jörn Dunkel explains:

Thermodynamic laws place fundamental limits on the efficiency and fitness of living systems. To maintain cellular order and perform essential biological functions such as sensing, signaling, replication or locomotion, organisms consume energy and dissipate heat. In doing so, they increase the entropy of their environment, in agreement with the second law of thermodynamics. [Emphasis added.]

Even thinking increases entropy, because the brain, running on potatoes (as A. E. Wilder-Smith used to quip), is sending electrical impulses and neurotransmitters along neurons and across synapses. That requires work (W = F × d: work is force times distance). Work requires energy, and expending energy necessarily increases entropy.
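
To put rough numbers on that claim, here is a back-of-the-envelope sketch in Python (the figures are illustrative order-of-magnitude estimates, not measurements): whatever energy the brain spends on its microscopic work is ultimately dissipated as heat, and dumping heat Q into surroundings at temperature T raises their entropy by at least Q/T.

```python
# Back-of-the-envelope numbers (illustrative, not measurements).
brain_power_watts = 20.0       # order of magnitude often quoted for the human brain
body_temp_kelvin = 310.0       # about 37 degrees C
seconds_of_thinking = 60.0

# All that microscopic work (W = F x d, summed over countless ion and
# neurotransmitter movements) is ultimately dissipated as heat Q ...
heat_joules = brain_power_watts * seconds_of_thinking
# ... and dumping heat Q into surroundings at temperature T raises their
# entropy by at least Q / T (second law).
entropy_increase = heat_joules / body_temp_kelvin
print(f"entropy added to the environment: {entropy_increase:.1f} J/K")   # ~3.9 J/K
```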

The authors sought better ways to measure entropy production in living systems. If engineers want to maximize the efficiency of designed systems, they need to be able to measure it. In practice, too many variables are hidden, even in a simple molecular machine. To address the challenge, they introduced a new approach to bounding entropy production and tried it on three molecular processes:

To help overcome these limitations, we introduce here a generic optimization framework that can produce significantly improved bounds on the entropy production in living systems. We will prove that these bounds are optimal given certain measurable statistics. From a practical perspective, our method requires observations of only a few coarse-grained state variables of an otherwise hidden Markovian network. We demonstrate the practical usefulness by determining improved entropy production bounds for bacterial flagella motors, growing microtubules, and calcium oscillations in human embryonic kidney cells.
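
Their optimization framework is more involved than we can reproduce here, but a simpler, textbook-style calculation conveys the underlying idea of bounding entropy production from coarse observations. The sketch below uses a made-up trajectory and an illustrative function name (entropy_production_bound, not from the paper); it tallies the net flux between each pair of observed coarse-grained states, which by standard coarse-graining arguments gives a lower bound, in units of Boltzmann’s constant, on the dissipation of the full hidden network.

```python
import math
from collections import Counter

def entropy_production_bound(states, dt):
    """Plug-in estimate (in units of k_B per unit time) of the entropy
    production of an observed coarse-grained trajectory: net flux between
    each pair of states times the log-ratio of forward to backward
    transition counts.  Coarse-graining can only hide dissipation, so this
    lower-bounds the entropy production of the full hidden network."""
    counts = Counter(zip(states[:-1], states[1:]))      # transition counts
    total_time = dt * (len(states) - 1)
    sigma = 0.0
    for (i, j), n_forward in counts.items():
        if i >= j:                                      # visit each pair once
            continue
        n_backward = counts.get((j, i), 0)
        if n_forward and n_backward:
            sigma += (n_forward - n_backward) * math.log(n_forward / n_backward)
    return sigma / total_time

# Hypothetical trajectory hopping among three coarse states every 0.1 s:
trajectory = [0, 1, 2, 0, 1, 2, 0, 2, 1, 0, 1, 2, 0, 1, 2, 0]
print(entropy_production_bound(trajectory, dt=0.1), "k_B per second")
```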

(Regarding those human embryonic kidney cells, the authors used data from Kevin Thurley et al., in Science. Those authors do not state how they obtained the cells.)

Measuring the entropy produced by flagellar switching will help engineers, they say, as they recall the astonishing capabilities of these icons of intelligent design:

By rotating helical flagella, many species of bacteria can swim, reaching speeds of tens of body lengths per second. Each flagellum is driven by a remarkable nanoscale motor, powered by a flux of ions across the cytoplasmic membrane, which can achieve over 100 rotations per second. Measuring the entropy production of the motor promises insights into the efficiency of small self-assembled engines and microbial locomotion.

The authors did not state the efficiencies of the flagellum, microtubules and kidney cell oscillations; their focus was to define a coarse-grained, model-agnostic estimator that could be “used to infer additional thermodynamic costs of regulating molecular fluctuations by quantifying the entropic tradeoffs cells are forced to make.” And forced they are:

Entropic costs limit the accuracy of biological sensory systems, biological clocks, and intrinsic noise suppression in cells. Beyond direct applications to experimental data, the current framework can help us understand and quantify tradeoffs between the faithful execution of a biological function and the energy expended to do so.

Rethinking Computation

The National Institute of Standards and Technology (NIST) is also trying to minimize entropic costs. They announced, “Race Logic: Novel Circuitry Solves a Myriad of Computationally Intensive Problems with a Minimum of Energy.” That’s race as in footrace. Organisms face this kind of sport all the time.

From the branching pattern of leaf veins to the variety of interconnected pathways that spread the coronavirus, nature thrives on networks — grids that link the different components of complex systems. Networks underlie such real-life problems as determining the most efficient route for a trucking company to deliver life-saving drugs and calculating the smallest number of mutations required to transform one string of DNA into another.

To reduce the energy costs of solving these complex problems, three NIST scientists stopped encoding information directly in the values of bits (0s and 1s) and recast computation in terms of time stamps. Flipping bits from 0 to 1 and back again is energy intensive, they say.

In contrast, race logic encodes and processes information by representing it as time signals — the time at which a particular group of computer bits transitions, or flips, from 0 to 1. Large numbers of bit flips are the primary cause of the large power consumption in standard computers. In this respect, race logic offers an advantage because signals encoded in time involve only a few carefully orchestrated bit flips to process information, requiring much less power than signals encoded as 0s or 1s.
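
Race logic lives in circuitry, but the timing idea is easy to emulate in software. In the sketch below (a made-up delivery network and an illustrative function name, not NIST’s hardware), a single signal injected at the source races along edges whose weights act as delays; each node records only its first arrival, and that arrival time is its shortest-path distance.

```python
import heapq

def first_arrival_times(delays, source):
    """Software emulation of race logic: inject one signal at the source at
    time 0 and let copies race along the edges, each edge adding its delay.
    The FIRST time a signal reaches a node is recorded and the node never
    fires again, so the answer is encoded in timing with very few 'flips'.
    The first arrival time at each node equals its shortest-path distance."""
    arrival = {source: 0.0}
    events = [(0.0, source)]                        # (time, node) queue
    while events:
        t, node = heapq.heappop(events)
        if t > arrival.get(node, float("inf")):
            continue                                # a faster signal already won
        for neighbor, delay in delays.get(node, {}).items():
            if t + delay < arrival.get(neighbor, float("inf")):
                arrival[neighbor] = t + delay
                heapq.heappush(events, (t + delay, neighbor))
    return arrival

# Hypothetical delivery network; edge values are travel times (delays).
network = {"depot": {"A": 4, "B": 1}, "B": {"A": 2, "C": 5}, "A": {"C": 1}}
print(first_arrival_times(network, "depot"))        # fastest time to every stop
```

Delays add along a path and the earliest signal wins, so the addition and minimization are carried by timing rather than by flipping long strings of bits.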

The article includes diagrams showing how the method could solve old problems, like finding the most efficient routes for trucks delivering packages. Although the scientists do not explicitly say so, this appears to be the logic behind some cellular processes, such as the regulation of enzyme concentrations. In PNAS, Surajit Chatterjee et al. describe how transcription and translation are tightly coupled and regulate each other. Since such processes encode information in timing, something like energy-efficient “race logic” would seem to apply. Commenting on this paper, also in PNAS, Michael W. Webster and Albert Weixlbaumer write:

Two conserved processes express the genetic information of all organisms. First, DNA is transcribed into a messenger RNA (mRNA) by the multisubunit enzyme RNA polymerase (RNAP). Second, the mRNA directs protein synthesis, when the ribosome translates its nucleotide sequence to amino acids using the genetic code. Because these two processes are so fundamental, a multitude of regulatory processes have evolved to regulate them. Most examples involve regulation of either transcription or translation. In PNAS, Chatterjee et al. instead describe a complex and intricate regulatory process in which transcription and translation are concurrently regulated by each other.

An example they give is “thermodynamically driven ‘consensus’ pauses” during transcription. It is indeed remarkable that cells “have evolved” regulation by race logic, when NIST’s best scientists are still trying to figure it out.

Optimizing Randomness

Under a microscope, tiny objects appear to wiggle at random, a phenomenon called Brownian motion, caused by the thermal jostling of surrounding molecules. What if each upward bump could be captured and locked in, ratcheting the particle steadily higher? It would act as a ratchet that, in effect, converts information about those fluctuations into stored gravitational potential energy. Engineers at Simon Fraser University in British Columbia boast of having built the “World’s fastest information-fuelled engine” using this technique. SFU says their work “could lead to significant advances in the speed and cost of computers and bio-nanotechnologies” — i.e., fighting entropy with design.
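
A toy simulation illustrates the principle (dimensionless units, made-up parameters, and an illustrative function name; a sketch of the general idea, not the SFU apparatus): a particle jiggles in a harmonic trap under gravity, and whenever a thermal kick carries it above the trap’s center, the center is ratcheted up to meet it, so the fluctuation is locked in as gravitational potential energy.

```python
import random

def information_ratchet(steps=20_000, dt=1e-3, k=1.0, g=0.2, kT=1.0, seed=1):
    """Toy 1-D overdamped simulation of an information ratchet in
    dimensionless units (illustrative parameters, not the SFU apparatus).
    A particle sits in a harmonic trap and feels gravity; whenever a thermal
    kick carries it above the trap centre, the centre is moved up to meet
    it.  The trap never pushes the particle upward, yet measurement and
    feedback let it steadily accumulate gravitational potential energy."""
    rng = random.Random(seed)
    x = trap_center = 0.0
    kick = (2.0 * kT * dt) ** 0.5                  # thermal noise amplitude
    for _ in range(steps):
        force = -k * (x - trap_center) - g         # trap spring plus gravity
        x += force * dt + kick * rng.gauss(0.0, 1.0)
        if x > trap_center:                        # feedback: ratchet the trap up
            trap_center = x
    return g * x                                   # stored gravitational energy (units of kT)

print("energy harvested from thermal noise:", round(information_ratchet(), 2), "kT")
```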

SFU physics professor and senior author John Bechhoefer says researchers’ understanding of how to rapidly and efficiently convert information into “work” may inform the design and creation of real-world information engines.

“We wanted to find out how fast an information engine can go and how much energy it can extract, so we made one,” says Bechhoefer, whose experimental group collaborated with theorists led by SFU physics professor David Sivak.

Their creation, which has pushed the technique ten times farther than earlier implementations, is the “current best-in-class” in the field of information-fuelled engine design, a concept going back 150 years. Only recently has it become possible to put theory into practice. The article describes how they did it. Then comes the punch line in the last sentence:

“Guided by this insight, we picked the particle mass and other engine properties to maximize how fast the engine extracts energy, outperforming previous designs and achieving power comparable to molecular machinery in living cells, and speeds comparable to fast-swimming bacteria,” says postdoctoral fellow Jannik Ehrich.

Consider: the best minds in science and engineering are trying to approach the capabilities of bacteria. That might suggest some thoughts about the origin of the target they are trying to reach.