Culture & Ethics
Between Sapientia and Scientia — Michael Aeschliman’s Profound Interpretation
Editor’s note: Today Discovery Institute Press releases the newly revised and expanded edition of Michael Aeschliman’s classic work The Restoration of Man: C.S. Lewis and the Continuing Case Against Scientism. The new edition includes this Foreword by James Le Fanu, physician and author of Why Us?: How Science Rediscovered the Mystery of Ourselves.
The three decades following the Second World War were arguably the most productive in the long history of science. Within living memory, we have learned how the “Big Bang” brought the universe into being, how the first stars were formed, and how within their fiery interiors the chemical elements were created by the process of nuclear fusion — hydrogen into helium, helium into carbon, oxygen, silicon, all the way up through the periodic table. We have learned how, four billion years ago, a vast cloud of interstellar gas and particles within that now-vast universe coalesced to form our solar system; how our earth acquired its sustaining atmosphere; and how the tectonic movement of massive plates of rock created the continents and oceans. We have identified the very first forms of life that emerged three billion years ago, delineated the internal workings of its fundamental unit, the cell, and discovered the “universal code” of nucleotides strung out along the double helix by which all living things replicate their kind. And we now have compelling evidence of the physical characteristics of our earliest ancestors and of their transformation into modern man.
This overarching historical narrative linking our existence all the way back to the moment when energy, time, and space appeared — apparently out of nothing — ranks amongst the greatest of all intellectual achievements, permitting us, astonishingly, for the first time to hold in the mind’s eye the entire history of the universe from the moment of its inception to the present day. The triumph of science, one might suppose, is virtually complete. What during these times have we learned from the humanities — philosophy, say, theology, or history — that begins to touch the profundity and originality of that intellectual achievement?
The Academy’s Common Currency
The authority and prestige conferred by those insights have marginalized those other ways of knowing, and inevitably the doctrine of scientism — “science is the only begetter of truth” — has, in its several guises, become the common currency of the Academy and public discourse: epistemic scientism, the contention that the scientific method is the only reliable path to knowledge; ontological scientism, the supposition that whatever is not encompassed within the domain of science does not exist or has only a secondary or subsidiary reality; promissory scientism, the assertion that there is nothing in principle that science cannot explain; and redemptive scientism, the idea, as philosopher Mary Midgley expresses it, “of salvation through science alone.”1
These several variations of scientism entail, by necessity, that “man is the result of a purposeless and natural process that did not have him in mind,”2 our seemingly unique attributes — the powers of reason and imagination, the moral law within and the sense of self — being no more than an illusion generated by our “selfish” genes and the electrochemistry of the brain to maximize our chances of survival. “Modern science directly implies that the world is organized strictly in accordance with mechanistic principles,” writes philosopher William Provine. “There are no purposive principles whatsoever in nature. There are no gods and no designing forces that are rationally detectable… modern science implies too there are no inherent moral or ethical laws… free will, the freedom to make uncoerced unpredictable choices among alternative possible courses of action, simply does not exist… there is no ultimate meaning for humans.”3
Modern science may imply that the world (and ourselves) are organized “strictly in accordance with mechanistic principles,” but it can scarcely be said to have demonstrated this to be so. On the contrary, impressed — dazzled even — by that overarching historical narrative, it is easy to overlook that its intellectual legacy is almost precisely the reverse of that which is commonly supposed. Those major landmarks — the scientific understanding of the origins of the universe, the creation of the chemical elements, the formation of our solar system and planet and so on — cannot by definition be rediscovered, so the major challenge for science in more recent times has been to refine and elaborate on what is already known. And that has proved surprisingly tricky. For while the broad outline still holds, the practicalities of how (or why) those major events came about in the way they did have proved impervious to scientific scrutiny.
The evidence for the origin of the universe at the moment of the Big Bang 14 billion years ago (or thereabouts) and its sudden dramatic expansion seems compelling enough but only serves to emphasize the inscrutable perplexity of this most influential of scientific theories. The proposition that the universe sprang into existence a finite time ago from nothing (or, at least, from nothing physical) places a heavy (insupportable) demand on any causal explanation that must — by necessity — transcend time, space, matter, and energy.
So, the scientific community has been compelled to acknowledge that the universe must have sprung into existence ex nihilo. It has subsequently emerged that the physical laws of the universe — e.g., gravity, electromagnetism, the strong and weak nuclear forces, the speed of light — are so finely tuned that the slightest alteration in their values would have rendered impossible the subsequent emergence of life on earth. It is very difficult to convey just how precise the values of those forces had to be, but physicist John Polkinghorne estimates their fine-tuning had to be accurate to within one part in a trillion trillion (and several trillion more), a figure far greater than the number of particles in the universe.4
As for the origins of life itself, the progressive delineation of the internal workings of its fundamental unit, the cell, over the past fifty years has revealed it to be “a world of supreme technology and bewildering complexity” in which many features of our most sophisticated machines have an analogue: memory banks for information storage and retrieval, elegant control systems regulating the assembly of parts, proofreading devices for quality control, and assembly processes utilizing the principles of modular construction.5 This “automated factory,” several thousand million times smaller than the smallest piece of functional man-made machinery, which in its various forms has the capacity to create every living organism that ever existed — from a giant redwood to the human brain — can replicate its entire structure within a matter of hours. The likelihood that the first cell’s many components might have arisen spontaneously from some prebiotic chemical soup is analogous to supposing, as the late astronomer Fred Hoyle observed in a memorable image, “a tornado sweeping through a junk-yard might assemble a Boeing 747 from the materials therein.”6
How Little Science Can Explain
The most economical interpretation of these several inscrutable perplexities, revealed by “refining and elaborating” on that overarching historical narrative, is that by illuminating how little science can truly explain, they have undermined (annihilated, even) the fundamental tenets of scientism. But there is more, for the findings of the two most recent and ambitious of scientific projects contradict, if inadvertently, any pretension to an exclusively materialist account of the phenomena of life and of ourselves.
These projects were predicated on two remarkable technical developments that promised to resolve the two major obstacles to a truly comprehensive account of our place in the universe: how it is that those genetic instructions strung along the double helix contribute to the multitudinous diversity of form and attributes of the living world; and how the electrical activity of the human brain “translates” into our subjective experiences, memories, and sense of self. Those developments were, first, the ability to spell out the full sequence of genes, or genomes, of diverse species — worm, fly, mouse, man, and many others — and, second, the sophisticated scanning techniques that permit neuroscientists for the first time to observe the brain “in action” from the inside — thinking, memorizing, and looking out on the world.
The ability to spell out the full sequence of genes should reveal, it was reasonable to assume, the distinctive genetic instructions that determine the diverse forms of the millions of species, so readily distinguishable one from the other. Biologists were thus understandably disconcerted to discover precisely the reverse to be the case. Contrary to all expectations, many DNA sequences involved in embryo development are remarkably similar across the vast spectrum of organismic complexity, from a millimeter-long worm to ourselves.7 There is, in short, nothing in the genomes of fly and man to explain why the fly should have six legs, a pair of wings, and a dot-sized brain and we should have two arms, two legs, and a mind capable of comprehending that overarching history of our universe.
So we have moved in the very recent past from supposing we might know the principles of genetic inheritance to recognizing we have no realistic conception of what they might be. As Philip Gell, professor of genetics at the University of Birmingham, observed, “This gap in our knowledge is not merely unbridged, but in principle unbridgeable and our ignorance will remain ineluctable.”8
It has been a similar story for neuroscientists with their sophisticated scans observing the brain “in action.” Right from the beginning, it was clear the brain must work in ways radically different from those supposed. Thus, the simplest of tasks, such as associating the noun “chair” with the verb “sit,” causes vast tracts of the brain to “light up” — prompting a sense of bafflement at what the most mundane conversation must entail.9 Then, it emerged, the sights and sounds of every transient moment are fragmented into a myriad of separate components, without the slightest hint of the integrating mechanism that would create the personal experience of living at the center of a coherent, unified, ever-changing world.
Reflecting on this problem, Nobel Prize-winner David Hubel of Harvard University observes, “This abiding tendency for attributes such as form, color and movement to be handled by separate structures in the brain immediately raises the question how all the information is finally assembled, say, for perceiving a bouncing red ball. They obviously must be so assembled — but where and how, we have no idea.”10
Meanwhile, the great conundrum remains unresolved: how the electrical activities of the billions of neurons in the brain are related to the experiences of our everyday lives, where each fleeting moment has its own distinct, intangible feel: the cadences of a Bach cantata are utterly different from the taste of bourbon or the lingering memory of that first kiss. The implications are obvious enough. While it might be possible to know everything about the physical materiality of the brain down to its last atom, its “product,” the five cardinal mysteries of the non-material mind, remain unaccounted for: subjective awareness, free will, how memories are stored, the “higher” faculties of reason and imagination, and that unique sense of personal identity that changes and matures over time but remains resolutely the same.11
The Standard Response
The standard response to such imponderables is to acknowledge that things have perhaps turned out to be more complex than originally presumed, but to insist that these are still “early days,” too soon to predict what might yet emerge. Yet biologists could, if they so wished, spell out the genomes of each of the millions of species with which we share the planet, and the interesting question of what determines the unique form and attributes of such diverse creatures would remain unresolved. So too for observing the brain “in action”: a million scans of subjects watching David Hubel’s bouncing red ball would take us no further in understanding how those neuronal circuits experience the ball as round and red and bouncing.
The contrast with that supreme intellectual achievement of the post-war years is striking. At a time when cosmologists can infer what happened in the first few seconds of the birth of the universe, and geologists can measure the movements of continents to the nearest centimeter, it seems extraordinary that geneticists can’t tell us why humans are so different from flies, and neuroscientists are unable to clarify how we recall a telephone number.
Sapientia and Scientia
This necessarily poses the question as to why these recent findings have proved so difficult to unravel. There are two reasons why this might be so. The first, obvious on reflection, is that “life” is immeasurably more complex than matter: Its fundamental unit, the cell, though billions of times smaller than the smallest piece of machinery constructed by man, has nonetheless, as noted, the capacity to create every organism that has ever lived. A fly is billions upon billions upon billions of times more complex than a pebble of comparable size and possesses properties that have no parallel in the inanimate world: the capacity to transform the nutrients on which it feeds into its own tissues, and to repair and reproduce itself. So, too, the laws of biology must be incomparably more complex than the laws of physics and chemistry. So, while it is extraordinary that cosmologists can infer the physical events in the wake of the Big Bang, this is a trivial matter compared to explaining the phenomena of life.
The further reason why these recent findings of genetics and neuroscience should have proved so perplexing is the assumption that the phenomena of life and the mind are indeed ultimately explicable in the materialist terms of, respectively, the genes and the brain that give rise to them. And an assumption it remains, for the distinctive feature of both the forms and attributes of the living world and the thoughts, beliefs, and ideas of the mind is that they are unequivocally non-material: they cannot be quantified, weighed, or measured. Thus, strictly speaking, both fall outside the domain of what the currently accepted methods of science can investigate and explain.
Thus, science, the dominant way of knowing of our age, now finds itself caught between the rock of the supreme intellectual achievement of delineating that overarching narrative of the history of the universe and the (very) hard place of its investigations of life and the mind proving inscrutable in materialist terms.
Within this context, Michael Aeschliman’s profound interpretation of the conflict between sapientia and scientia could scarcely be more relevant or compelling. This, as he points out, is an old argument, engaged by many figures, notably C.S. Lewis, who presciently anticipated how the subversion of the philosophic view by material science entailed the denial of human exceptionalism. On the contrary, we are, it turns out, not only a mystery to ourselves; our existence as the sole witness, through our powers of reason, of the splendors of the universe and all it contains is its central mystery, the most persuasive of all evidence that there is “more than we can know.”
1. Mary Midgley, quoted in Mikael Stenmark, Scientism: Science, Ethics and Religion (Aldershot, England: Ashgate, 2001), 14.
2. George Gaylord Simpson, The Meaning of Evolution, rev. ed. (New Haven: Yale University Press, 1967), 345.
3. William Provine, “Evolution and the Foundation of Ethics,” MBL Science 3 (1988): 25–29.
4. John Polkinghorne, Beyond Science: The Wider Human Context (Cambridge: Cambridge University Press, 1995).
5. Michael Denton, Evolution: A Theory in Crisis (Chevy Chase, MD: Adler and Adler, 1986), 328.
6. “Hoyle on Evolution,” Nature 294 (1981): 105.
7. James Randerson, “Fewer Genes, Better Health,” New Scientist, July 13, 2002, 19.
8. Philip Gell, “Destiny and the Genes: Genetic Pathology and the Individual,” in The Encyclopaedia of Medical Ignorance, eds. R. Duncan and M. Weston-Smith (Kidlington: Pergamon, 1984), 179–87.
9. S. E. Petersen et al., “Positron Emission Tomographic Studies of the Cortical Anatomy of Single-Word Processing,” Nature 331 (1988): 585–89.
10. David H. Hubel, Eye, Brain and Vision (New York: Scientific American Library, 1988), 220.
11. Robert W. Doty, “The Five Mysteries of the Mind and Their Consequences,” Neuropsychologia 36, no. 10 (October 1998): 1069–76.
Photo: The star Zeta Ophiuchi, image captured by Spitzer Space Telescope, via NASA/JPL-Caltech.