Does Nature Manifest Intelligence?
by Michael J. Behe
From the time of her birth a baby is on the proverbial voyage of discovery. She comes to recognize the faces of her parents and siblings, and learns their language. She sees the behaviors of pet cats, dogs, turtles, and more. As a baby in the United States learns to walk outside in the sunshine she glimpses wild creatures -- hawks in flight, butterflies, lightning bugs, snakes -- and squeals with delight or cries in fear. One early spring she notices snow melting, flowers blooming -- and she suffers her first poison ivy rash and bee sting. A later trip to a zoo, with its giraffes, tigers, gorillas, zebras, and crocodiles, shows her that life, as it exists in the wider world, is broader and more alien than that found just in her own backyard. When she's old enough to go to school she will study in depth arguably the most peculiar animal on earth -- humans, who accumulate knowledge, think abstractly, and build civilizations.
Whatever else can be said of this world of ours, few have found it to be boring. Young children delight in its myriad surprises. After accumulating lifetimes of wisdom, mature poets and philosophers through the ages have repeatedly written paeans to its wonder and beauty. The intuited contingency of nature and the felt certainty that it didn't have to be this way have led humans ever since they could think to ponder the question, where did it all come from?
Since antiquity the great majority of humans have answered that the lawfulness of the universe and the manifest skill standing behind the marvelous abilities of creatures directly implied a Law-Giver and Craftsman -- a powerful mind capable of comprehending and constituting nature. And, in the somewhat ambiguous locution of the great medieval philosopher Thomas Aquinas, "By this we mean God."
Yet the Psalmist wrote, "The fool says in his heart, 'There is no God.'" Since antiquity there have also always been naysayers who argued that, despite appearances, the world and the life it contains were not intended to be -- they were some sort of accident. One of the earliest known skeptics was the pre-Socratic Greek philosopher Democritus. He maintained that matter was composed of indivisible, indestructible particles he called atoms. Rarely, atoms would collide in such a way that a fragment of some body (say, a fin or flower petal) would be formed. These quickly perished. Very, very rarely, however, atoms collided in such a way that a whole, intact, integrated body was formed, which could survive and prosper. In this way the earth was populated with various creatures, thought Democritus. One can see how such an idea might strike the Psalmist and others as foolish, and for millennia most educated people thought God a much more likely explanation for the order of the world than lucky atomic collisions.
Yet to understand how the fortunes of the argument have shifted over time, it is crucial to grasp just what kind of an argument it is. On the one hand, the old majority was saying, roughly, that the beauty and order and functionality we see in life and the universe are what we would expect if an intelligence were behind it all, because it takes intelligence to create function and order and beauty. On the other hand, the old minority was saying, no, we suspect that an unintelligent natural process can mimic the results of intelligence, at least under some circumstances.
Now, because we know from our own (admittedly limited) experience that intelligence is capable of ordering parts into functioning wholes, the case for design is an inductive one -- reasoning from similar effects to similar causes -- a venerable form of argument that is the stock-in-trade of science. The persuasiveness of the skeptics' argument, on the other hand, rests on the plausibility of the unintelligent natural process they propose to mimic intelligence. So the more we know about nature, the better able we are to evaluate whether such a hypothesized process actually exists and can do what is claimed for it. In other words, the balance of the argument can tip radically with our knowledge of nature.
From Democritus to Darwin: Scientific Discoveries Shift Public Opinion
Democritus's idea didn't persuade many people. However, the philosopher lived at a time when true scientific knowledge of the world was minimal. For example, in Democritus's day, instead of the more than one hundred elements that we now know can be neatly arranged by their properties in a periodic table, there were thought to be only four: earth, air, fire, and water. What made animals breathe or the stars shine was an utter mystery.
As history progressed, however, more and more was learned about nature, especially from the seventeenth century onward. In 1859, more than two thousand years after Democritus, a new proposal by Charles Darwin struck many people as much more plausible -- the theory of the evolution of life by natural selection acting on random variation -- and virtually overnight the fortunes of the two sides of the argument were reversed. What was then known about the natural world -- that living things were made of chemicals just like ordinary matter, that at least some of those chemicals could be synthesized in the laboratory, that at least parts of animals could be thought of as machine-like, that members of species could vary in specific traits, and that fossils of extinct animal forms existed -- made it much easier to accept the idea that the same sorts of natural forces that shaped the inorganic world of mountains and seas over long periods of time also shaped life.
The key to Darwin's success versus Democritus's failure was his proposal that animals and plants changed slowly, in small increments. If a minor random variation of a member of a species helped it survive, and if the trait could be inherited, then its offspring would eventually come to dominate the species. If this same process were repeated many times over thousands or millions of years, then the species might turn into something altogether different. Small, accumulated changes shaped by selection seemed credible in a way that body parts popping suddenly into existence did not. From that time to the present, much of the educated world has adopted Darwin's theory as the only rational way to view the physical development of life.
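The incremental mechanism just described, rare heritable variation filtered generation after generation by differential survival, can be sketched as a toy simulation. The Python model below is purely illustrative: the function name, population size, mutation rate, and fitness rule are all invented for the sketch and make no claim about any real organism.

```python
import random

def evolve(pop_size=200, generations=500, mut_rate=0.05, optimum=1.0):
    """Toy model of selection acting on small heritable variations.

    Each individual is reduced to a single trait value in [0, 1], and
    fitness is simply how close that value lies to `optimum`.  Offspring
    inherit the parent's trait, occasionally perturbed by a small random
    mutation.  No individual ever jumps to the optimum in one step; the
    population approaches it, if at all, by accumulated small changes.
    """
    population = [random.uniform(0.0, 0.3) for _ in range(pop_size)]
    for _ in range(generations):
        # Differential survival: fitter individuals leave more offspring.
        weights = [1.0 - abs(t - optimum) for t in population]
        offspring = random.choices(population, weights=weights, k=pop_size)
        # Heredity with rare, small random variation.
        population = [
            min(1.0, max(0.0, t + random.gauss(0.0, 0.05)))
            if random.random() < mut_rate else t
            for t in offspring
        ]
    return sum(population) / pop_size  # mean trait after selection
```

Run over many generations, the mean trait climbs from its starting range toward the optimum, which is the sense in which small, accumulated changes shaped by selection can transform a population.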
Yet, as I mentioned above, the balance of the argument between those who see an intelligence behind life and those who don't can change radically with our knowledge of nature. Knowledge of nature grew fitfully from the time of Democritus to the time of Darwin, but it has mushroomed in the past century and a half since the publication of On the Origin of Species. The detailed knowledge of life that we now have, which was unavailable to Darwin and his contemporaries, justifies us in asking whether the balance of the argument has shifted once again.
Nineteenth-Century Science: Candle-Powered Microscopes and Featureless Cells
Let's consider the state of biology in the mid-nineteenth century. The microscopes of that time were useful but nowhere near the sophisticated instruments that many laboratories have today. For one thing, electrical power was not available, so microscope stages had to be illuminated by candles. The sophisticated dyes that are routinely used now to highlight features of a cell were also unavailable. The invention of lasers and computers, which today are commonly used to tease out hard-to-see features of cells in modern image analysis, lay a century into the future. The upshot is that the cell, which we now know to be the fundamental unit of life, looked nearly featureless under nineteenth-century microscopes.
Because the cell looked featureless, many scientists naturally thought that, for all intents and purposes, it was indeed featureless. In a classic line, Ernst Haeckel, an eminent scientist of the age and a great admirer of Darwin, decreed that the cell was a "simple little lump of albuminous combination of carbon," not much different from a piece of microscopic jelly. Although from a modern perspective this sounds ludicrous, it was right in step with the science of the time. Several decades earlier a German chemist named Friedrich Wöhler had performed perhaps the only chemical reaction in history with deep philosophical consequences. He heated ammonium cyanate, an inorganic material, and astounded the world by producing urea, a biological waste product. This one experiment demonstrated that nonliving substances could give rise to biological material, and it shattered the distinction between the inorganic and living worlds. Living things were made of the same sorts of materials as nonliving things!
Wöhler's experiment also made the question of the origin of life seem trivial. After all, if heating ammonium cyanate created urea, why couldn't cells similarly arise from some simple process? In fact, some scientists of Darwin's day thought that cells were even then arising by spontaneous processes on the earth.
The idea of a simple cell was mirrored in the nineteenth century by the idea of a simple, eternal, and essentially unchanging universe. Astronomers thought we lived on a rather typical planet. What's more, since life was thought to be so easy to begin, it was speculated that life existed just about everywhere -- on other planets, on our moon, perhaps even on the sun. Well into the twentieth century, serious scientists thought that advanced civilizations might exist on Mars and Venus. The astronomer Percival Lowell, who regaled the public in the early twentieth century with tales of "canals" on Mars and the advanced civilization they clearly implied, thought that Darwinian principles demanded that life exist on other planets. "That evolution is nothing else than such a gradually increasing chemical synthesis is forced on one by the study of the facts," Lowell declared in Mars and Its Canals. "All that we know of the physical state of the planet points to the possibility of both plant and animal life existing there, and furthermore, that this life should be of a relatively high order is possible. Nothing contradicts this."
X-Ray Crystallography and the Advent of Modern Science
With the research of the past fifty years, the picture has shifted dramatically. If one event marks the break between premodern and truly modern biology, it is the determination in 1958 of the first structure of a protein, myoglobin, by a new technique called X-ray crystallography. Crystallography allows scientists to actually "see" the detailed shapes of the individual kinds of molecules that perform the tasks of cellular life (individual molecules are too small to be seen in detail even by the most powerful microscopes). Before 1958, it had been widely expected that proteins would turn out to be simple, appealing, symmetrical objects like salt crystals. But on beholding the convoluted, bowel-like form of myoglobin, Nobel laureate Max Perutz lamented, "Could the search for ultimate truth really have revealed so hideous and visceral-looking an object?"
The machinery of life, however, was not built to look pretty. Like machines in our everyday world -- car engines, sewing machines, farming equipment, and so on -- proteins are built to perform jobs, and their shapes must fit their cellular tasks. Furthermore, like everyday machines, proteins work by well-understood mechanical principles. That was something of a surprise a half-century ago. Because proteins had seemingly magical properties of accelerating chemical reactions, it was thought then that they might work by some theretofore unknown force of nature -- perhaps some strange magnetic field or other effect that science had never before come across. But, no, proteins are machines, just as articles in a hardware store are machines. They work by applying a physical force, as a wrench might do, or by storing force, as a loaded spring might do, or by conducting electricity, as a wire might do.
And just as sophisticated machinery in our everyday world is built up by combining many mechanical parts in precise fashion, so too is protein machinery. One particularly good example is a molecular machine called a flagellum, which quite literally is an outboard motor that many kinds of bacteria use to swim. Just as an outboard motor in our everyday world has a propeller, motor, and clamp to hold it onto the boat, so does the flagellum. And just as the motor itself contains many parts that are precisely adjusted to work with each other, so too with the flagellar motor.
One way in which the nano-outboard differs from outboard motors on human boats is that it is much more sophisticated. Outboard motors manufactured by humans are assembled in factories by intelligent workers. In the cell, however, the flagellum has to assemble itself. It does so by creating geometrically and chemically complementary surfaces on parts that are supposed to attach to each other and by avoiding the creation of such complementary surfaces on parts that should not stick to each other. Such self-assembly is common to all large cellular machines. In looking for an analogy to bring out the incredible difficulty of such a feat, a writer for Nature -- the world's most prestigious science journal -- evoked a fantastical image:
The cell's macromolecular machines contain dozens or even hundreds of components. But unlike man-made machines, which are built on assembly lines, these cellular machines assemble spontaneously from their protein and nucleic-acid components. It is as though cars could be manufactured by merely tumbling their parts onto the factory floor.
A "simple little lump" of jelly the cell most definitely is not. It is in fact an ultra-complex piece of nanotechnology, the likes of which humans never imagined. And as if all this were not enough, in recent years scientists have been discovering sophisticated controlling systems in the cell that act essentially like computer programs. The job of the control systems is to make decisions about when, where, and how much of a molecular machine to make, when to switch on or off some cellular process in response to a change in the environment, and other tasks. And like everything else in biology, the more that is discovered about the control systems, the more complex, elegant, and subtle they turn out to be.
As an aside, advances in astronomy and physics have paralleled those in biology in discovering unexpected complexity. No longer is the universe simple, eternal, and unchanging as it seemed in the nineteenth century. Advances in the past half-century have shown that our universe had a beginning in a "Big Bang" and that its laws are exquisitely fine-tuned in ways that -- purposefully or not -- allow life to exist. Change the force of gravity or electromagnetism by a fraction and no one would be here to wonder at our absence. Instead of expecting life to arise on every third planet or moon, modern scientists are now postulating the existence of uncounted trillions of unobserved universes just to increase the putative odds that life could have arisen by chance in one of them.
Is Darwinism Enough?
So the cell, the foundation of life, has turned out to be unimaginably sophisticated. Does that necessarily mean that Darwin's idea of natural selection culling random variation doesn't work? Here we have to make critical distinctions to avoid confusion; the answer doesn't have to be a flat yes or no. Perhaps Darwinian evolution can explain all of life, despite its unexpected complexity. Or it may explain some facets of life but not others. We need evidence to choose among these possible answers.
Unfortunately, until relatively recently, decisive evidence was unavailable. Darwinian evolution is thought to work over very long periods of time and to involve whole populations of a species -- thousands, perhaps millions or billions of organisms. Since the average government grant will support a scientist's research for three years, time runs out long before sufficient data can be gathered to show whether Darwin has offered us a completely satisfactory explanation. Since the first step in Darwinian evolution is thought to be the generation of a mutation in a creature's DNA, real progress in assessing the power of the Darwinian mechanism will have to await the development of scientific techniques that can track tiny changes in DNA. That technology has only become highly developed within the past decade. Now results are coming in that show, as Darwinian enthusiasts have long claimed, that random mutation and natural selection do indeed help organisms survive and thrive in changing environments. Unexpectedly, however, the great majority of such favorable mutations are ones that actually break genes or throw them away. They degrade the complexity of the genome rather than increasing it.
For example, much has been discovered in recent decades about genetic changes that give some humans resistance to malaria, a still-deadly disease that kills a million people per year worldwide. The best-known mutation that helps humans in the fight against malaria is the sickle mutation. Curiously, the sickle mutation occurs in one of the genes for hemoglobin, the protein in red blood cells that carries oxygen from lungs to tissues; it is not in a gene of the immune system, the molecular machinery ordinarily responsible for fighting infections. Tragically, if a child inherits two copies of the sickle mutation (one from each parent), she contracts sickle cell disease, more deadly than malaria. The sickle mutation in hemoglobin seems a desperate measure, not the kind of process that builds the sophisticated machinery of the cell.
Just a handful of other mutations are known to guard humans against malaria. In one, an entire gene for one of the components of hemoglobin is either broken or thrown away, resulting in a disease called thalassemia. In another, a control region in DNA, which usually switches on the gene for a protein called Duffy antigen in the red blood cell, is broken, so the protein is not made. This is helpful because Plasmodium vivax, a species of malaria, binds tightly to Duffy antigen on the membrane of the red blood cell, and uses it as a portal to enter the cell to consume the hemoglobin inside. Turning off Duffy antigen is like blowing up a bridge across a river to keep an enemy army from invading your country -- without the Duffy antigen bridge the malaria army can't invade. Yet, while they may actually be helpful in desperate times, breaking genes and blowing up bridges tells us nothing about how genes or bridges can be built in the first place.
Similar evolutionary behavior has been seen in other systems. Perhaps the best examples are from the longest-running experimental evolution study ever conducted. At Michigan State University, professor Richard Lenski (recently elected to the prestigious National Academy of Sciences) has been growing cultures of the bacterium E. coli for more than twenty years in order to see how they would evolve under his watchful eye. Although twenty years doesn't sound like a lot of time, it is actually 50,000 generations for the quickly reproducing bug, equivalent to a million years of evolution for a large animal species like humans. What's more, because they are so small, bacteria can be grown in enormous numbers -- multiple trillions of them over the course of the experiment. Because of the large number of organisms and generations, these experiments give us a wonderful opportunity to watch real Darwinian evolution in action. What has been seen?
Lenski's experimental results are similar in kind to the human evolutionary response to malaria that has been observed in nature: a number of genes have been broken or thrown away, and there is no indication that any new molecular machinery is on an evolutionary production path. The most helpful mutation seen in twenty years occurred when the E. coli genes that are used to make a sugar called ribose were switched off. This allowed the mutant bacteria to grow faster than their normal relatives. The degradation of some other genes was also beneficial.
It has long been known by scientists that the great majority of mutations that have a noticeable effect on an organism are deleterious. From the recent studies by Lenski on bacteria and other scientists on malaria and viruses, we now know that the great majority of even beneficial mutations are degradative, decreasing genetic information. Like blowing up a bridge to stymie an invading army or breaking off the side-view mirrors of your car to decrease wind resistance, these mutations help by eliminating something that may have been useful in itself but has become a net drag in changed circumstances.
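The point that a beneficial mutation can be purely subtractive admits a simple illustration. In the toy Python sketch below (every name and parameter is invented for the illustration, not drawn from any of the studies above), carrying a working copy of a gene whose product is useless in the current environment imposes a small fitness cost, loss-of-function mutations arise occasionally, and selection then carries the broken allele through the population.

```python
import random

def knockout_sweep(pop_size=1000, generations=200, cost=0.05, ko_rate=1e-3):
    """Toy model of a beneficial loss-of-function mutation.

    Each individual either carries a working copy of some gene (True)
    or a broken one (False).  In the modeled environment the gene's
    product is useless, so making it is a small net drag on fitness;
    breaking the gene removes that drag.  Knockouts arise at rate
    `ko_rate` per individual per generation, and, since breaking a
    gene is far easier than rebuilding one, no reverse mutation is
    modeled.
    """
    population = [True] * pop_size  # everyone starts with the working gene
    for _ in range(generations):
        # Individuals still paying the cost leave slightly fewer offspring.
        weights = [1.0 - cost if working else 1.0 for working in population]
        offspring = random.choices(population, weights=weights, k=pop_size)
        # Occasional new loss-of-function mutations.
        population = [
            False if (gene and random.random() < ko_rate) else gene
            for gene in offspring
        ]
    # Fraction of the population carrying the broken (advantageous) allele.
    return sum(1 for gene in population if not gene) / pop_size
```

In this sketch the broken allele spreads even though nothing new has been built, which is the shape of the Duffy-antigen and ribose-gene examples discussed above.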
From our best pertinent research, it appears that Darwinism does indeed work, and can contribute in crucial ways to the well-being of a species, such as fostering malaria-resistance in humans. However, recent studies show it most often helping by subtraction -- by breaking or degrading preexisting parts of an organism's genetic inheritance. It also fails to work in a coordinated, coherent fashion, as when it dragoons non-immune system genes into the fight against malaria. While this can be helpful, these are not the marks of a process that can build sophisticated molecular machinery.
The Limits of Darwinism
The argument over intelligence behind nature has come full circle from the days of Democritus. From an implausible minority position, the argument for the existence of a natural process that could mimic intelligence has seen its fortunes rise rapidly with the advance of biology in the nineteenth century, and then plummet back into implausibility in the wake of much greater advances in the twentieth and twenty-first centuries. For those people who discern that there are more things in heaven and earth than are dreamt of in Darwinian philosophy, the work of modern biology gives them much succor.
We are left with a final question: if the results of modern science point strongly to intelligence behind the universe and life, why do so many modern scientists and others resist the conclusion so fiercely, especially when polls show that the great majority of Americans hold some sort of religious belief? There are probably many factors that go into the answer; I will mention what I think are the two chief reasons. The first, most common among scientists, is a kind of intellectual inertia mixed with professional pride. Darwinian evolution has been the presumed explanation for life ever since the days when the cell was thought to be simple; it's hard to get out of an intellectual rut, especially when critical experiments are difficult to do. Furthermore, admitting to an intelligence behind life would seem to reduce the importance of science, which would then no longer be in the business of explaining everything about the world. Who wants their chosen field to be downgraded?
There is often a different reason why some non-scientists detest the conclusion of intelligence behind nature -- politics. In America, ever since the 1925 Scopes trial, antipathy to evolution in general and Darwinism in particular has been a trait associated with the religious right. Yet it has always been due more to a historical accident than to some necessary philosophical connection. William Jennings Bryan, the anti-evolutionary attorney of the trial, was himself a staunch progressive. A populist and champion of women's suffrage, Bryan fought Darwinism because he thought it portended baleful effects for mankind. The notion of "survival of the fittest" was used by some to justify ignoring the plight of the poor, promoting war between nations, and sterilizing tens of thousands of American citizens against their will who were deemed to have inherited criminal tendencies or deficient intelligence. Bryan, "The Great Commoner," would have no part of a school of thought that so devalued common people.
The argument over design of the universe has been going on ever since the beginning of recorded history, ever since the first person asked, "How did it all get here?" Across centuries and cultures, whatever their views of how life should be lived on earth, the great majority of humanity has discerned from nature the existence of some purpose for the universe and life. Since the answer to the question depends on our knowledge of nature, the balance of the argument shifted in the nineteenth century when humanity gained some knowledge of biology, and Darwin proposed his plausible theory. But with the past century's great leaps in knowledge, and especially with the discovery of the elegance of the cell -- the foundation of life -- Darwin's theory has now become as plausible as Democritus's theory, and humanity's ancient view of intelligence behind life has been vindicated. That nonsectarian knowledge is the inheritance of all peoples, regardless of political, cultural, or religious views.
Michael J. Behe, the author of Darwin's Black Box and The Edge of Evolution, is professor of biological sciences at Lehigh University.
Behe, Michael J. 2010. "Does Nature Manifest Intelligence?" Tikkun 25(6), Online Exclusives.