Evolutionary Just-So Story, Again!

So yet again we have a story of evolution that seems to say that evolution works like God, i.e., that it indulges in design. I am referring to an article recently published in the New York Times reporting on research into why the squid lost its shell. The phrasing of the article will, in the minds of the naive, create the impression that the squid lost its shell in order to move faster to escape its predators (shells being rather heavy and cumbersome). “The evolutionary pressures favored being nimble over being armored, and cephalopods started to lose their shells.” This seems to be an innocent enough statement, but its construction implies that the pressure to become nimble preceded and caused the loss of the shells.

That is design. It may not be God design, though one could easily make that leap, but it is design nonetheless.

Oh, if only they would read Lucretius!

Here’s what really happened: Originally, “squids” were shelled creatures; generation after generation were shelled. Occasionally, a genetic mutation or defect (call it what you will) resulted in progeny lacking shells. No doubt, most of these shell-less individuals quickly died or were eaten and left no progeny; but at some point, some of them survived (perhaps thanks to another mutation that enabled them to move more quickly than their shelled relatives) and reproduced, eventually giving rise to a new class of creatures, squids and octopuses, etc. In other words, the change occurred first, without intention or purpose, and the benefit followed. The change did not occur in order to confer the benefit. It just happened.

Of course, such changes often occur gradually, say by shrinking the shell over many generations, in what some have called “path dependency” (i.e., evolution follows an already established path and does not go backwards; in other words, it doesn’t restore the shell to creatures that have lost it). But the principle remains the same: first the change, and then, if it happens to confer an advantage, it sticks.

As Lucretius said, humans did not develop opposable thumbs in order to grasp tools; we can grasp tools because we have opposable thumbs.

What Is a Species?

That science is a human enterprise and not some pure and perfect object independent of culture is highlighted by a recent investigation into the DNA of American wolves—the gray wolf, the Eastern wolf, and the red wolf. An article in the New York Times (7/27/16) reports that analysis of the DNA of these three wolf species reveals that in fact “there is only one species [of wolf] on the continent: the gray wolf.” The other two are hybrids of coyotes and wolves—Eastern wolves are 50/50, red wolves are 75 percent coyote and 25 percent wolf. The investigators also concluded that the wolf and coyote species shared a common ancestor only 50,000 years ago, which is very recent in evolutionary terms.

Now, anyone comfortable with the fact that nature goes its own way without regard to the human need for tidy intellectual categories is not likely to be much disturbed by these findings. But such people are relatively rare, especially in academic and political circles, so it happens that certain people do find it disturbing that Eastern and red wolves are hybrids. That is, they are not “pure” and therefore may not be entitled to protection from, say, extermination under such laws as the Endangered Species Act. In a sense, they are not “natural” because—well, because they violate the notion of the purity of species; they don’t fit neatly into our conceptual categories. As one scientist, dissenting from the worry warts, was quoted: “We put things in categories, but it doesn’t work that way in nature.”

Indeed it doesn’t. In fact, it couldn’t. If the notion of “species” as neatly distinct forms of life, immune to crossings of the so-called “species barrier,” were true (along with the other common myths about the “logic” of evolution), evolution would grind to a halt. Evolution requires messiness, contingency, happenstance, the unexpected, in order to work. For example, genetic mutations do not magically appear on cue in response to environmental pressures, just in time to save a species from extinction. Instead, a mutation lies quietly in the background, sometimes for many generations, and emerges as the crucial factor of salvation (for those individuals who carry it, and their descendants) when and if a factor in the environment calls it forth.

I am reminded of a startling discovery during the height of the AIDS epidemic in America, that some individuals, despite a particularly risky lifestyle, were immune to the disease. Turns out, they carried a mutation that had first manifested itself centuries earlier, during an epidemic of an entirely different disease, bubonic plague. One could describe how this mutation protects against both diseases, but one could not explain why—why this gene mutation occurred in the first place, why it just happened to confer immunity or resistance to these two quite different diseases (one caused by a bacterium, the other by a retrovirus), and why it resided silently in the genomes of its fortunate carriers for so many generations before it could prove its usefulness.

A fundamental goal of all human endeavors is to reduce the entangled complexities of life, including our own, to a simple set of principles that fit the limitations of the computational power of our little brains, a mere three pounds of meat, of which only a relatively small portion engages in the tasks of reasoning. Not surprisingly, it is difficult to wrap our heads around the genuine complexity of the earth we inhabit, let alone of the cosmos. Being the limited creatures that we are, we need our categories—but let’s not worship them. Let’s not condemn the Eastern wolf and the red wolf to extermination just because they mess up our laws.

You Lie!

One of the questions epistemology tries to answer is, how do we know? This broad question breaks down into a number of narrower questions, among which is how do we know that what we know is true? Hence, how do we know that a statement (an assertion) by another person is true? How do we know that an assertion is not true? How do we determine that a statement is a lie?

Just as interesting: How is it that we are susceptible to believing a statement is a lie when in fact it is not?

How is it that climate deniers can continue to believe that climate change is a hoax, a deliberate lie, a conspiracy by a world-wide cabal of leftists and humanists (synonyms, I suppose)? I don’t refer here to the oil executives and conservative politicians who know perfectly well that climate change is real and that it is human activity that is causing it (i.e., the real conspirators), but to the average Joes and Janes who believe firmly and without doubt that climate change is a lie, the ones who pepper the reader comments of, for example, the Wall Street Journal, with their skepticism at every opportunity—even when the article in question has nothing to do with climate change, or even the weather. Climate change deniers are just a convenient example of the problem—there is virtually no end to the number of topics on which people firmly, often violently disagree, on the left as well as the right.

There are two basic means by which we determine the truth or falsehood of statements (assertions), the specific and the general. By the specific I mean such things as data, facts, direct observation, and so forth—the basic stuff of science. Objective evidence, if you will. We determine the truth of a statement by the degree to which it conforms with the facts. If someone says, “It’s a bright sunny day,” but in looking out the window I can see that it is gray and raining heavily, I have proof based on direct observation that the statement “It’s a bright sunny day” is false. It might even be a lie, depending on the motive of the person who made the statement; or it might not be a lie but simply an error.

However, if I’m in the depths of an office building where there are no windows, and someone comes in and says, “It’s a bright sunny day outside,” how do I determine if his statement is true or false?

By the general I mean the use of such things as theories, ideologies, world views, traditions, beliefs, etc., as templates to determine the truth of statements by, in a sense, measuring how well the statement conforms to the parameters or principles of the theory (etc.)—a theory (etc.) which, of course, we have already accepted (through one or more of the various ways in which theories become accepted). For example, diehard creationists evaluate the claims of Darwinism according to a strict Biblical literalism, the theory that every word of the Bible was directly inspired by God and is therefore true and that the Bible taken as a whole conveys His divine plan of human history from beginning to end. So Darwinism, which not only denies the seven days of creation (in 4004 BC) but also provides no basis for teleological views of the history of life, is godless and therefore untrue. The “so-called facts” of Biblical scholarship and biology aren’t facts at all and can be dismissed out of hand.

Something not dissimilar occurred among Marxist intellectuals in England, France, and the United States during the Stalin era, when such luminaries as Sartre refused to believe the horrors being perpetrated in the Soviet Union because they did not conform to Marxist theory. Theory trumps reality in multitudes of cases, usually in ways less obvious than the errors of creationists and Marxists. Consider the political situation in the United States today as a near-at-hand example of the power of ideology, any ideology, to deny facts, or worse, to consider facts, and the people who bring them to our attention, outright lies.

“It’s a bright sunny day.” Perhaps the person who makes this statement is an extreme optimist who believes that if he repeats the assertion often enough, it will be true; or perhaps she will point out that somewhere on this planet it is a bright sunny day even if it isn’t here. Or maybe it’s a cruel lie perpetrated so that you will walk out to lunch without your umbrella and get soaked to the skin (ha, ha!). Or maybe he’s a politician who fears that if he speaks the truth (“There’s a mighty storm brewing.”) he will lose the election. And if you are a true believer, you will believe him, walk out to lunch without your umbrella, get soaked, and declare that the Senator was right, it is a sunny day—or maybe deny that he ever said that it was. You might recall years later that the Senator said it was a sunny day, or morning in America or whatever, and by golly he was right and history will recognize him as a great man. There are people in Russia who are nostalgic for the days of Stalin. It is said that the current head man of China wants to return to the ways of Mao. Could the Confederacy rise again?

Theories, ideologies, world views, all the general ways in which we measure truth and falsity are particularly resistant to correction or debunking, in part because we invest a great deal of life’s meaningfulness in our own special theory, in part because once we have adopted a theory (which we often do, without thought, very early in life), we get in the habit of measuring, evaluating, judging, and deciding according to its parameters. It is our paradigm, our gestalt, without which we could not make sense of the world. True or false, creationism, and the whole system of beliefs of which it is a part, makes sense and gives meaning. True or false, evolution, and all that follows from it, makes sense, though for most people it does not provide much meaning. Many people think that the sciences in general don’t provide humanly useful meaning, the kind of meaning that motivates us to get up in the morning, to care about voting, to raise children, to have something meatier than “values” to guide our lives. We are even willing to “[depart] from the truth in the name of some higher order” rather than risk meaninglessness. Hence the attraction of -isms: Communism, Darwinism, Creationism, Capitalism, Feminism, Terrorism—any -ism that can confer on us something other than the inevitable insignificance of being only one out of seven billion people, who are only seven billion out of the 107 billion people who have ever lived—and who knows how many more in the future. The less significant we are, the more selfies proliferate. Every -ism is a kind of selfie.

Are We Alone?

Many people are fascinated by the Drake equation, which is said by enthusiasts (and Sheldon Cooper) to estimate the number of planets in the Milky Way Galaxy (or, in some minds, the universe) that could have intelligent life. Recent remote explorations of Mars have suggested that there may have been life on that planet at some time in the past, perhaps even that there is still some kind of microbial life there now. Science fiction thrives on speculation that other galaxies are inhabited by someone and/or that human beings could colonize other worlds. Elon Musk, inspired by his reading of the Dune and Foundation trilogies, believes that space colonization is the way to save humanity from extinction.
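
For reference, the equation itself is nothing more than a chain of multiplied estimates; in its standard form it gives N, the number of civilizations in our galaxy whose signals we might hope to detect:

$$ N = R_{*} \cdot f_{p} \cdot n_{e} \cdot f_{l} \cdot f_{i} \cdot f_{c} \cdot L $$

Here R* is the rate of star formation, f_p the fraction of stars with planets, n_e the number of potentially habitable planets per such star, f_l, f_i, and f_c the fractions of those on which life, intelligence, and detectable technology arise, and L the lifetime of a broadcasting civilization. The last four factors are sheer guesswork, so the answer can be made to come out almost anywhere one likes.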

Supposedly more sober minds ponder the theological and philosophical implications of extraterrestrial life: Would religion survive such a revelation? Could our anthropocentric theologies survive the knowledge that there are other civilizations somewhere out there, which perhaps would have very different notions of both the questions and the answers that we think of as essential to religion? In Christian terms (and this seems to be a worry primarily within Christian cultures), did Christ die to save all those aliens, too? Or does each planet require its own redemption? Or are we the only planet to have fallen from grace? (Are all the other inhabited planets still in a state of Eden? Did their creatures in the image of God choose better than Adam and Eve? Are they, Gnostic-like, angels or demigods, watching our passion play unfold?)

Or worse. Would the discovery of (or our being discovered by) extraterrestrials put an end to religion for good? Some philosophers think so, and the discovery would certainly be an existential challenge to religion as we have always conceived it, that is, again, the Christian conception, which has always assumed a teleological narrative of history that puts mankind at the very center of the struggle between good (God, Jerusalem, spirit) and evil (Satan, Babylon, the flesh, etc.), culminating in the final triumph of good and the restoration of creation to its original innocence. This is not the narrative of Hinduism, Buddhism, and other non-Western religions, so perhaps for them the existence of extraterrestrials would be no problem.

It is both scientifically and popularly assumed that, given the infinity of space and the multi-multitudes of stars and planets, there must be life elsewhere, likely many elsewheres, in the universe, some of which must be much more advanced than we are (oddly, the opposite, being much less advanced, is less often mentioned, but it’s perfectly possible that there are planets out there populated by nothing more complex than bacteria or slime mold); assumed even though we have no proof of any kind that in fact there are any other inhabited planets. So when a scientist asserts his certain belief that there has to be life elsewhere in the universe, he is indulging in science fiction or some form of religion.

But there is another possibility not to be unquestioningly dismissed, one that Marilynne Robinson posits in her most recent collection of essays, “The Givenness of Things”: What if “for all we know to the contrary, there is just one minor planet in a limitless field of stars where apple trees blossom and where songs are sung”? Would that not “grant an important centrality to that planet”? For Robinson that centrality would be a religious one; it would suggest that there is some likely divine reason for only one living planet, contrary as it is to (limited) human reason.

But even from a purely secularist viewpoint, consider what it would mean to know that there is no other living planet, no other intelligent life than ourselves, and that if we were to go extinct there might never again be such intelligent life (no songs being sung, no theories being proposed, no knowledge of the kind we honor with that name); worse, that if we somehow managed to extinguish all sensate life from this planet, it might never again be formed; that there is no escape or rescue, nor even an end to ourselves at the hands or tentacles of a superior alien race. To know all that would put the responsibility for our own fate and that of all life permanently and squarely in our own hands and no others.

And isn’t that, from a practical point of view, exactly the position we are in? Colonies on Mars are a dangerous fantasy; colonies further out in the galaxy or the reaches of the universe are more improbable than fairy tales. The human body evolved on this planet and is adapted (and adaptable) to no other. Any other planet that might have some form of life is many light years away—and remember that a light year is the distance light travels in a year (5,878,499,810,000 miles); multiply that by 1,400 to get the distance from earth to the nearest earth “twin,” a planet that, by the way, would be even less hospitable to humans than Mars. And of course, we know of no way to transport humans and cargo at anywhere near the speed of light. Planets even further out, and getting further away as the universe continues to expand, might as well, for practical purposes, not exist at all. In sum, it can make no difference to us if there are other inhabited planets. We are for all intents and purposes truly alone in the universe.
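
Worked out with the figure above, the arithmetic looks like this:

$$ 5{,}878{,}499{,}810{,}000 \ \text{miles per light year} \times 1400 \ \text{light years} \approx 8.2 \times 10^{15} \ \text{miles}. $$

That is roughly eight quadrillion miles. At, say, 40,000 miles per hour, about the speed of the fastest probes we have sent out of the solar system, the trip would take on the order of twenty million years.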

We Are All Still Animists

“[Children do not] have to be taught to attribute people’s behavior to the mental states they’re in. Children tend, quite naturally, to anthropomorphize whatever moves. What they have to learn is which things don’t have minds, not which things do.”
–Jerry Fodor (“It’s the Thought That Counts,” London Review of Books, November 28, 1996)

Iconoclastic statements have always appealed to me, particularly because they cause me to look at the iconic statements they are set against in a new and critical light. Sometimes the iconic statements survive the scrutiny; oftentimes they don’t. In this case the iconic statement, that children learn that other people have minds of their own (theory of mind) over time, seems commonsensical until it is re-read in light of Fodor’s statement. Then it appears less evidently true.

Look at the first part of Fodor’s statement, that children “quite naturally . . . anthropomorphize whatever moves.” To anthropomorphize is to attribute human characteristics, in particular a mind with such things as motives, desires, feelings, etc., to nonhuman things. But, in my experience, not just to things that move (pets, for example), but also to things that don’t move: Dolls and figurines don’t move, though they look like they could, but small children also attribute feelings to objects that, to an adult, clearly are inanimate, such as blankies and other favored possessions; hence their sense of tragedy when the blankie disappears into the laundry hamper, or the favorite rubber ball deflates.

Nicholas Wade’s Troublesome Inheritance: A Critical Review

In his latest book, Nicholas Wade, a well-known science journalist, argues three points: (1) that human races are real, (2) that differences in human behavior, and likely cognition, are genetically based, and (3) that there are likely subtle but nonetheless crucial behavioral differences among races, which are also genetically based. Wade is well aware that these are extremely controversial ideas, that they overturn politically correct notions that human behavior and social structures are purely cultural, yet he is confident that developments in genetics support his view.

The Mismeasure of All Things

Some 2500 years ago, Protagoras said that man is the measure of all things. By this he meant something like, mankind can know only that which it is capable of knowing, which in effect is a recognition that the human mind does have its limits; but Protagoras’ statement has often been taken to mean that man is the standard by which all other things are to be measured, i.e., that mankind is the standard of comparison for judging the worth of everything else. This meaning may have been colored by the Christian concept of man as the object of divine history, of man as just a little lower than the angels. The Christian concept, in its turn, derives from a common interpretation of the creation story in Genesis, in which God gives man dominion over the rest of earthly creation.

However, while both Protagoras’ saying and the Genesis story carry the concept forward through history, neither explains how the idea actually originated. It may have been Giambattista Vico (1668-1744) who first recognized that it is ignorance rather than knowledge that makes man the measure of all things: “When men are ignorant of natural causes producing things, and cannot even explain them by analogy with similar things, they attribute their own nature to them.” That is, when primitive men and women surveyed the world and sought explanations of phenomena, they had nothing to go by other than what they knew about themselves, so that, for example, a terrible destructive storm could be explained as the anger of the gods, since when human beings became angry they too engaged in destructive behavior; or when a gentle rain caused plants to grow, the gods were in a good mood, perhaps pleased by some human act of worship, because when humans were in a good mood, they engaged in benevolent acts. After all, the earliest humans could not have had any knowledge of the material causes of storms, droughts, etc., nor of course of animal behavior, which they attributed to motives much like their own. As Stephen Toulmin and June Goodfield summarize Vico’s views, in primitive mythologies people “could measure the world of Nature only by that which they already knew—namely themselves” (The Discovery of Time).

Both Protagoras and Genesis simply give more sophisticated glosses on this primitive impulse. They reflect the increasing body and complexity of knowledge developed by ancient civilizations, particularly those that had developed writing systems, which in turn enabled them to impose order on what had been a plethora of local myths and their variants. Simply by creating relatively coherent pantheons containing gods with discrete attributes, roles, and positions in a divine hierarchy, ancient civilizations were able to organize their intellectual world and provide authoritative explanations. Monotheism carried this further, by providing an even more unified world view, but it also somewhat depersonalized the concept of God, making him more abstract and less personal (e.g., no images or idols, no household god or genie of the local spring, etc.). This was an important achievement in the ongoing development of knowledge, a necessary step in the process that led to the state of knowledge we enjoy today, in large part because it put more emphasis on cerebral, intellectual modes of understanding rather than personal and experiential ones—in a sense, creating theory to replace myth. Thus we see the Greek philosophers creating the first science and the Jews creating the first inklings of theology and, importantly, teleology (a sense of history with a goal towards which it was moving). Nevertheless, the Judeo-Christian god retained strong anthropomorphic features, especially in the popular imagination and in visual arts, in which, for example, God the Father was usually depicted as a white-haired old man. Perhaps as long as most people were illiterate and dependent on visual media for their abstract knowledge, anthropomorphism was to be expected.

The Western European, Christian intellectual (literate) tradition combined these two strands of ancient thought, the scientific/philosophical with the historic/teleological, setting the stage for a modern world view that sees the world as making coherent sense and as operating according to consistent, universal laws, which then can be exploited by human beings for their own betterment. As scientific knowledge expanded and material explanations could be provided for phenomena that once were viewed as signs of divine intervention, God receded to the back of men’s minds as less necessary to explain the world—at best, perhaps, He became little more than the Prime Mover, the one who got it all started or the one who established the universal laws which continue to operate without His immediate intervention. But if the Age of Reason or the Enlightenment put God into retirement, it did not give up the belief in coherent laws and the quest for universal theories, nor did it give up the teleological view of history.

It is important to note that the teleological view is always a human-centered view; history, whether of cosmos, nature, or society, was still about man; very few thinkers ventured to speculate that man might be merely one among many creatures and phenomena rather than the point of the whole enterprise. In this sense, at least, the early modern era retained the primitive impulse to both anthropomorphism and anthropocentrism. The widespread acceptance of Darwin’s theory of evolution by means of natural selection did little, indeed perhaps nothing, to change that for most people. It was not difficult to switch from believing that God had created man for dominion over nature and as the center of the historical story of fall and redemption, to believing that evolution is teleological, both in the sense of inevitably leading to the emergence of Homo sapiens as the crowning outcome of the evolutionary process and in the sense of evolution as a progressive process. And it was easy enough, in the context of nineteenth-century capitalism, to believe that modern industrial culture was the natural continuation of progressive evolution—indeed was its goal.

It took a generation or more for it to dawn on people that Darwinism, along with the geological discoveries regarding the great age of the earth and the astronomers’ and physicists’ discoveries of the even greater age of the universe, implied there is no god at all, not even the reticent god of the Deists. One would think that once this implication struck home, both the teleological and the anthropocentric views would fade away. But, perhaps due to human vanity, neither has done so.

In a supremely ironic twist, both teleology and anthropocentrism have been inverted. Whereas the theological age measured other creatures in human terms, the evolutionary age measures humans in animal terms. We are no longer a little lower than the angels but only a little bit higher than the other animals—or maybe not even that. We are naked apes, talking apes, singing apes. We are like social insects, we are vertebrates, we are aggressive because we are animals seeking to maximize our survival, we are merely transportation for the real biological players, selfish genes. We are not rational or conscious, we do not have free will, we operate by instinct, each of our seemingly advanced traits is hard-wired. Our morality is nothing more than an adaptation. We take a word like altruism, which originally meant a certain kind of human behavior, apply it to ants, where it becomes a description of instinctive eusocial behavior, and then re-apply that meaning back onto humans, thus making us just like all the other animals. Therefore, we study them in order to understand ourselves. We focus on the similarities (often slim) and ignore the differences (often radical).

This continues the old habit of anthropomorphism in new guise and fails to recognize the independent existence of other creatures—their independent lines of evolution as well as their ontological separateness from us. We unthinkingly repeat that humans and chimps share 96 percent of their genes (or is it 98 percent?), as if that meant something—but then, it’s said we share 97 percent of our genes with rats. We neglect to mention that apes and humans diverged from each other some 7 to 8 million years ago and have followed independent lines of evolution ever since. We are not apes after all.

Consider the fruit fly, that ubiquitous laboratory subject which has yielded so much knowledge of how genes work. It is often cited as a model of human genetics and evolution. But consider what Michael Dickinson, a scientist (he calls himself a neuroethologist) at the University of Washington (Seattle), has to say about fruit flies: “I don’t think they’re a simple model of anything. If flies are a great model, they’re a great model for flies.” To me, this is a great insight, for it recognizes that fruit flies (and, frankly, insects in general) are so unlike us that to study them as if they were a model of anything other than themselves, as a model of us, is in a sense not to study them at all. It is rather to look into their compound eyes as if they were mirrors showing our own reflections. It is a form of narcissism, which perhaps contains the seeds of our own demise.

Our demise, because in continuing to look at nature as being about ourselves we continue the gross error of believing we can manipulate nature, other organisms, the entire world, to our own narrow purposes without consequences. It turns other organisms into harbingers of Homo sapiens, narrows research to that which will “benefit” mankind, and misses the very strangeness of life in all its diversity and complexity. It continues the age-old world view of human dominion and fails to recognize that our “dominion” is neither a biological necessity nor a feature of the natural world. Dominion is a dangerous form of narcissism which a maturely scientific age should discard.

Against System

As many people know, the world’s domestic honeybees are seriously threatened by a condition called colony collapse disorder, in which once-healthy colonies suddenly disappear altogether, virtually overnight. Many possible causes are cited, including mites, malnutrition, viral diseases, pesticides, and so forth, and it is routine to call for “more research.” Don’t ya just love it? Whenever we don’t want to face the truth, we call for “more research.” Yet the film demonstrates that more research is not needed. The reason for colony collapse is crystal clear: we have turned living organisms, the bees, into industrial units, mere nuts and bolts moving along a factory line, without regard to the fact that they are living organisms—much as we have other animals, such as cattle, chickens, hogs, and fish.

Why Determinism?

The eternal debate between determinism and free will has lately taken a new form. Determinism has been reincarnated in the shape of neuroscience, with attendant metaphors of computers, chemistry, machines, and Darwinism. Meanwhile, defenders of free will seem to have run out of arguments, particularly since, if they wish to be taken seriously, they dare not resort to a religious argument. That the debate is virtually eternal suggests that it is not finally resolvable; it could be said in fact that the two sides are arguing about different things, even though they often use the same terminology.

Determinism’s popularity is most clearly suggested by the sales figures for books on the subject and by the dominance of the view in popular science writing. Such books are widely reviewed, while those arguing for free will are neglected, especially by the mainstream press.

The question, then, is not whether we have free will, or whether we are wholly determined in all our thoughts and actions, but rather why, at this point in time and particularly in this country, determinism is so much more popular than free will.

Today’s determinism is not the same as the ancient concept of fate. Fatalism was not so much about determinism or, as the Calvinists posited, predestination; fatalism did not pretend to know what would happen, but rather held that fate was a matter of unpredictability, of whim (on the part of the universe or of the gods, etc.), and in fact left some room for free will, in a what-will-be-will-be sort of way; i.e., because outcomes were unpredictable, one had to choose, one had to act, and let the dice fall where they may. The tragic flaw of hubris was exactly what is wrong with any determinism: the delusion that one can stop the wheel of fate from turning past its apex, that through prediction one can control events.

Determinists worship predictability and control. I once read somewhere the idea that, if everything that has already happened were known, everything that will happen could be accurately predicted. Extreme as this statement is, it accurately summarizes the mindset of the determinists. It also suggests why determinism is so attractive in a scientific age such as ours, for science is not only about the gathering of facts and the formulation of theories but also about using those theories to make predictions.

Given the apparent power of science to predict accurately, and given that prediction is predicated on a deterministic stance, it is not surprising that scientists should turn their attention to the human condition, nor that scientists, being what they are, tend to look for, and find, evidence that human thoughts and behavior are determined by genes, neurons, modules, adaptations, what have you, and are therefore predictable. Nor is it surprising that, in a restless and rapidly changing world, laymen are attracted to these ideas. Certainty is an antidote to powerlessness.

If we are religiously minded, we find certainty in religion; hence the rise of politically and socially powerful fundamentalist movements today. If we are not religious, we may find certainty in New Age nostrums, ideologies, art, bottom lines, celebrity worship, or even skepticism (no one is more certain of his or her own wisdom than the skeptic). If we are politicians, we look for certainty and security in megabytes of data. If we are scientifically minded, we find certainty in science. But certainty is not science. It is a common psychological need in an age of uncertainty.

In satisfying this need for certainty, determinism often leads to excessive self-confidence and egotism—which in turn leads to simplifications and dismissal of complexity, ambivalence, and randomness. Determinism is teleology. Today’s determinists may have discarded God, but they still believe that He does not play dice. They are, in short, utopians. We all know where utopias end up. That much at least we can confidently predict.

Paleolithic Fantasies

We live in an age like all previous ages, one in which thinking people assess the state of the world, find it wanting, and consequently seek a better, even perfect, way of life. Such people tend to divide roughly into those who seek their utopias in a vision of the future (today: think digital prophets, genetically modified crops) and those who seek a return to a golden past when human beings were in perfect harmony with nature (past: think Eden and the Noble Savage; today: think organic farming, artisanal cheese). Interestingly, one finds both types among both liberals and conservatives, though usually with different emphases (liberals tend to go for the organic, conservatives for traditional morality, while both seem to think that digital technology holds great promise for the future, either through greater community or better security). And advocates of both sides seem to appeal, either implicitly or explicitly, to “human nature” as the ultimate measure of the perfect way of life (using either Darwin or the Bible as the validating text). Thus, the assumption goes, amid all the changes of outward circumstance, human nature has remained unchanged through time.

Marlene Zuk, author of Paleofantasy: What Evolution Really Tells Us about Sex, Diet, and How We Live (W. W. Norton, 2013), addresses the myth, the just-so story, of a fixed human nature from an evolutionary perspective. An evolutionary biologist currently associated with the University of Minnesota, Zuk has conducted extensive field research, particularly on crickets, and is the author of numerous specialized articles and several popular books on evolutionary biology, behavioral biology, and sexual selection. She is therefore particularly well-qualified to demolish popular myths about human evolution, which she does with clarity and wit in this new book. (Her wit is best illustrated by her statement that “After all, nothing says evolution like a brisk round of the plague.”) Her immediate targets here are evo-myths about diet and health, particularly those that base their tenets on the very false idea that contemporary human beings are Paleolithic creatures uncomfortably and unhealthily stuck in an unnatural modern industrial environment. In other words, the natural man, the Noble Savage, the Eden which we have lost, is to be found in the lifestyles of early Stone Age humans prior to the development of agriculture (the true Original Sin) and settled life, that is prior to about 10,000 years ago. Supposedly, humans of the Paleolithic lived in that much admired perfect harmony with nature, and to restore our health and souls, we need to retrieve that lifestyle and apply it to our urbanized lives today.

Alas, like all utopian dreams, whether of past or future, what Zuk calls paleofantasies are exactly that, fantasies, and in the course of demonstrating just how fantastic they are, she treats her readers to a particularly clear and nonideological series of lessons on what evolution really is. And what it is not: it is not purposeful and it is not perfect or ever perfected. Thus, she demolishes the notion of the Noble Savage (by whatever name) when she writes that there is no utopian moment of perfect synchronicity between human beings and their environment. Both organisms and environments constantly change (and both humans and environments certainly did over the 2.6 million years of the Paleolithic period), and to think that today’s human beings are unchanged from those of even a mere 10,000 years ago “misses the real lessons of evolution” and “is specious” (p. 59). And lest we think that evolutionary change moves in some kind of logical direction, she writes that “evolution is more of a drunkard’s walk than a purposeful path” (p. 78).

Evolution never intends anything. It is a Rube Goldberg contraption, or rather the creatures it throws up are, because, rather than aiming at or achieving perfection, it measures success only by reproductive success. “If something works well enough for the moment, at least long enough for its bearer to reproduce, that’s enough for evolution” (p. 8). When you think about it, this is actually an excellent measure, simply because “perfection” is purely a human concept, and no one can agree on just exactly what perfection is. Should we eat only meat, because, as some paleo diet buffs claim, that’s what our Pleistocene ancestors ate? Or should we eat only raw vegetables and fruit, because, as other buffs claim, those were the exclusive menu items of our ideal past? Should we eschew grains, because they are cultivated and therefore not natural? Just exactly what would the “perfect” diet for human beings consist of?

According to Zuk, it depends. As she shows, various populations of human beings have evolved to utilize foods that our hunter-gatherer ancestors would not have been able to eat. For example, adults of some populations can digest milk, while the majority of human adults cannot (lactose intolerance). Certainly, the latter should avoid dairy, but the former can consume dairy products pretty much as they please. As for the deleterious effects of agriculture, yes, it appears to be true that initially human health and well-being declined after people began cultivating grain crops and living in permanent settlements, but Zuk points out that it did not take all that long for this disadvantage to disappear; and as we know, agricultural societies grew larger and faster than foraging societies (reproductive success again being the measure of evolutionary success). Certainly some genetic mutations could have occurred that conferred a greater ability to prosper on a diet high in grains; but it is also possible that as people improved their knowledge of cultivation and selectively improved the quality of their crops, and also exploited the advantages of settlements in facilitating trade, they overcame the initial disadvantages of agriculture. But whatever the case, it’s important to keep in mind that the early agricultural peoples themselves apparently thought that the advantages of agriculture outweighed its disadvantages—why else persist in farming?

An analogous point could be made about our modernity: If modern urban life is so bad for us, so unnatural and maladaptive, why did we develop it in the first place? If we are really, as some do argue, merely products of biological evolution like any other animal and, as some do argue, our consciousness is merely an illusion, how did we “evolve” a state of affairs so contrary to our biological being? And why do we cling to it so tenaciously? If it were really so horrible, wouldn’t we be fleeing the city for the more natural environments of the northern woods or western prairies (the United States’ closest approximation of the Edenic savannahs)? The fact that we do not suggests that urban industrialized life may not be so bad for humans after all. (How bad it may be for other organisms is a different question.)

Whatever the sources of some people’s dissatisfaction with modern human life, a mismatch between our Paleolithic natures and modernity is not one of them, and the appeal to evolution is, as already noted, based on a misconception of what evolution is. A major aspect of that misconception is an over-emphasis on natural selection. But as Zuk points out, “it is important to distinguish between two concepts that are sometimes—incorrectly—used interchangeably, evolution and natural selection. At its core, evolution simply means a change in the frequency of a particular gene or genes in a population” (p. 251). The mechanisms by which these gene frequency changes occur include not only natural selection, but genetic drift, gene flow, and mutation. “Genetic drift is the alteration of gene frequencies through chance events” (p. 251). “Gene flow is simply the movement of individuals and their genes from place to place, an activity that can itself alter gene frequencies and drive evolution” (p. 252). “The final way that evolution sans natural selection can occur is via those mutations, changes in genes that are the result of environmental or internal hiccups that are then passed on to offspring” (p. 252). In order to see whether or not evolution is occurring in humans today, one does not look at superficially visible traits but at changes in gene frequency among human populations.
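
To make "evolution without natural selection" concrete, here is a minimal sketch, my own illustration rather than anything from Zuk's book, of genetic drift: a single allele's frequency wandering from generation to generation by chance sampling alone, sometimes being lost or becoming universal for no reason at all.

```python
import random

def drift(freq=0.5, pop_size=100, generations=200, seed=1):
    """Track an allele's frequency under genetic drift alone:
    no selection, no advantage to either variant -- each generation
    is just a random sample of the previous one."""
    random.seed(seed)
    history = [freq]
    for _ in range(generations):
        # Draw pop_size gene copies at random from the current pool.
        copies = sum(1 for _ in range(pop_size) if random.random() < freq)
        freq = copies / pop_size
        history.append(freq)
        if freq in (0.0, 1.0):  # allele lost or fixed purely by chance
            break
    return history

if __name__ == "__main__":
    trajectory = drift()
    print(f"started at 0.50, ended at {trajectory[-1]:.2f} "
          f"after {len(trajectory) - 1} generations")
```

Run it with different seeds and the frequency ends up somewhere different each time; nothing in the code rewards either variant, which is the whole point.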

Another all too common misconception is that “evolution is progressing to a goal” (p. 252), what can be called the teleological error. Even well-known and well-informed people believe that evolution is goal-directed. For example, Michael Shermer, the editor of Skeptic magazine and the author of a number of pro-evolution books, writes in The Science of Good and Evil that “Evolutionary biologists are also interested in ultimate causes—the final cause (in an Aristotelian sense) or end purpose (in a teleological sense) of a structure or behavior” (p. 8); he then states that “natural selection is the primary driving force of evolution” (p. 9). In contrast, Zuk reiterates throughout her book that “everything about evolution is unintentional” (p. 223), that “all of evolution’s consequences are unintended, and there never are any maps” designating a foreordained destination—and she is in fact an evolutionary biologist!

A good example of an unintentional evolutionary consequence is resistance to HIV, the retrovirus that causes AIDS. As it happens, some individuals are resistant or immune to the retrovirus, but not because evolution or natural selection intended them to be so. Centuries ago, bubonic plague swept through Europe; millions died of this highly infectious disease, but some few people did not get the disease despite having been exposed to it. No doubt they thought God had spared them for some divine reason. Centuries later, some of their descendants were exposed to HIV and did not become ill. Did God plan that far ahead to spare these few lucky individuals? Did evolution? No. A random mutation happened to render human cells unreadable to the plague bacterium (or, as Zuk suggests is more likely, unreadable to the smallpox virus); consequently, the bacteria could not enter the cells and wreak their havoc. The mutation would have had to have occurred before the introduction of the disease into the lucky few’s environment (there would not have been enough time for it to occur and proliferate after the disease’s introduction), and may have had no prior function, good or bad. As chance would have it, centuries later, the same mutation also made the owner’s cells unreadable to the AIDS virus, thus rendering him or her immune to HIV—quite by chance. Pace Lamarck, perhaps we can say that it is not characteristics that are acquired, but functions. The gene mutation that confers HIV immunity has, after many generations, finally acquired a function.

Why then do organisms seem so perfectly adapted to their environments? Perhaps they are not so perfectly adapted as they appear to human eyes; more importantly, since environments change, organisms must change as well, but perhaps if they were too perfectly adapted (each and every individual of the species therefore being identical), they would rather quickly become imperfectly adapted to even small changes in their environment. Perhaps, then, perfection is an extinction trap rather than a desirable goal.