
Evolutionary Just-So Story, Again!

So yet again we have a story of evolution that seems to say that evolution works like God, i.e., that it indulges in design. I am referring to an article recently published in the New York Times reporting on research into why the squid lost its shell. The phrasing of the article will, in the minds of the naive, create the impression that the squid lost its shell in order to move faster to escape its predators (shells being rather heavy and cumbersome). “The evolutionary pressures favored being nimble over being armored, and cephalopods started to lose their shells.” This seems to be an innocent enough statement, but its construction implies that the pressure to become nimble preceded and caused the loss of the shells.

That is design. It may not be God design, though one could easily make that leap, but it is design nonetheless.

Oh, if only they would read Lucretius!

Here’s what really happened: Originally, “squids” were shelled creatures; generation after generation were shelled. Occasionally, a genetic mutation or defect (call it what you will) resulted in progeny lacking shells. No doubt, most of these shell-less individuals quickly died or were eaten and left no progeny; but at some point, some of them survived (perhaps thanks to another mutation that enabled them to move more quickly than their shelled relatives) and reproduced, eventually giving rise to a new class of creatures, squids and octopuses, etc. In other words, the change occurred first, without intention or purpose, and the benefit followed. The change did not occur in order to confer the benefit. It just happened.

Of course, such changes often occur gradually, say by shrinking the shell over many generations, in what some have called “path dependency” (i.e., evolution follows an already established path and does not go backwards, in other words it doesn’t restore the shell to creatures who have lost it). But the principle remains the same: first the change, and then, if it happens to have an advantage, it sticks.

As Lucretius said, humans did not develop opposable thumbs in order to grasp tools; we can grasp tools because we have opposable thumbs.

What Is a Species?

That science is a human enterprise and not some pure and perfect object independent of culture is highlighted by a recent investigation into the DNA of American wolves—the gray wolf, the Eastern wolf, and the red wolf. An article in the New York Times (7/27/16) reports that analysis of the DNA of these three wolf species reveals that in fact “there is only one species [of wolf] on the continent: the gray wolf.” The other two are hybrids of coyotes and wolves—Eastern wolves are 50/50, red wolves are 75 percent coyote and 25 percent wolf. The investigators also concluded that the wolf and coyote species shared a common ancestor only 50,000 years ago, which is very recent in evolutionary terms.

Now, anyone comfortable with the fact that nature goes its own way without regard to the human need for tidy intellectual categories is not likely to be much disturbed by these findings. But such people are relatively rare, especially in academic and political circles, so it happens that certain people do find it disturbing that Eastern and red wolves are hybrids. That is, they are not “pure” and therefore may not be entitled to protection from, say, extermination under such laws as the Endangered Species Act. In a sense, they are not “natural” because—well, because they violate the notion of the purity of species; they don’t fit neatly into our conceptual categories. As one scientist was quoted (in dissent from the worry warts), “We put things in categories, but it doesn’t work that way in nature.”

Indeed it doesn’t. In fact, it couldn’t. The notion of “species” as neatly distinct forms of life, immune to crossings of the so-called “species barrier,” among other common myths of the “logic” of evolution, would cause evolution to grind to a halt. Evolution requires messiness, contingency, happenstance, the unexpected, for it to work. For example, genetic mutations do not magically appear on cue, in response to environmental pressures, just in time to save a species from extinction. Instead, a mutation lies quietly in the background, sometimes for many generations, emerging as the crucial factor of salvation (for those individuals who carry it, and their descendants) only when and if something in the environment calls it forth.

I am reminded of a startling discovery during the height of the AIDS epidemic in America, that some individuals, despite a particularly risky lifestyle, were immune to the disease. Turns out, they carried a mutation that is thought to have first proved its worth centuries earlier, during epidemics of an entirely different disease, bubonic plague. One could describe how this mutation protects against both diseases, but one could not explain why—why this gene mutation occurred in the first place, why it just happened to confer immunity or resistance to these two quite different diseases (one caused by a bacterium, the other by a retrovirus), and why it resided silently in the genomes of its fortunate carriers for so many generations before it could prove its usefulness.

A fundamental goal of all human endeavors is to reduce the entangled complexities of life, including our own, to a simple set of principles that fit the limitations of the computational power of our little brains, a mere three pounds of meat, of which only a relatively small portion engages in the tasks of reasoning. Not surprisingly, it is difficult to wrap our heads around the genuine complexity of the earth we inhabit, let alone of the cosmos. Being the limited creatures that we are, we need our categories—but let’s not worship them. Let’s not condemn the Eastern wolf and the red wolf to extermination just because they mess up our laws.

How We Think About Nature

It is so commonplace to think of nature as that which is free of human presence or interference that few people ever pause to consider how unnatural such a concept is. If human beings and their activities are not natural and do not occur in nature, where do they take place? If the answer is “in cities,” then we can further ask, “Where do cities exist?” And what are cities made of? And so on.

But in truth, human beings are as natural as grizzly bears and dandelions. We are animals; we have bodies which are composed of the same stuff as the bodies of all other mammals, of all other vertebrates as well, and at the microscopic level, our bodies’ cells are very like any other living cells. We reproduce as other animals do, we have DNA just as they do, and our brains, while noticeably more complex and capable than those of other animals, are otherwise pretty much the same as theirs. Like other animals, we must eat, breathe, and drink; we seek and/or build shelters; like beavers we divert water courses to our own benefit; and like many if not most creatures we create niches that are conducive to our well-being.

It may also be said that we do not do anything that other animals don’t do, although we may do those things on a far greater scale. For example, we occasionally have carried companion organisms, whether deliberately as domesticated or useful, or inadvertently as parasites, to environments where they have not been present before and where they prosper in the absence of their traditional enemies: pigs, rats, and cats on islands, buffel grass in the American Southwest and Mexico, roses in South Africa, rabbits in Australia, stoats in New Zealand—there is a very long list. Those of us who are concerned about such things (apparently, not everyone is) call these creatures “invasive species” and would like to eliminate them from their colonial possessions. I agree: it is awful that Guam no longer has any native birds because of the introduction of the brown tree snake, and terrible that rabbits and mice wreak such destruction in Australia—an object lesson in how tragic it can be for a species to be without its predators, even for that species itself.

On the other hand, few of us would consider wheat an “invasive” species, yet with our help it has invaded every habitable continent and has taken over much of the American landscape from the native grasses; tomatoes, potatoes, and maize have spread from the Americas to the rest of the world, taking over vast tracts of land. But because we consider these species to be our allies, we do not call them invasive.

While island ecosystems can be disrupted by the introduction of new species, it is worth remembering that island ecosystems would not exist in the first place if islands were not invaded by organisms that had not previously existed there. When a new island forms, for example from a volcanic eruption from the ocean’s depths, there is no life on it, yet a few hundred or thousand years later it will be as verdant as the islands of Hawaii: green with flowering trees and shrubs, busy with the doings of birds and insects—all of which are, in a sense, invasive, their ancestors blown there by storms and winds or carried there on rafts of driftwood and debris. Organisms, even nonmobile ones like plants, do not stay in place—they wander, they spread, they invade, they take over, they flee, they die out, creating new species and new wonders in the process—a very long process, generally speaking. Surely some birds blown off course in a storm, landing on a less than ideal island, have carried the seeds of some mainland plant in their digestive systems—seeds which, whether or not the bird survived, managed to sprout and struggle and survive and propagate, just as the seeds of some exotic plant have ridden on a human vessel and found themselves a hospitable new home. Kudzu, for example (intended), or Russian thistle (unintended).

What, then, is the difference between a seed carried in the gut of a bird and a seed carried in the pocket of a farmer, or an animal floating in on a raft of driftwood and seaweed and an animal floating in on the deck of a Polynesian canoe? They both accomplish the same thing, dispersing organisms to new ecosystems and keeping the evolutionary process churning. And the process of evolution over the billions of years to date has been marked by as much extinction as innovation. Human beings, themselves products of the same processes, are not engaged in an unprecedented activity—though we do seem, especially in the last 500 years or so (at least since 1492), to have accelerated the process to, comparatively, lightning speed. But aside from that, we are not actually doing something unusual, or even unnatural, in the annals of evolutionary time.

The difference is in ourselves, and is, broadly speaking, a moral difference. We are as capable of regret as we are of hope, of looking backward as forward; and while we often indulge in planning for the future and work towards improving our lot in life, we as often look to the past and itemize our mistakes as much as our triumphs. We can regret the passing of the dodo or the passenger pigeon, although none of us living has ever seen either; we can regret in foresight the impending extinction of the monarch butterfly or the African elephant—some of us can. Perhaps the moral sense arises from this ability to anticipate and retrospect, rather than (as some evolutionary psychologists are unduly prone to believe) from a moral molecule or altruistic gene. It is unlikely that our concern for the fate of other creatures is, or is entirely, out of concern for our own survival; we may have to adjust to a changing climate and a less “natural” world, but adjustment is not the same as extinction. From a practical point of view, i.e., from the view of human material needs, we probably have less to fear than the prophets lead us to believe—that is, if we learn to live without greed, that parasite that makes us want more than we need.

My concern, at least, is not with human survival, but rather with the survival of the many other creatures who also live on this planet, and insofar as aesthetics is a component of a moral vision, with the survival of the beautiful—I cannot see that human life is worthwhile without the beautiful. In one of his essays, Montaigne opined that voluptuousness is the equivalent of penitence; in religious terms, sin is its own punishment. One can also say that greed is its own punishment, for it destroys its object without gaining satisfaction. Of all the creatures on this earth, only human beings are greedy. Perhaps that is what makes us unnatural.

Nicholas Wade’s Troublesome Inheritance: A Critical Review

In his latest book, Nicholas Wade, a well-known science journalist, argues three points: 1) that human races are real; 2) that differences in human behavior, and likely cognition, are genetically based; and 3) that there are likely subtle but nonetheless crucial behavioral differences among races, which are also genetically based. Wade is well aware that these are extremely controversial ideas, that they overturn politically correct notions that human behavior and social structures are purely cultural, yet he is confident that developments in genetics support his view.

Click here to read the full article.

Silent Spring: The Reckoning

Rachel Carson’s Silent Spring, a prophetic warning of the deleterious effects of pesticides such as DDT on the environment, was published in 1962. The book warned that the widespread use of pesticides was devastating bird populations, and that if such use was not eliminated or reduced, many species would become extinct. Carson detailed how DDT caused birds to lay eggs with shells so thin and fragile they broke before the embryos could develop into live chicks; birds of prey were especially affected because, as top predators, they accumulated the highest concentrations of DDT in their bodies. At the time of publication, bald eagles had declined to near extinction because of the thin-shelled egg problem. Fortunately, despite heavy criticism by vested interests, Carson’s message was heard, DDT was banned, and the bald eagle has recovered, as have other raptors.

One would hope that the lesson had been learned and that similar mistakes would no longer be made. But nothing of the kind has in fact happened, despite all the earth days, demonstrations, supposed regulations, and lip service. A particularly striking and pertinent example of our failure to practice what we preach is the impending fate of the Monarch butterfly, that wonder of the insect world. This remarkable creature spends the summer months spread out in the northern United States (primarily in the upper Midwest) and southern Canada and winters concentrated in its millions in a small area of central Mexico. Even more amazing, this yearly migration covers multiple generations of the species, so that the butterflies that leave Mexico in the spring are not the same individuals who arrive in the north weeks later (they reproduce on the way), and a new generation leaves the north to return to Mexico in the fall. Yet they return to the same groves that their great-great-grandparents left months earlier!

Alas, the Monarch has one trait that has long served it well but which is now its Achilles’ heel: it lays its eggs on, and its caterpillars eat, only milkweed. The caterpillars absorb the milkweed’s nasty taste, rendering the butterflies unpalatable to insect-eating birds, which protects them from predation on their long, multigenerational migrations. Should the milkweed decline or disappear, so too will the Monarch.

Which is exactly what appears to be happening. Scientists and amateurs alike have noted a steady decline in the numbers of Monarchs gathering each year in Mexico (the best place to get a handle on their numbers), and this year (2013) the population has declined precipitously. According to a recent article in the New York Times, in 2012 the number of butterflies at the Mexican wintering site was approximately 60 million, itself a decline from previous years; but this year only 2 million showed up, and they arrived a week later than usual (more on the implications of this fact later). Imagine if the human population had dropped from its current 7 billion to less than 300 million in just one year.

The most likely cause of this decline is the rapid disappearance of milkweed along the routes followed by the butterflies as they move north and south in their annual journeys. The American Midwest, that famous breadbasket to the world, is increasingly covered with corn and soybean fields, a large percentage of which are planted with so-called “Roundup ready” varieties, i.e., varieties that have been genetically engineered to resist glyphosate, the active ingredient in Roundup brand herbicide. Milkweed and other native species are not genetically engineered to resist that poison, so they die while the corn and soybeans prosper. With insufficient milkweed available on which to lay their eggs, Monarchs cannot renew their numbers, so they also die.

Likely compounding the problem is global warming. Canadian scientists have observed that many Monarchs are migrating further north than in the past, well past the natural range of the milkweed. While adults can feed on the flowers of other species, they can lay their eggs, and the caterpillars can dine, only on milkweed. Thus those Monarchs who went too far north (probably because of temperature) could not successfully reproduce. Warming may also explain why Monarchs arrived a week late in Mexico.

The phenomenon of crops genetically engineered to resist manmade herbicides is an example of System run amuck. System operates on the erroneous belief that a “problem” is singular and that its solution is also singular. So, if “weeds” are “invading” your crops, getting rid of them will take care of that problem. (Note: How can native species be said to be invading non-native, artificial varieties? Aren’t corn and soybeans invading the territory of the native species? Aren’t corn and soybeans therefore the true weeds?) How very ironic that our capitalist system seems to be imitating a communist dictator: Chairman Mao once ordered that all sparrows be killed because they stole grain; consequently, crop-eating insects increased in numbers so sharply that he ordered widespread spraying of insecticides. Result: the elimination of pollinating species, particularly honeybees. If the Monarch is in such dire straits, are not other, likely beneficial species along its route also threatened? At the same time, some not so beneficial “weed” species are developing resistance to glyphosate, and it is likely that in the not so distant future, glyphosate herbicides will be rendered useless while some other pestilence will discover the vulnerabilities of genetically engineered crops. Thus the solution will turn out to be yet another of mankind’s many self-made problems.

See also this more recent article.

The Mismeasure of All Things

Some 2500 years ago, Protagoras said that man is the measure of all things. By this he meant something like, mankind can know only that which it is capable of knowing, which in effect is a recognition that the human mind does have its limits; but Protagoras’ statement has often been taken to mean that man is the standard by which all other things are to be measured, i.e., that mankind is the standard of comparison for judging the worth of everything else. This meaning may have been colored by the Christian concept of man as the object of divine history, of man as just a little lower than the angels. The Christian concept, in its turn, derives from a common interpretation of the creation story in Genesis, in which God gives man dominion over the rest of earthly creation.

However, while both Protagoras’ saying and the Genesis story carry the concept forward through history, neither explains how the idea actually originated. It may have been Giambattista Vico (1668-1744) who first recognized that it is ignorance rather than knowledge that makes man the measure of all things: “When men are ignorant of natural causes producing things, and cannot even explain them by analogy with similar things, they attribute their own nature to them.” That is, when primitive men and women surveyed the world and sought explanations of phenomena, they had nothing to go by other than what they knew about themselves, so that, for example, a terrible destructive storm could be explained as the anger of the gods, since when human beings became angry they too engaged in destructive behavior; or when a gentle rain caused plants to grow, the gods were in a good mood, perhaps pleased by some human act of worship, because when humans were in a good mood, they engaged in benevolent acts. After all, the earliest humans could not have had any knowledge of the material causes of storms, droughts, etc., nor of course of animal behavior, which they attributed to motives much like their own. As Stephen Toulmin and June Goodfield summarize Vico’s views, in primitive mythologies people “could measure the world of Nature only by that which they already knew—namely themselves” (The Discovery of Time).

Both Protagoras and Genesis simply give more sophisticated glosses on this primitive impulse. They reflect the increasing body and complexity of knowledge developed by ancient civilizations, particularly those that had developed writing systems, which in turn enabled them to impose order on what had been a plethora of local myths and their variants. Simply by creating relatively coherent pantheons containing gods with discrete attributes, roles, and positions in a divine hierarchy, ancient civilizations were able to organize their intellectual world and provide authoritative explanations. Monotheism carried this further, by providing an even more unified world view, but it also somewhat depersonalized the concept of God, making him more abstract and less personal (e.g., no images or idols, no household god or genie of the local spring, etc.). This was an important achievement in the ongoing development of knowledge, a necessary step in the process that led to the state of knowledge we enjoy today, in large part because it put more emphasis on cerebral, intellectual rather than personal and experiential modes of understanding—in a sense, creating theory to replace myth. Thus we see the Greek philosophers creating the first science and the Jews creating the first inklings of theology and, importantly, teleology (a sense of history with a goal towards which it was moving). Nevertheless, the Judeo-Christian god retained strong anthropomorphic features, especially in the popular imagination and in visual arts, in which, for example, God the Father was usually depicted as a white-haired old man. Perhaps as long as most people were illiterate and dependent on visual media for their abstract knowledge, anthropomorphism was to be expected.

The Western European, Christian intellectual (literate) tradition combined these two strands of ancient thought, the scientific/philosophical with the historic/teleological, setting the stage for a modern world view that sees the world as making coherent sense and as operating according to consistent, universal laws, which then can be exploited by human beings for their own betterment. As scientific knowledge expanded and material explanations could be provided for phenomena that once were viewed as signs of divine intervention, God receded to the back of men’s minds as less necessary to explain the world—at best, perhaps, He became little more than the Prime Mover, the one who got it all started or the one who established the universal laws which continue to operate without His immediate intervention. But if the Age of Reason or the Enlightenment put God into retirement, it did not give up the belief in coherent laws and the quest for universal theories, nor did it give up the teleological view of history.

It is important to note that the teleological view is always a human-centered view; history, whether of cosmos, nature, or society, was still about man; very few thinkers hazarded to speculate that man might be merely one among many creatures and phenomena rather than the point of the whole enterprise. In this sense, at least, the early modern era retained the primitive impulse to both anthropomorphism and anthropocentrism. The widespread acceptance of Darwin’s theory of evolution by means of natural selection did little, indeed perhaps nothing, to change that for most people. It was not difficult to switch from believing that God had created man for dominion over nature and as the center of the historical story of fall and redemption, to believing that evolution is teleological, both in the sense of inevitably leading to the emergence of homo sapiens as the crowning outcome of the evolutionary process and in the sense of evolution as a progressive process. And it was easy enough, in the context of nineteenth-century capitalism, to believe that modern industrial culture was the natural continuation of progressive evolution—indeed was its goal.

It took a generation or more for it to dawn on people that Darwinism, along with the geological discoveries regarding the great age of the earth and the astronomers’ and physicists’ discoveries of the even greater age of the universe, implied there is no god at all, not even the reticent god of the Deists. One would think that once this implication struck home, both the teleological and the anthropocentric views would fade away. But, perhaps due to human vanity, neither has done so.

In a supremely ironic twist, both teleology and anthropocentrism have been inverted. Whereas the theological age measured other creatures in human terms, the evolutionary age measures humans in animal terms. We are no longer a little lower than the angels but only a little bit higher than the other animals—or maybe not even that. We are naked apes, talking apes, singing apes. We are like social insects, we are vertebrates, we are aggressive because we are animals seeking to maximize our survival, we are merely transportation for the real biological players, selfish genes. We are not rational or conscious, we do not have free will, we operate by instinct, each of our seemingly advanced traits is hard-wired. Our morality is nothing more than an adaptation. We take a word like altruism, which originally meant a certain kind of human behavior, apply it to ants, where it becomes a description of instinctive eusocial behavior, and then re-apply that meaning back onto humans. Thus making us just like all the other animals. Therefore, we study them in order to understand ourselves. We focus on the similarities (often slim) and ignore the differences (often radical).

This continues the old habit of anthropomorphism in new guise and fails to recognize the independent existence of other creatures—their independent lines of evolution as well as their ontological separateness from us. We unthinkingly repeat that humans and chimps share 96 percent of their genes (or is it 98 percent?), as if that meant something—but then, it’s said we share 97 percent of our genes with rats. We neglect to mention that apes and humans diverged from each other some 7 to 8 million years ago and have followed independent lines of evolution ever since. We are not apes after all.

Consider the fruit fly, that ubiquitous laboratory subject which has yielded so much knowledge of how genes work. It is often cited as a model of human genetics and evolution. But consider what Michael Dickinson, a scientist (he calls himself a neuroethologist) at the University of Washington (Seattle), has to say about fruit flies: “I don’t think they’re a simple model of anything. If flies are a great model, they’re a great model for flies.” To me, this is a great insight, for it recognizes that fruit flies (and, frankly, insects in general) are so other than like us that to study them as if they were a model of anything other than themselves, as a model of us, is in a sense not to study them at all. It is rather to look into their compound eyes as if they were mirrors showing our own reflections. It is a form of narcissism, which perhaps contains our own demise.

Our demise, because in continuing to look at nature as being about ourselves, we continue the gross error of believing we can manipulate nature—other organisms, the entire world—to our own narrow purposes without consequences. It turns other organisms into harbingers of homo sapiens, narrows research to that which will “benefit” mankind, and misses the very strangeness of life in all its diversity and complexity. It continues the age-old world view of human dominion and fails to recognize that our “dominion” is neither a biological necessity nor a feature of the natural world. Dominion is a dangerous form of narcissism which a maturely scientific age should discard.

Against System

As many people know, the world’s domestic honeybees are seriously threatened by a condition called colony collapse disorder, in which once healthy colonies suddenly disappear altogether, virtually overnight. Many different possible causes are generally cited, including mites, malnutrition, viral diseases, pesticides, and so forth, and it is routine to call for “more research.” Don’t ya just love it? Whenever we don’t want to face the truth, we call for “more research.” Yet more research is not needed. The reason for colony collapse is crystal clear: we have turned living organisms, the bees, into industrial units, mere nuts and bolts moving along a factory line, without regard to the fact that they are living organisms—much as we have other animals, such as cattle, chickens, hogs, and fish.

Click here to read entire article

Marriage vs Mating

Yet Another Just-So Story

What is marriage? Ask an American of strong religious beliefs, and he is likely to say that it is a union between one man and one woman sanctioned by God. Ask more secular individuals, and they are likely to say that it is a civil contract between two individuals, committed to each other by love, but of practical importance in terms of legal and tax benefits, etc. Ask some biologists, and they will say that monogamous marriage is an evolutionary adaptation that increased the survival rate of helpless human infants, guaranteed to the father that the children produced by his wife were indeed his, and/or facilitated the development of human intelligence—or whatever, as long as the explanation can be stated in terms of natural selection. So at least is the impression one receives from a recent article in the New York Times (titled, somewhat misleadingly, since polygamy is discussed, “Monogamy’s Boost to Human Evolution”—but at least the title does neatly summarize the bias).

Ask an historian, a sociologist, or an anthropologist, and any one of them is likely to say that marriage practices vary over time and among cultures, from polygamy to monogamy, and they are also likely to mention that practices vary by class: in warrior societies, for example, polygamy was common among the warrior elite (including kings and nobility, whose vocation was warfare, and who could have both many wives and concubines), while monogamy prevailed among the commoners. Polygamy is common in societies in which there is a high mortality rate among young men (war, hunting mishaps, etc.), whereas monogamy is more common in societies in which the balance of adult males to females is more even, as well as in more egalitarian societies. Generally speaking, marriages were contracted for social purposes: to cement alliances, to protect inherited property, or to synchronize labor.

Marrying for love is a rather recent innovation and is characteristic of modern individualistic (and capitalist) countries, although monogamy has long been legitimized by Christianity, in part because of its dread of sexual license. Some people get around the stricture by having separate and unofficial multiple spouses, for example Charles Lindbergh, who had children in long-term relationships with three women other than his wife. Contemporary Americans seem to be practicing serial monogamy (divorce and remarriage) as well as unofficial and often temporary arrangements. In all cases, there has always been a whole lot of cheatin’ goin’ on. Then there is the added element of prostitution, including street walkers and courtesans, for which even the cleverest evolutionary biologist would have a hard time providing an evolutionary explanation. All of which suggests that marriage is different from mating. The latter is strictly biological—up until very recent times, there has been only one way to produce children, the sexual congress of a fertile man with a fertile woman, and this one way is unaffected by social customs. That is, socially sanctioned monogamy does not prevent either partner from producing a child with a person other than his/her spouse; eggs and sperm recognize no such boundaries.
It therefore seems both pointless and fruitless to try to concoct explanations for marriage customs and practices from natural selection. At some unknown point in the remote human past, people began creating nonbiological ways of organizing their lives. It’s what our big brains allow us to do. Mating may be in our DNA; marriage, however, is not.

Apart from the waste of time and grant money entailed in the pursuit of these evolutionary Just-So stories, the misguided notion, bordering on an ideology, that everything humans do can be explained solely in biological evolutionary terms, by a module in the brain, by DNA (i.e., instinct), denigrates other modes of knowledge that actually produce better explanations. We can learn more about marriage from historians and anthropologists than we can from biologists.

Paleolithic Fantasies

We live in an age like all previous ages, one in which thinking people assess the state of the world, find it wanting, and consequently seek a better, even perfect, way of life. Such people tend to divide roughly into those who seek their utopias in a vision of the future (today: think digital prophets, genetically modified crops) and those who seek a return to a golden past when human beings were in perfect harmony with nature (past: think Eden and the Noble Savage; today: think organic farming, artisanal cheese). Interestingly, one finds both types among both liberals and conservatives, though usually with different emphases (liberals tend to go for the organic, conservatives for traditional morality, while both seem to think that digital technology holds great promise for the future, whether through greater community or better security). And advocates of both sides seem to appeal, either implicitly or explicitly, to “human nature” as the ultimate measure of the perfect way of life (using either Darwin or the Bible as the validating text). On this shared assumption, amid all the changes of outward circumstance, human nature has remained unchanged through time.

Marlene Zuk, author of Paleofantasy: What Evolution Really Tells Us about Sex, Diet, and How We Live (W. W. Norton, 2013), addresses the myth, the just-so story, of a fixed human nature from an evolutionary perspective. An evolutionary biologist currently associated with the University of Minnesota, Zuk has conducted extensive field research, particularly on crickets, and is the author of numerous specialized articles and several popular books on evolutionary biology, behavioral biology, and sexual selection. She is therefore particularly well-qualified to demolish popular myths about human evolution, which she does with clarity and wit in this new book. (Her wit is best illustrated by her statement that “After all, nothing says evolution like a brisk round of the plague.”) Her immediate targets here are evo-myths about diet and health, particularly those that rest on the false idea that contemporary human beings are Paleolithic creatures uncomfortably and unhealthily stuck in an unnatural modern industrial environment. In other words, the natural man, the Noble Savage, the Eden which we have lost, is to be found in the lifestyles of early Stone Age humans prior to the development of agriculture (the true Original Sin) and settled life, that is, prior to about 10,000 years ago. Supposedly, humans of the Paleolithic lived in that much admired perfect harmony with nature, and to restore our health and souls, we need to retrieve that lifestyle and apply it to our urbanized lives today.

Alas, like all utopian dreams, whether of past or future, what Zuk calls paleofantasies are exactly that, fantasies, and in the course of demonstrating just how fantastic they are, she treats her readers to a particularly clear and nonideological series of lessons on what evolution really is. And what it is not: it is not purposeful, and it is not perfect or ever perfected. Thus, she demolishes the notion of the Noble Savage (by whatever name) when she writes that there is no utopian moment of perfect synchronicity between human beings and their environment. Both organisms and environments constantly change (and both humans and environments certainly did over the 2.6 million years of the Paleolithic period), and to think that today’s human beings are unchanged from those of even a mere 10,000 years ago “misses the real lessons of evolution” and “is specious” (p. 59). And lest we think that evolutionary change moves in some kind of logical direction, she writes that “evolution is more of a drunkard’s walk than a purposeful path” (p. 78).

Evolution never intends anything. It is a Rube Goldberg contraption, or rather the creatures it throws up are, because, rather than aiming at or achieving perfection, it measures success only by reproductive success. “If something works well enough for the moment, at least long enough for its bearer to reproduce, that’s enough for evolution” (p. 8). When you think about it, this is actually an excellent measure, simply because “perfection” is purely a human concept, and no one can agree on just exactly what perfection is. Should we eat only meat, because, as some paleo diet buffs claim, that’s what our Pleistocene ancestors ate? Or should we eat only raw vegetables and fruit, because, as other buffs claim, those were the exclusive menu items of our ideal past? Should we eschew grains, because they are cultivated and therefore not natural? Just exactly what would the “perfect” diet for human beings consist of?

According to Zuk, it depends. As she shows, various populations of human beings have evolved to utilize foods that our hunter-gatherer ancestors would not have been able to eat. For example, adults of some populations can digest milk, while the majority of human adults cannot (lactose intolerance). Certainly, the latter should avoid dairy, but the former can consume dairy products pretty much as they please. Insofar as the deleterious effects of agriculture are concerned, yes, it appears to be true that human health and well-being initially declined after people began cultivating grain crops and living in permanent settlements, but Zuk points out that it did not take all that long for this disadvantage to disappear; and as we know, agricultural societies grew larger and faster than foraging societies (reproductive success again being the measure of evolutionary success). Certainly, genetic mutations could have occurred that conferred a greater ability to prosper on a diet high in grains; but it is also possible that as people improved their knowledge of cultivation, selectively improved the quality of their crops, and exploited the advantages of settlements in facilitating trade, they overcame the initial disadvantages of agriculture. But whatever the case, it’s important to keep in mind that the early agricultural peoples themselves apparently thought that the advantages of agriculture outweighed its disadvantages—why else persist in farming?

An analogous point could be made about our modernity: If modern urban life is so bad for us, so unnatural and maladaptive, why did we develop it in the first place? If we are really, as some do argue, merely products of biological evolution like any other animal and, as some do argue, our consciousness is merely an illusion, how did we “evolve” a state of affairs so contrary to our biological being? And why do we cling to it so tenaciously? If it were really so horrible, wouldn’t we be fleeing the city for the more natural environments of the northern woods or western prairies (the United States’ closest approximation of the Edenic savannahs)? The fact that we do not suggests that urban industrialized life may not be so bad for humans after all. (How bad it may be for other organisms is a different question.)

Whatever the sources of some people’s dissatisfaction with modern human life, a mismatch between our Paleolithic natures and modernity is not one of them, and the appeal to evolution is, as already noted, based on a misconception of what evolution is. A major aspect of that misconception is an over-emphasis on natural selection. But as Zuk points out, “it is important to distinguish between two concepts that are sometimes—incorrectly—used interchangeably, evolution and natural selection. At its core, evolution simply means a change in the frequency of a particular gene or genes in a population” (p. 251). The mechanisms by which these gene frequency changes occur include not only natural selection, but genetic drift, gene flow, and mutation. “Genetic drift is the alteration of gene frequencies through chance events” (p. 251). “Gene flow is simply the movement of individuals and their genes from place to place, an activity that can itself alter gene frequencies and drive evolution” (p. 252). “The final way that evolution sans natural selection can occur is via those mutations, changes in genes that are the result of environmental or internal hiccups that are then passed on to offspring” (p. 252). In order to see whether or not evolution is occurring in humans today, one does not look at superficially visible traits but at changes in gene frequency among human populations.
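Zuk's point that gene frequencies can change through chance events alone, with no selection at all, is easy to see in a toy simulation. The sketch below is a minimal Wright-Fisher-style model of genetic drift; it is my illustration, not anything from her book, and the population size and generation count are arbitrary:

```python
import random

def wright_fisher_drift(freq, pop_size, generations, seed=1):
    """Track a selectively neutral allele under pure genetic drift.

    Each generation, the next population's allele count is a random
    draw based on the current frequency -- no allele is favored, yet
    the frequency wanders purely by chance.
    """
    rng = random.Random(seed)
    history = [freq]
    for _ in range(generations):
        # 2 * pop_size gene copies in a diploid population
        count = sum(rng.random() < freq for _ in range(2 * pop_size))
        freq = count / (2 * pop_size)
        history.append(freq)
    return history

traj = wright_fisher_drift(freq=0.5, pop_size=50, generations=200)
print(f"start: {traj[0]:.2f}  end: {traj[-1]:.2f}")
```

In small populations the allele typically drifts far from its starting frequency, and often to fixation or loss, without any "purpose" or advantage involved; that is evolution without natural selection.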

Another all too common misconception is that “evolution is progressing to a goal” (p. 252), what can be called the teleological error. Even well-known and well-informed people believe that evolution is goal directed. For example, Michael Shermer, the editor of Skeptic magazine and the author of a number of pro-evolution books, writes in The Science of Good and Evil that “Evolutionary biologists are also interested in ultimate causes—the final cause (in an Aristotelian sense) or end purpose (in a teleological sense) of a structure or behavior” (p. 8); he then states that “natural selection is the primary driving force of evolution” (p. 9). In contrast, Zuk reiterates throughout her book that “everything about evolution is unintentional” (p. 223), that “all of evolution’s consequences are unintended, and there never are any maps” designating a foreordained destination—and she is in fact an evolutionary biologist!

A good example of an unintentional evolutionary consequence is resistance to HIV, the retrovirus that causes AIDS. As it happens, some individuals are resistant or immune to the retrovirus, but not because evolution or natural selection intended them to be so. Centuries ago, bubonic plague swept through Europe; millions died of this highly infectious disease, but some few people did not get the disease despite having been exposed to it. No doubt they thought God had spared them for some divine reason. Centuries later, some of their descendants were exposed to HIV and did not become ill. Did God plan that far ahead to spare these few lucky individuals? Did evolution? No. A random mutation happened to render human cells unreadable to the plague bacterium (or, as Zuk suggests is more likely, unreadable to the smallpox virus); consequently, the pathogen could not enter the cells and wreak its havoc. The mutation would have had to occur before the introduction of the disease into the lucky few’s environment (there would not have been enough time for it to occur and proliferate after the disease’s introduction), and it may have had no prior function, good or bad. As chance would have it, centuries later, the same mutation also made the owner’s cells unreadable to the AIDS virus, thus rendering him or her immune to HIV—quite by chance. Pace Lamarck, perhaps we can say that it is not characteristics that are acquired, but functions. The gene mutation that confers HIV immunity has, after many generations, finally acquired a function.
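The logic of this story can be caricatured in code: an allele that is neutral when it arises simply drifts at random for generations, and only "acquires a function" when the environment changes and a disease arrives that its carriers happen to resist. This is a hypothetical sketch of my own; every parameter (population size, generation counts, survival benefit) is invented for illustration and is not an estimate for the real mutation:

```python
import random

def simulate_allele(pop_size=2000, neutral_gens=300, selected_gens=100,
                    init_freq=0.05, benefit=0.3, seed=7):
    """A mutation drifts neutrally, then a disease arrives and favors it.

    Phase 1: no disease exists, so the allele is neutral and its
    frequency wanders by chance alone (it may even be lost).
    Phase 2: a disease appears; carriers survive to reproduce at a
    rate (1 + benefit) relative to non-carriers.
    """
    rng = random.Random(seed)
    freq = init_freq
    # Phase 1: pure drift -- the allele has no function yet.
    for _ in range(neutral_gens):
        count = sum(rng.random() < freq for _ in range(pop_size))
        freq = count / pop_size
        if freq == 0.0:
            return 0.0, 0.0  # lost by chance before the disease arrived
    freq_before = freq
    # Phase 2: disease arrives; carriers leave more offspring on average.
    for _ in range(selected_gens):
        weighted = freq * (1 + benefit)
        expected = weighted / (weighted + (1 - freq))
        count = sum(rng.random() < expected for _ in range(pop_size))
        freq = count / pop_size
    return freq_before, freq

before, after = simulate_allele()
print(f"frequency before disease: {before:.3f}, after: {after:.3f}")
```

Nothing in phase 1 "prepares" the allele for phase 2; whether it survives long enough to become useful is a matter of chance, which is exactly the point about unintended consequences.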

Why then do organisms seem so perfectly adapted to their environments? Perhaps they are not so perfectly adapted as they appear to human eyes; more importantly, since environments change, organisms must change as well, but perhaps if they were too perfectly adapted (each and every individual of the species therefore being identical), they would rather quickly become imperfectly adapted to even small changes in their environment. Perhaps, then, perfection is an extinction trap rather than a desirable goal.

Empathy Imperiled: A Review

One can to some extent understand the current enthusiasm of conservatives for Darwinian deterministic explanations of human behavior, inasmuch as determinism is compatible with the views of human nature already held by conservatives. Even religious conservatives, those who go so far as to deny evolution per se, subscribe to a deterministic view. The Edenic fall, the apocalyptic view of history, etc., are elements in God’s overarching plan, and human free will is largely limited to submitting to God’s will or facing the dire consequences. Secular conservatives hold that Evolution is the grand plan (even though they usually deny teleology for appearances’ sake) and that we should submit to the inevitabilities of our genes and our Pleistocene natures. But it is puzzling that a considerable number of scholars in the humanities and social sciences submit to Darwinian explanations of art, literature, philosophy, etc.; perhaps they do so in a desperate attempt to retain “relevance” in an age when technology, science, and the MBA have the hegemonic edge.

It is especially surprising when a writer of definitely left-wing political beliefs attempts to recruit biological evolution to the socialist or communitarian cause. Such is the case, sadly, with Gary Olson’s book Empathy Imperiled: Capitalism, Culture, and the Brain (Springer, 2013). Olson is a professor of political science at Moravian College in Bethlehem, Pennsylvania, and active in liberal causes. In this book, he explores a two-part thesis: the first is that mirror neurons in the brain hardwire us for empathy; the second is that the culture of capitalism thwarts this natural empathy in favor of selfishness.

Why is his first point important to his second point? According to Olson, that we (and at least some other animals) have mirror neurons has been proven by science, which in turn provides support for the idea that human beings are naturally (i.e., biologically) empathetic. It is not biology or our evolutionary history that makes us divisive and driven by selfishness and enmity; rather, culture, particularly capitalist culture, has thwarted this natural trait. However, while the existence of mirror neurons in macaques appears to be well established, their existence in human beings is not. It is further not at all certain that mirror neurons are the source of empathy. They seem instead to mirror others’ motor movements, such that when a macaque sees another macaque pick up a peanut and put it in its mouth, the first macaque can imitate that action, but it is a long way from motor imitation to empathy. But by means of a non sequitur, Olson evades the problem: “The monkey’s neurons were ‘mirroring’ the activity she was observing, suggesting she was responding to the experience of another, such as when we experience empathy for someone else’s circumstances” (p. 21). As in all non sequiturs, there is some verbal sleight of hand in this sentence: from mirroring an activity (outwardly visible) to mirroring an experience (inward and subjective), and then the leap from a monkey mirroring/responding to another monkey’s actions to a human being actually feeling with another human being (and what “circumstances” are implied here?). No explanation for this leap from an activity to a subjective state is provided.

It is worth pointing out here that complex animals like macaques, chimpanzees, or humans do not consist of one behavioral trait. Even if mirror neurons do exist in monkeys or humans, and even if we are willing to make the leap of faith that mirror neurons hardwire us for empathy, empathy is not our only behavioral trait and can then, quite naturally rather than culturally, be overridden by other traits that might be more appropriate to a particular situation or circumstance. Thus a person might be empathetic one day and jealous the next, or understanding and helpful to one person and belligerent to another. None of us would hurt a fly—until the situation called for a fly swatter.

Perhaps “empathy” is a poor word, anyway. The observant macaque might use its ability to “mirror” another’s actions by stealing the peanuts; a human being who can “feel with” another person might use that to manipulate and outwit. Merely “mirroring” does not guarantee virtuous cooperation.

There are equally damaging inadequacies with Olson’s development of the second part of his thesis, that capitalism thwarts our natural empathy. He writes that “capitalism is by its very nature competitive and exploitive, not communal and empathetic except to the degree that empathy can enhance profitability” (p. 25). Well, true, at least to some extent. But is this true only of capitalism? As a leftist, Olson seems to think that it is. But Olson fails to show that capitalism is more destructive of empathy than other actual (rather than ideal) economic systems. To do so, some comparisons (other than to Cuba) would be necessary. For example, given the endemic slavery of the Roman Empire, which was not capitalist, surely we can say that Rome was destructive of empathy. Indeed, a major motive for official Roman antagonism to early Christianity was precisely its encouragement of empathy, particularly for the poor, the oppressed, and the enslaved. Ancient Greece, despite Athens’ reputation as the birthplace of democracy, also depended on slavery and denied citizen status to everyone except free-born, native-born males (i.e., not “foreigners,” women, slaves, etc.). It is the ancient Greeks who gave us the word barbarian, a pejorative for the “them” of the us vs. them dichotomy. In the Americas, aside from the Aztecs and Maya (human sacrifice, fierce warfare), there were the Iroquois, who made territorial war against their neighbors, and slavery was also practiced by numerous Indian tribes. While many sins have been committed under capitalism, so have they under all other actual economic systems.

On the other hand, some ancient sins withered away under capitalism. Chattel slavery was abolished after capitalism was established, and the vote has been extended to all adults, men and women alike, of whatever class. The various products of industrial/scientific medicine have eliminated or vastly reduced the ravages of infectious diseases, to the point where infant and child mortality has gone from being a commonplace to an exception. Smallpox, which many historians estimate killed as much as 90% of Native Americans after the arrival of Europeans, has been eliminated. These examples are not meant to absolve capitalism of its sins, but to demonstrate that all political and economic systems, just like the human beings who create and sustain them, are complex mixtures, in varying degrees, of good, bad, and indifferent. Capitalism may have run its course and may, through the usual difficult process that attends major historical shifts, be replaced by something better suited to our new globalized, over-heated world, but I doubt that that new system will be as morally exemplary as many dream of.

In my opinion, mirror neurons, neuroscience, genetics, etc., add little of interest or usefulness to issues of morality. In Olson’s book, the best passages are not those which unsuccessfully attempt to recruit mirror neurons to moral purposes but those which explore the profound words of Jesus (e.g., the parable of the Good Samaritan) and Martin Luther King, Jr. (e.g., King’s interpretation and application of that parable). Such wisdom does not require a pseudoscientific gloss.