
Evolution and Theodicy

“Why is there evil in the world?” This question has been asked by philosophers, theologians, and ordinary men and women for millennia. Today scientists, particularly evolutionary biologists, neuroscientists, and evolutionary and neuro-psychologists, have joined the effort to explain evil: why do people indulge in violence, cheating, lies, harassment, and so on? There is no need here to itemize all the behaviors that can be labeled evil. What matters is the question of “why?”

The question of “why is there evil in the world?” assumes the premise that evil is abnormal while good (however defined) is normal—the abnorm vs. the norm, if you will. Goodness is the natural state of man, the original condition, and evil is something imposed on or inserted into the world from some external, malevolent source. In Genesis, God created the world and pronounced it good; then Adam and Eve succumbed to the temptations of the Serpent and brought evil, and therefore death, into the world (thus death is a manifestation of evil, immortality the natural state of good). Unfortunately, the Bible does not adequately account for the existence of the Serpent or Satan, so it was left to Milton to fill in the story. Gnostics, Manicheans, and others posited the existence of two deities, one good and the other evil, and constructed a vision of a cosmic struggle between light and darkness that would culminate in the triumph of good—a concept that filtered into Christian eschatology. The fact that Christian tradition sees the end times as a restoration to a state of Adamic or Edenic innocence underscores the notion that goodness is the natural, default state of man and the cosmos.

Contemporary secular culture has not escaped this notion of the primeval innocence of man. It has simply relocated Eden to the African savannah. When mankind was still at the hunter-gatherer stage, so the story goes, people lived in naked or near-naked innocence; they lived in egalitarian peace with their fellows and in harmony with nature. Alas, with the invention of agriculture and the consequent development of cities and civilizations, egalitarianism gave way to greed, social hierarchies, war, imperialism, slavery, and patriarchy: all the factors that cause people to engage in violence, oppression, materialism, and so on. These faults of civilization, in turn, drove the oppressed to violence, theft, slovenliness, and other sins. Laws and punishments and other means of control and suppression were instituted to keep the louts in their place. Many people believe that to restore the lost innocence of our hunter-gatherer origins, we must return to the land, re-engage with nature, adopt a paleo diet, restructure society according to matriarchal and/or socialist principles, and so on. Many people (some the same as, some different from the back-to-nature theorists) envision a utopian future in which globalization, or digitization, or general good feeling will restore harmony and peace to the whole world.

Not too surprisingly, many scientists join in this vision of a secular peaceable kingdom. Not a few evolutionary biologists maintain that human beings are evolutionarily adapted to life on the savannah, not to life in massive cities, and that the decline in the health, intelligence, and height of our civilized ancestors can be blamed on the negative effects of a change in diet brought on by agriculture (too much grain, not enough wild meat and less variety of plants) and by the opportunities for diseases of various kinds to colonize human beings too closely crowded together in cities and too readily exposed to exotic pathogens spread along burgeoning trade routes. Crowding and competition lead to violent behaviors as well.

Thus, whether religious or secular, the explanations of evil generally boil down to this: that human beings are by nature good, and that evil is externally imposed on otherwise good people; and that if circumstances could be changed (through education, redistribution of wealth, exercise, diet, early childhood interventions, etc.), our natural goodness would reassert itself. Of course, there are some who believe that evil behavior has a genetic component, that certain mutations or genetic defects are to blame for psychopaths, rapists, and so on, but again these genetic defects are seen as abnormalities that could be managed by various eugenic interventions, from gene or hormone therapies to locking up excessively aggressive males to ensure they don’t breed and pass on their defects to future generations.

Thus it is that, in general, we are unable to shake off the belief that good is the norm and evil the abnorm, whether we are religious or secular, scientists or philosophers, creationists or Darwinists. But if we take Darwinism seriously, we have to admit that “evil” is the norm and “good” the abnorm—nature is red in tooth and claw, and all of the evil that men and women do is also found in other organisms. Indeed, the “evil” done by other organisms long precedes the evil that men do, and archaeological and anthropological evidence shows that men have been doing evil since the very beginning of the human line. In other words, there never was an Eden, never a Noble Savage, never a long-ago Golden Age from which we have fallen or declined—nor, therefore, is there any prospect of an imminent or future Utopia or Millennial Kingdom that will restore mankind to its true nature, because there is nothing to restore.

The evolutionary function of “evil” is summarized in the term “natural selection”: the process by which death winnows the less fit out of the breeding pool. (Natural selection works on the average: some who are fit die before they can reproduce, and some of the unfit survive long enough to produce offspring, but on average fitness is favored.) Death, usually by violence (eat, and then be eaten), is necessary to the workings of Darwinian evolution. An example: when a lion or a pair of lions defeats an older pride male and takes over his pride, the newcomers kill the cubs of the defeated male, which brings the lionesses back into heat so that the new males can mate with them and produce their own offspring; their task is then to keep control of the pride long enough for their own cubs to reach reproductive maturity. Among lions, such infanticide raises no moral questions; among humans it does.
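
To make “works on the average” concrete, here is a minimal Python sketch; it is my own illustration rather than anything from the sources discussed, and the fitness values and population sizes are arbitrary assumptions. Survival is probabilistic, so fitter individuals are more likely, but never guaranteed, to live long enough to reproduce.

    import random

    def one_generation(population):
        # Each individual survives with probability equal to its fitness;
        # each survivor leaves two offspring of the same type. Selection
        # is statistical: a fit individual can still die childless, and a
        # less fit one can slip through.
        survivors = [f for f in population if random.random() < f]
        return survivors * 2  # two offspring per survivor

    pop = [0.8] * 500 + [0.6] * 500  # hypothetical fitter and less fit types
    for _ in range(10):
        pop = one_generation(pop)
    if pop:
        print("share of fitter type:", round(pop.count(0.8) / len(pop), 2))

Across many runs the fitter type almost always comes to dominate, yet in any single run a fit lineage can vanish by sheer bad luck; that is all “on average” means.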

There is no problem of evil but rather a problem of good: not “why is there evil?” but “why is there good?” Why do human beings consider acts like infanticide morally evil while lions do not? Why do we have morality at all? I believe that morality is an invention, a creation of human thought, not an instinct. It is one of the most important creations of the human mind, at least as great as the usually cited examples of human creativity (art, literature, science, etc.), if not greater, considering how much harder won it is, and how much harder to maintain. Because “good” is not natural, it is always vulnerable to being overwhelmed by “evil,” which is natural: peace crumbles into war; restraint gives way to impulse; holism gives way to particularism; agape gives way to narcissism, love to lust, truth to lie, tolerance to hate. War, particularism, narcissism, and the rest protect the self of the person and the tribe, one’s own gene pool so to speak, just as the lion kills his competitor’s cubs to ensure the survival of his own. We do not need to think very hard about doing evil; we do need to think hard about what is good and how to do it. It is something that every generation must relearn and rethink, especially in times of great stress.

It appears that we are in such a time today. Various stressors (the economy, the climate, overpopulation and mass migrations, religious conflict amid the dregs of moribund empires) are pushing the relationship of the tribes to the whole out of balance, and the temptation is to put up walls, dig trenches, draw up battle lines, and find someone other than ourselves to blame for our dilemmas. A war of all against all is not totally out of the question, and it may be that such a war or wars will eventuate in a classic Darwinian victory of one group over another—but history (rather than evolution) tells us that such a victory is often less Darwinian than Pyrrhic.

Donald Trump: Psychoanalysis vs. Ethics

Is Donald Trump a narcissist? Is he a psychopath? Is he mentally unstable? These questions, and others of the same ilk, have been asked (and often answered in the affirmative) throughout the primary campaign season. To a lesser extent, similar questions have been asked about his followers. There has been, in other words, a lot of psychoanalyzing. It’s as if the DSM-5, the Diagnostic and Statistical Manual of Mental Disorders, Fifth Edition, has become the primary guide to politics and politicians.

Hillary Clinton has also, and for a longer time (at least since the Lewinsky scandal), been subjected to armchair and coffee-house analysis (she’s in denial, etc.), even though, given that she is, for a politician, a surprisingly private person (uptight? secretive? not warm?), one wonders how anyone can legitimately diagnose her. Bill Clinton, of course, has also been parsed and dissected (narcissist, sex addict, etc.). Surprisingly, there has been little psychoanalysis of Bernie Sanders, perhaps because, as Hillary’s gadfly, he has dominated the high ground of principle.

Perhaps when a serious candidate actually has principles and stays consistent with them, psychologizing is unnecessary and even irrelevant. Principles have the effect of overriding personal quirks and biases. They are not generated from within this or that individual, and therefore are not reflective only of that individual, but are generated in a long process of shared thought. We come to principles through reason (Hannah Arendt might have said, through reason paired with imagination), not through impulse; indeed, the point of principle is to put a bridle on impulse, to restrain the impetuousness of the moment in favor of the longer, wider view. In Pauline terms, it replaces the natural or carnal man with the spiritual man; in late Protestant terms, it replaces immediate with delayed gratification.

So while Trump may or may not be a psychopath, a narcissist, or mentally unstable or ill, which none of us can really know, he is an unprincipled man. His constant shape-shifting, self-contradictions, denials, and off-the-cuff bluster are the signs of an impulsive man whose thoughts and words are not subjected to the vetting of a set of principles that can tell him whether he is right or wrong. He has at long last no shame, no decency, because he has no principles to tell him what is decent or shameful. In other words, he is typical of human beings, men and women, when they have nothing higher or wider than themselves as guides to behavior. This is not the place to go in depth into the utility of moral principle, but just as an example, something as simple as “do unto others as you would have others do unto you” can restrain the natural selfish impulse to grab as much as you can for yourself.

Anyone who has taken an introductory course in psychology, or who has paged through any of the editions of the DSM, has found plenty of evidence that they are in some way or another mentally unstable or unhealthy. Just about anyone can look at the list of defining characteristics of, say, narcissistic personality disorder (do you think you are special or unique?) or antisocial personality disorder (are you opinionated and cocky?), or at the traits of the perfectionist, and wonder, in a bit of self-diagnosis, whether they should seek help. Welcome to the cuckoo’s nest. Or rather, welcome to humanity.

But for the concept of a disorder to exist, there has to be a concept of an order, i.e., a definition of what a normal person is. Ironically, psychology is of no help to us here. The DSM-5 is nearly one thousand pages long and, according to its critics, adds ever more previously normal or eccentric behaviors to its exhaustive, not to say fatiguing, list of mental maladies. Its critics also charge that it provides ever more excuses for psychiatrists and physicians to prescribe very profitable drugs to people who are really just normal. After all, they point out, life is not a cakewalk, and people are not churned out like standardized units.

Principle, that is, morality and ethics, can on the other hand be of great help here. It is obvious that the followers of Trump have not been dissuaded from supporting him by the amateur psychoanalyses of pundits and opponents. Clearly they like the very traits the alienists are diagnosing. But what if someone started criticizing him on moral grounds; what if someone performed something analogous to “Have you no sense of decency, sir?” That question, posed by Joseph N. Welch to Senator Joe McCarthy during the televised Army-McCarthy hearings of 1954, was a key moment in the demise of one of the worst men in American political history. Welch did not psychoanalyze McCarthy, nor did Edward R. Murrow in his famous television broadcast on McCarthy’s methods, and McCarthy was not taken away in a straitjacket. He was taken down by morally principled men and women who had had enough of his cruelty and recklessness.

Plato’s Cave, Inside Out

The original story of Plato’s cave can be summarized like this: A group of men are bound inside a cave with a wide entrance, through which the sun streams, projecting shadows on the back wall of the cave. The men’s shackles force them to face that back wall, so that all they can see are the shadows moving back and forth across it. They are watching a kind of shadow play, which, however, they take for reality, as it is the only thing they can see. One day the men are set free and dragged out of the cave into the sunlight, where they can see for the first time that the shadows they took for reality were cast by other men walking back and forth in front of the cave, carrying various objects as they went about their business. For the first time in their lives, these former prisoners realize that what they had believed to be real was merely a parade of insubstantial silhouettes of the actual things that cast them.

This parable has traditionally been understood as an illustration of Plato’s philosophical Idealism: the doctrine that the objects of the world as we perceive them are imperfect embodiments of ideal Forms, which are the real things of the Cosmos. Thus, for example, the table in the dining room is a representation, so to speak, of the ideal form of “Table,” which, unlike your dining table, is immaterial, perfect, eternal, and the “idea” that informs all tables—dining and kitchen, coffee and end, writing and conference, etc. All specific things of the material world are likewise merely expressions of their ideal forms. Thus the “idea” of a thing is its truth; the material embodiment of the idea is imperfect, temporary, and therefore in a sense “false.”

The task of philosophy is to contemplate the ideal forms, not the imperfect expressions of them; this puts the “idea” above everything. One can see why Plato’s view has a great deal of appeal to philosophers and other types of intellectual, including, all too often, ideologues, for whom an ideology (“a system of ideas and ideals, especially one that forms the basis of economic or political theory and policy”) trumps practicality (and oftentimes morality). Whether or not Plato and his legion of descendants believed in a literal heaven of ideal forms, in practice they have behaved as if their ideas were in fact perfect, eternal, “self-evident,” and true—truer than experience and superior to the stubborn resistance of material things to being shaped according to these truths. For these types, reality is a sin against reason.

So let us attempt to correct Plato’s parable: The prisoners in the cave are not trapped in the material world but in the confines of their own minds; they are contemplating the flickering shadows of their own thoughts, stripping away the particulars of individual objects and constructing vast theories on the basis of these one-dimensional, flat, featureless cutouts. (It is worth noticing that shadows are also dark, i.e., the blocking out or absence of light, as when one stands in the shadow of a tree or building.) Once the prisoners are freed, they can see that what they thought was real (their own thoughts) was not real at all.

It is the material world of particular objects, particular individual persons for example, as well as trees, vases, tables, songs, flowers, dogs, etc., that is filled with real things, the ideas of which are figments piled on figments unto confusion. Ideas uninspired and uncorrected by reality can lead us very far astray.

A relevant quotation:
“I ran out of interest in my own consciousness around 1990, but there’s no reason ever to run out of interest in the world.” –Crispin Sartwell, “Philosophy Returns to the Real World,” The New York Times, April 13, 2015

We Are All Still Animists

“[Children do not] have to be taught to attribute people’s behavior to the mental states they’re in. Children tend, quite naturally, to anthropomorphize whatever moves. What they have to learn is which things don’t have minds, not which things do.”
–Jerry Fodor, “It’s the Thought That Counts,” London Review of Books, November 28, 1996

Iconoclastic statements have always appealed to me, particularly because they cause me to look at the iconic statements they are set against in a new and critical light. Sometimes the iconic statements survive the scrutiny; oftentimes they don’t. In this case the iconic statement, that children learn that other people have minds of their own (theory of mind) over time, seems commonsensical until it is re-read in light of Fodor’s statement. Then it appears less evidently true.

Look at the first part of Fodor’s statement: that children “quite naturally . . . anthropomorphize whatever moves.” To anthropomorphize is to attribute human characteristics, in particular a mind with motives, desires, feelings, etc., to nonhuman things. But, in my experience, children anthropomorphize not just things that move (pets, for example) but also things that don’t. Dolls and figurines don’t move, though they look as if they could, and small children attribute feelings even to objects that, to an adult, are clearly inanimate, such as blankies and other favored possessions; hence their sense of tragedy when the blankie disappears into the laundry hamper or the favorite rubber ball deflates.


Nicholas Wade’s Troublesome Inheritance: A Critical Review

In his latest book, Nicholas Wade, a well-known science journalist, argues three points: 1) that human races are real; 2) that differences in human behavior, and likely cognition, are genetically based; and 3) that there are likely subtle but nonetheless crucial behavioral differences among the races which are also genetically based. Wade is well aware that these are extremely controversial ideas, that they overturn politically correct notions that human behavior and social structures are purely cultural, yet he is confident that developments in genetics support his view.


Ethics and Human Nature

It is an unhappy characteristic of our age that certain ignoramuses have been elevated to the ranks of “public intellectual,” a category which seems to consist of men and women who provide sweeping theories of everything, especially of everything they know nothing about. Into this category fall certain writers whose sweeping theory is that, prior to the Enlightenment, everyone lived in abject superstition and physical misery. With the Enlightenment, reason and science began the process of sweeping away misery and ignorance, clearing the field for the flowers of prosperity and knowledge. Such a sophomoric view of human history and thought has the virtue (in their minds only) of rendering it unnecessary for them to acquaint themselves with a deep and nuanced knowledge of the past, an error which permits them to attribute all that is good in human accomplishment to the age of science and all that is bad to a dark past best forgotten.

Nowhere is this more evident than in the recent fad for publishing books and articles claiming that science, particularly evolutionary science, provides the necessary and sufficient basis for ethics.


The Mismeasure of All Things

Some 2,500 years ago, Protagoras said that man is the measure of all things. By this he meant something like this: mankind can know only what it is capable of knowing, which is in effect a recognition that the human mind has its limits. But Protagoras’ statement has often been taken to mean that man is the standard by which all other things are to be measured, i.e., the standard of comparison for judging the worth of everything else. This meaning may have been colored by the Christian concept of man as the object of divine history, of man as just a little lower than the angels. The Christian concept, in its turn, derives from a common interpretation of the creation story in Genesis, in which God gives man dominion over the rest of earthly creation.

However, while both Protagoras’ saying and the Genesis story carry the concept forward through history, neither explains how the idea actually originated. It may have been Giambattista Vico (1668-1744) who first recognized that it is ignorance rather than knowledge that makes man the measure of all things: “When men are ignorant of natural causes producing things, and cannot even explain them by analogy with similar things, they attribute their own nature to them.” That is, when primitive men and women surveyed the world and sought explanations of phenomena, they had nothing to go by other than what they knew about themselves, so that, for example, a terrible destructive storm could be explained as the anger of the gods, since when human beings became angry they too engaged in destructive behavior; or when a gentle rain caused plants to grow, the gods were in a good mood, perhaps pleased by some human act of worship, because when humans were in a good mood, they engaged in benevolent acts. After all, the earliest humans could not have had any knowledge of the material causes of storms, droughts, etc., nor of course of animal behavior, which they attributed to motives much like their own. As Stephen Toulmin and June Goodfield summarize Vico’s views, in primitive mythologies people “could measure the world of Nature only by that which they already knew—namely themselves” (The Discovery of Time).

Both Protagoras and Genesis simply give more sophisticated glosses on this primitive impulse. They reflect the increasing body and complexity of knowledge developed by ancient civilizations, particularly those that had developed writing systems, which in turn enabled them to impose order on what had been a plethora of local myths and their variants. Simply by creating relatively coherent pantheons containing gods with discrete attributes, roles, and positions in a divine hierarchy, ancient civilizations were able to organize their intellectual world and provide authoritative explanations. Monotheism carried this further by providing an even more unified world view, but it also somewhat depersonalized the concept of God, making him more abstract and less personal (e.g., no images or idols, no household god or genie of the local spring). This was an important achievement in the ongoing development of knowledge, a necessary step in the process that led to the understanding we enjoy today, in large part because it put more emphasis on cerebral, intellectual rather than personal and experiential modes of understanding—in a sense, creating theory to replace myth. Thus we see the Greek philosophers creating the first science and the Jews creating the first inklings of theology and, importantly, teleology (a sense of history with a goal toward which it is moving). Nevertheless, the Judeo-Christian god retained strong anthropomorphic features, especially in the popular imagination and in the visual arts, in which, for example, God the Father was usually depicted as a white-haired old man. Perhaps as long as most people were illiterate and dependent on visual media for their abstract knowledge, anthropomorphism was to be expected.

The Western European, Christian intellectual (literate) tradition combined these two strands of ancient thought, the scientific/philosophical with the historic/teleological, setting the stage for a modern world view that sees the world as making coherent sense and as operating according to consistent, universal laws, which then can be exploited by human beings for their own betterment. As scientific knowledge expanded and material explanations could be provided for phenomena that once were viewed as signs of divine intervention, God receded to the back of men’s minds as less necessary to explain the world—at best, perhaps, He became little more than the Prime Mover, the one who got it all started or the one who established the universal laws which continue to operate without His immediate intervention. But if the Age of Reason or the Enlightenment put God into retirement, it did not give up the belief in coherent laws and the quest for universal theories, nor did it give up the teleological view of history.

It is important to note that the teleological view is always a human-centered view; history, whether of cosmos, nature, or society, was still about man; very few thinkers hazarded to speculate that man might be merely one among many creatures and phenomena rather than the point of the whole enterprise. In this sense, at least, the early modern era retained the primitive impulse to both anthropomorphism and anthropocentrism. The widespread acceptance of Darwin’s theory of evolution by means of natural selection did little, indeed perhaps nothing, to change that for most people. It was not difficult to switch from believing that God had created man for dominion over nature and as the center of the historical story of fall and redemption, to believing that evolution is teleological, both in the sense of inevitably leading to the emergence of homo sapiens as the crowning outcome of the evolutionary process and in the sense of evolution as a progressive process. And it was easy enough, in the context of nineteenth-century capitalism, to believe that modern industrial culture was the natural continuation of progressive evolution—indeed was its goal.

It took a generation or more for it to dawn on people that Darwinism, along with the geological discoveries regarding the great age of the earth and the astronomers’ and physicists’ discoveries of the even greater age of the universe, implied there is no god at all, not even the reticent god of the Deists. One would think that once this implication struck home, both the teleological and the anthropocentric views would fade away. But, perhaps due to human vanity, neither has done so.

In a supremely ironic twist, both teleology and anthropocentrism have been inverted. Whereas the theological age measured other creatures in human terms, the evolutionary age measures humans in animal terms. We are no longer a little lower than the angels but only a little higher than the other animals—or maybe not even that. We are naked apes, talking apes, singing apes. We are like social insects; we are vertebrates; we are aggressive because we are animals seeking to maximize our survival; we are merely transportation for the real biological players, selfish genes. We are not rational or conscious, we do not have free will, we operate by instinct, and each of our seemingly advanced traits is hard-wired. Our morality is nothing more than an adaptation. We take a word like altruism, which originally meant a certain kind of human behavior, apply it to ants, where it becomes a description of instinctive eusocial behavior, and then re-apply that meaning back onto humans, thus making us just like all the other animals. Therefore we study them in order to understand ourselves, focusing on the similarities (often slim) and ignoring the differences (often radical).

This continues the old habit of anthropomorphism in new guise and fails to recognize the independent existence of other creatures—their independent lines of evolution as well as their ontological separateness from us. We unthinkingly repeat that humans and chimps share 96 percent of their genes (or is it 98 percent?), as if that meant something—but then, it’s said we share 97 percent of our genes with rats. We neglect to mention that apes and humans diverged from each other some 7 to 8 million years ago and have followed independent lines of evolution ever since. We are not apes after all.

Consider the fruit fly, that ubiquitous laboratory subject which has yielded so much knowledge of how genes work. It is often cited as a model of human genetics and evolution. But consider what Michael Dickinson, a scientist (he calls himself a neuroethologist) at the University of Washington (Seattle), has to say about fruit flies: “I don’t think they’re a simple model of anything. If flies are a great model, they’re a great model for flies.” To me this is a great insight, for it recognizes that fruit flies (and, frankly, insects in general) are so unlike us that to study them as if they were a model of anything other than themselves, as a model of us, is in a sense not to study them at all. It is rather to look into their compound eyes as if they were mirrors showing our own reflections. It is a form of narcissism, and one that perhaps contains our own demise.

Our demise, because in continuing to look at nature as being about ourselves, we continue the gross error of believing we can manipulate nature, other organisms, the entire world, to our own narrow purposes without consequences. This view turns other organisms into harbingers of homo sapiens, narrows research to that which will “benefit” mankind, and misses the very strangeness of life in all its diversity and complexity. It continues the age-old world view of human dominion and fails to recognize that our “dominion” is neither a biological necessity nor a feature of the natural world. Dominion is a dangerous form of narcissism, one which a maturely scientific age should discard.

Marriage vs Mating

Yet Another Just-So Story

What is marriage? Ask an American of strong religious beliefs, and he is likely to say that it is a union between one man and one woman sanctioned by God. Ask more secular individuals, and they are likely to say that it is a civil contract between two individuals, committed to each other by love, but of practical importance in terms of legal and tax benefits, etc. Ask some biologists, and they will say that monogamous marriage is an evolutionary adaptation that increased the survival rate of helpless human infants, guaranteed to the father that the children produced by his wife were indeed his, and/or facilitated the development of human intelligence—or whatever, as long as the explanation can be stated in terms of natural selection. So at least is the impression one receives from a recent article in the New York Times (titled, somewhat misleadingly, since polygamy is discussed, “Monogamy’s Boost to Human Evolution”—but at least the title does neatly summarize the bias).

Ask an historian, a sociologist, or an anthropologist, and any one of them is likely to say that marriage practices vary over time and among cultures, from polygamy to monogamy, and that they vary by class as well. Warrior societies ranged from polygamy among the warrior elite (including kings and nobility, whose avocation was warfare and who could have both many wives and concubines) to monogamy among the commoners; polygamy is common in societies with a high mortality rate among young men (war, hunting mishaps, etc.), whereas monogamy is more common in societies in which the balance of adult males to females is more even, as well as in more egalitarian societies. Generally speaking, marriages were contracted for social purposes: to cement alliances, to protect inherited property, or to synchronize labor.

Marrying for love is a rather recent innovation, characteristic of modern individualistic (and capitalist) countries, although monogamy has long been legitimized by Christianity, in part because of its dread of sexual license. Some people get around the stricture by keeping multiple unofficial spouses; Charles Lindbergh, for example, had children in long-term relationships with three women other than his wife. Contemporary Americans seem to practice serial monogamy (divorce and remarriage) as well as unofficial and often temporary arrangements. In all cases, there has always been a whole lot of cheatin’ goin’ on. Then there is the added element of prostitution, from street walkers to courtesans, for which even the cleverest evolutionary biologist would have a hard time providing an evolutionary explanation. All of which suggests that marriage is different from mating. The latter is strictly biological: up until very recent times, there has been only one way to produce children, the sexual congress of a fertile man with a fertile woman, and this one way is unaffected by social customs. That is, socially sanctioned monogamy does not prevent either partner from producing a child with a person other than his or her spouse; eggs and sperm recognize no such boundaries.
It therefore seems both pointless and fruitless to try to concoct explanations for marriage customs and practices from natural selection. At some unknown point in the remote human past, people began creating nonbiological ways of organizing their lives. It’s what our big brains allow us to do. Mating may be in our DNA; marriage, however, is not.

Apart from the waste of time and grant money entailed in the pursuit of these evolutionary Just-So stories, the misguided notion, bordering on an ideology, that everything humans do can be explained solely in biological evolutionary terms, by a module in the brain, by DNA (i.e., instinct), denigrates other modes of knowledge that actually produce better explanations. We can learn more about marriage from historians and anthropologists than we can from biologists.

Why Determinism?

The eternal debate between determinism and free will has lately taken a new form. Determinism has been reincarnated in the shape of neuroscience, with attendant metaphors of computers, chemistry, machines, and Darwinism. Meanwhile, defenders of free will seem to have run out of arguments, particularly since, if they wish to be taken seriously, they dare not resort to a religious argument. That the debate is virtually eternal suggests that it is not finally resolvable; it could be said in fact that the two sides are arguing about different things, even though they often use the same terminology.

Determinism’s popularity is most clearly suggested by the sales figures for books on the subject and by the dominance of the view in popular science writing. Such books are widely reviewed, while those arguing for free will are neglected, especially by the mainstream press.

The question, then, is not whether we have free will, or whether we are wholly determined in all our thoughts and actions, but rather why, at this point in time and particularly in this country, determinism is so much more popular than free will.

Today’s determinism is not the same as the ancient concept of fate. Fatalism was not so much about determinism or, as the Calvinists posited, predestination; fatalism did not pretend to know what would happen, but rather held that fate was a matter of unpredictability, of whim (on the part of the universe or of the gods), and in fact left some room for free will, in a what-will-be-will-be sort of way; i.e., because outcomes were unpredictable, one had to choose, one had to act, and let the dice fall where they may. The tragic flaw of hubris captures exactly what is wrong with any determinism: the delusion that one can stop the wheel of fate from turning past its apex, i.e., that through prediction one can control.

Determinists worship predictability and control. I once read somewhere the idea, essentially Laplace’s demon, that if everything that has already happened were known, everything that will happen could be accurately predicted. Extreme as this statement is, it accurately summarizes the mindset of the determinists. It also suggests why determinism is so attractive in a scientific age such as ours, for science is not only about the gathering of facts and the formulation of theories but also about using those theories to make predictions.

Given the apparent power of science to predict accurately, and given that prediction is predicated on a deterministic stance, it is not surprising that scientists should turn their attention to the human condition, nor that scientists, being what they are, tend to look for, and find, evidence that human thoughts and behavior are determined by genes, neurons, modules, adaptations, what have you, and are therefore predictable. Nor is it surprising that, in a restless and rapidly changing world, laymen are attracted to these ideas. Certainty is an antidote to powerlessness.

If we are religiously minded, we find certainty in religion; hence the rise of politically and socially powerful fundamentalist movements today. If we are not religious, we may find certainty in New Age nostrums, ideologies, art, bottom lines, celebrity worship, or even skepticism (no one is more certain of his or her own wisdom than the skeptic). If we are politicians, we look for certainty and security in megabytes of data. If we are scientifically minded, we find certainty in science. But certainty is not science. It is a common psychological need in an age of uncertainty.

In satisfying this need for certainty, determinism often leads to excessive self-confidence and egotism—which in turn leads to simplifications and dismissal of complexity, ambivalence, and randomness. Determinism is teleology. Today’s determinists may have discarded God, but they still believe that He does not play dice. They are, in short, utopians. We all know where utopias end up. That much at least we can confidently predict.

Paleolithic Fantasies

We live in an age like all previous ages, one in which thinking people assess the state of the world, find it wanting, and consequently seek a better, even perfect, way of life. Such people tend to divide roughly into those who seek their utopia in a vision of the future (today: think digital prophets, genetically modified crops) and those who seek a return to a golden past when human beings were in perfect harmony with nature (past: think Eden and the Noble Savage; today: think organic farming, artisanal cheese). Interestingly, one finds both types among both liberals and conservatives, though usually with different emphases (liberals tend to go for the organic, conservatives for traditional morality, while both seem to think that digital technology holds great promise for the future, whether through greater community or better security). And advocates of both sides seem to appeal, either implicitly or explicitly, to “human nature” as the ultimate measure of the perfect way of life (using either Darwin or the Bible as the validating text). Thus, amid all the changes of outward circumstance, human nature is presumed to have remained unchanged through time.

Marlene Zuk, author of Paleofantasy: What Evolution Really Tells Us about Sex, Diet, and How We Live (W. W. Norton, 2013), addresses the myth, the just-so story, of a fixed human nature from an evolutionary perspective. An evolutionary biologist currently associated with the University of Minnesota, Zuk has conducted extensive field research, particularly on crickets, and is the author of numerous specialized articles and several popular books on evolutionary biology, behavioral biology, and sexual selection. She is therefore particularly well-qualified to demolish popular myths about human evolution, which she does with clarity and wit in this new book. (Her wit is best illustrated by her statement that “After all, nothing says evolution like a brisk round of the plague.”) Her immediate targets here are evo-myths about diet and health, particularly those that base their tenets on the very false idea that contemporary human beings are Paleolithic creatures uncomfortably and unhealthily stuck in an unnatural modern industrial environment. In other words, the natural man, the Noble Savage, the Eden which we have lost, is to be found in the lifestyles of early Stone Age humans prior to the development of agriculture (the true Original Sin) and settled life, that is prior to about 10,000 years ago. Supposedly, humans of the Paleolithic lived in that much admired perfect harmony with nature, and to restore our health and souls, we need to retrieve that lifestyle and apply it to our urbanized lives today.

Alas, like all utopian dreams, whether of past or future, what Zuk calls paleofantasies are exactly that, fantasies, and in the course of demonstrating just how fantastic they are, she treats her readers to a particularly clear and nonideological series of lessons on what evolution really is. And what it is not: it is not purposeful, and it is not perfect or ever perfected. Thus she demolishes the notion of the Noble Savage (by whatever name) when she writes that there is no utopian moment of perfect synchronicity between human beings and their environment. Both organisms and environments constantly change (and both humans and environments certainly did over the 2.6 million years of the Paleolithic period), and to think that today’s human beings are unchanged from those of even a mere 10,000 years ago “misses the real lessons of evolution” and “is specious” (p. 59). And lest we think that evolutionary change moves in some kind of logical direction, she writes that “evolution is more of a drunkard’s walk than a purposeful path” (p. 78).

Evolution never intends anything. It is a Rube Goldberg contraption, or rather the creatures it throws up are, because, rather than aiming at or achieving perfection, it measures success only by reproductive success. “If something works well enough for the moment, at least long enough for its bearer to reproduce, that’s enough for evolution” (p. 8). When you think about it, this is actually an excellent measure, simply because “perfection” is purely a human concept, and no one can agree on just exactly what perfection is. Should we eat only meat, because, as some paleo diet buffs claim, that’s what our Pleistocene ancestors ate? Or should we eat only raw vegetables and fruit, because, as other buffs claim, those were the exclusive menu items of our ideal past? Should we eschew grains, because they are cultivated and therefore not natural? Just exactly what would the “perfect” diet for human beings consist of?

According to Zuk, it depends. As she shows, various populations of human beings have evolved to utilize foods that our hunter-gatherer ancestors would not have been able to eat. For example, adults of some populations can digest milk, while the majority of human adults cannot (lactose intolerance). Certainly, the latter should avoid dairy, but the former can consume dairy products pretty much as they please. Insofar as the deleterious effects of agriculture are concerned, yes, it appears to be true that initially human health and well-being declined after people began cultivating grain crops and living in permanent settlements, but Zuk points out that it did not take all that long for this disadvantage to disappear; and as we know, agricultural societies grew larger and faster than foraging societies (reproductive success again being the measure of evolutionary success). Certainly some kind of genetic mutations could have occurred that conferred a greater ability to prosper on a diet high in grains; but it is also possible that as people improved their knowledge of cultivation and selectively improved the quality of their crops, and also exploited the advantages of settlements in facilitating trade, they overcame the initial disadvantages of agriculture. But whatever the case, it’s important to keep in mind that the early agricultural peoples themselves apparently thought that the advantages of agriculture outweighed its disadvantages—why else persist in farming?

An analogous point could be made about our modernity: If modern urban life is so bad for us, so unnatural and maladaptive, why did we develop it in the first place? If we are really, as some do argue, merely products of biological evolution like any other animal and, as some do argue, our consciousness is merely an illusion, how did we “evolve” a state of affairs so contrary to our biological being? And why do we cling to it so tenaciously? If it were really so horrible, wouldn’t we be fleeing the city for the more natural environments of the northern woods or western prairies (the United States’ closest approximation of the Edenic savannahs)? The fact that we do not suggests that urban industrialized life may not be so bad for humans after all. (How bad it may be for other organisms is a different question.)

Whatever the sources of some people’s dissatisfaction with modern human life, a mismatch between our Paleolithic natures and modernity is not one of them, and the appeal to evolution is, as already noted, based on a misconception of what evolution is. A major aspect of that misconception is an over-emphasis on natural selection. But as Zuk points out, “it is important to distinguish between two concepts that are sometimes—incorrectly—used interchangeably, evolution and natural selection. At its core, evolution simply means a change in the frequency of a particular gene or genes in a population” (p. 251). The mechanisms by which these gene frequency changes occur include not only natural selection but genetic drift, gene flow, and mutation. “Genetic drift is the alteration of gene frequencies through chance events” (p. 251). “Gene flow is simply the movement of individuals and their genes from place to place, an activity that can itself alter gene frequencies and drive evolution” (p. 252). “The final way that evolution sans natural selection can occur is via those mutations, changes in genes that are the result of environmental or internal hiccups that are then passed on to offspring” (p. 252). To see whether evolution is occurring in humans today, one looks not at superficially visible traits but at changes in gene frequency among human populations.
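
To make drift concrete, here is a minimal Python sketch; it is my own illustration, not an example from Zuk’s book, and the population size, starting frequency, and generation cap are arbitrary assumptions. No variant is fitter than any other, yet the allele’s frequency wanders and eventually fixes or vanishes by chance alone.

    import random

    def drift(freq, pop_size, generations):
        # Each generation, the population's 2N gene copies are drawn at
        # random from the current frequency: no selection, chance only.
        for gen in range(generations):
            copies = 2 * pop_size
            freq = sum(random.random() < freq for _ in range(copies)) / copies
            if freq in (0.0, 1.0):  # allele lost or fixed by chance alone
                return freq, gen + 1
        return freq, generations

    # Five replicate populations, identical starting conditions:
    for trial in range(5):
        final, gens = drift(freq=0.5, pop_size=50, generations=2000)
        print(f"trial {trial}: frequency {final:.2f} after {gens} generations")

Most trials end at 0.00 or 1.00: gene frequencies have changed, and so evolution, in Zuk’s sense, has occurred, though selection never entered the picture.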

Another all too common misconception is that “evolution is progressing to a goal” (p. 252), what can be called the teleological error. Even well-known and well-informed people believe that evolution is goal-directed. For example, Michael Shermer, the editor of Skeptic magazine and the author of a number of pro-evolution books, writes in The Science of Good and Evil that “Evolutionary biologists are also interested in ultimate causes—the final cause (in an Aristotelian sense) or end purpose (in a teleological sense) of a structure or behavior” (p. 8); he then states that “natural selection is the primary driving force of evolution” (p. 9). In contrast, Zuk reiterates throughout her book that “everything about evolution is unintentional” (p. 223), that “all of evolution’s consequences are unintended, and there never are any maps” designating a foreordained destination—and she is in fact an evolutionary biologist!

A good example of an unintentional evolutionary consequence is resistance to HIV, the retrovirus that causes AIDS. As it happens, some individuals are resistant or immune to the retrovirus, but not because evolution or natural selection intended them to be so. Centuries ago, bubonic plague swept through Europe; millions died of this highly infectious disease, but a few people did not get the disease despite having been exposed to it. No doubt they thought God had spared them for some divine reason. Centuries later, some of their descendants were exposed to HIV and did not become ill. Did God plan that far ahead to spare these few lucky individuals? Did evolution? No. A random mutation happened to render human cells unreadable to the plague bacterium (or, as Zuk suggests is more likely, to the smallpox virus); consequently, the pathogen could not enter the cells and wreak its havoc. The mutation would have had to occur before the introduction of the disease into the lucky few’s environment (there would not have been enough time for it to arise and proliferate after the disease’s introduction), and it may have had no prior function, good or bad. As chance would have it, centuries later, the same mutation also made its owners’ cells unreadable to the AIDS virus, thus rendering them immune to HIV—quite by chance. Pace Lamarck, perhaps we can say that it is not characteristics that are acquired, but functions. The gene mutation that confers HIV immunity has, after many generations, finally acquired a function.

Why then do organisms seem so perfectly adapted to their environments? Perhaps they are not so perfectly adapted as they appear to human eyes; more importantly, since environments change, organisms must change as well, but perhaps if they were too perfectly adapted (each and every individual of the species therefore being identical), they would rather quickly become imperfectly adapted to even small changes in their environment. Perhaps, then, perfection is an extinction trap rather than a desirable goal.