
Evolution and Theodicy

“Why is there evil in the world?” This question has been asked by philosophers, theologians, and ordinary men and women for millennia. Today scientists, particularly evolutionary biologists, neuroscientists, and evolutionary/neuropsychologists, have joined the effort to explain evil: why do people indulge in violence, cheating, lies, harassment, and so on? There is no need here to itemize all the behaviors that can be labeled evil. What matters is the question of “why?”

The question of “why is there evil in the world?” assumes the premise that evil is abnormal while good (however defined) is normal—the abnorm vs. the norm, if you will. Goodness is the natural state of man, the original condition, and evil is something imposed on or inserted into the world from some external, malevolent source. In Genesis, God created the world and pronounced it good; then Adam and Eve succumbed to the temptations of the Serpent and brought evil and therefore death into the world (thus, death is a manifestation of evil, immortality the natural state of good). Unfortunately, the Bible does not adequately account for the existence of the Serpent or Satan, so it was left to Milton to fill in the story. Gnostics, Manicheans, and others posited the existence of two deities, one good and the other evil, and constructed a vision of a cosmic struggle between light and darkness that would culminate in the triumph of good—a concept that filtered into Christian eschatology. The fact that Christian tradition sees the end times as a restoration to a state of Adamic or Edenic innocence underscores the notion that goodness is the natural, default state of man and the cosmos.

Contemporary secular culture has not escaped this notion of the primeval innocence of man. It has simply relocated Eden to the African savannah. When mankind was still at the hunter-gatherer stage, so the story goes, people lived in naked or near-naked innocence; they lived in egalitarian peace with their fellows and in harmony with nature. Alas, with the invention of agriculture and the consequent development of cities and civilizations, egalitarianism gave way to greed, social hierarchies, war, imperialism, slavery, and patriarchy: all the factors that drive people to violence, oppression, and materialism; these faults of civilization in turn drove the oppressed to violence, theft, slovenliness, and other sins. Laws, punishments, and other means of control and suppression were instituted to keep the louts in their place. Many people believe that to restore the lost innocence of our hunter-gatherer origins, we must return to the land, re-engage with nature, adopt a paleo diet, restructure society according to matriarchal and/or socialist principles, and so on. Many people (some the same, some different from the back-to-nature theorists) envision a utopian future in which globalization, or digitization, or general good feeling will restore harmony and peace to the whole world.

Not too surprisingly, many scientists join in this vision of a secular peaceable kingdom. Not a few evolutionary biologists maintain that human beings are evolutionarily adapted to life on the savannah, not to life in massive cities, and that the decline in the health, intelligence, and height of our civilized ancestors can be blamed on a change in diet brought on by agriculture (too much grain, not enough wild meat, less variety of plants) and on the opportunities for diseases of various kinds to colonize human beings crowded too closely together in cities and exposed too readily to exotic pathogens spread along burgeoning trade routes. Crowding and competition lead to violent behaviors as well.

Thus, whether religious or secular, the explanations of evil generally boil down to this: that human beings are by nature good, and that evil is externally imposed on otherwise good people; and that if circumstances could be changed (through education, redistribution of wealth, exercise, diet, early childhood interventions, etc.), our natural goodness would reassert itself. Of course, there are some who believe that evil behavior has a genetic component, that certain mutations or genetic defects are to blame for psychopaths, rapists, and so on, but again these genetic defects are seen as abnormalities that could be managed by various eugenic interventions, from gene or hormone therapies to locking up excessively aggressive males to ensure they don’t breed and pass on their defects to future generations.

Thus it is that in general we are unable to shake off the belief that good is the norm and evil is the abnorm, whether we are religious or secular, scientists or philosophers, creationists or Darwinists. But if we take Darwinism seriously, we have to admit that “evil” is the norm and that “good” is the abnorm—nature is red in tooth and claw, and all of the evil that men and women do is also found in other organisms; in fact, we can say that the “evil” done by other organisms long precedes the evil that men do, and we can also say, based on archaeological and anthropological evidence, that men have been doing evil since the very beginning of the human line. In other words, there never was an Eden, never a Noble Savage, never a long-ago Golden Age from which we have fallen or declined; therefore, there is no prospect of an imminent or future Utopia or Millennial Kingdom that will restore mankind to its true nature, because there is nothing to restore.

The evolutionary function of “evil” is summarized in the term “natural selection”: the process by which death winnows out the less fit, denying them the chance to reproduce (natural selection works on the average, meaning of course that some who are fit die before they can reproduce and some of the unfit survive long enough to produce some offspring, but on average fitness is favored). Death, usually by violence (eat, and then be eaten), is necessary to the workings of Darwinian evolution. An example: when a new lion or pair of lions defeats an older pride male and takes over his pride, the newcomers kill the cubs of the defeated male, which has the effect of bringing the lionesses back into heat so that the new males can mate with them and produce their own offspring; their task is then to keep control of the pride long enough for their own cubs to reach reproductive maturity. Among lions, such infanticide raises no moral questions; among humans it does.

There is no problem of evil but rather a problem of good: not why is there “evil” but rather why is there “good”? Why do human beings consider acts like infanticide to be morally evil while lions do not? Why do we have morality at all? I believe that morality is an invention, a creation of human thought, not an instinct. It is one of the most important creations of the human mind, at least as great as the usually cited examples of human creativity (art, literature, science, etc.), if not greater, considering how much harder won it is than those nearer competitors, and how much harder it is to maintain. Because “good” is not natural, it is always vulnerable to being overwhelmed by “evil,” which is natural: peace crumbles into war; restraint gives way to impulse; holism to particularism; agape to narcissism; love to lust; truth to lie; tolerance to hate. War, particularism, narcissism, etc., protect the self of the person and the tribe, one’s own gene pool so to speak, just as the lion kills his competitor’s cubs to ensure the survival of his own. We do not need to think very hard about doing evil; we do need to think hard about what is good and how to do it. It is something that every generation must relearn and rethink, especially in times of great stress.

It appears that we are in such a time today. Various stressors (the economy, the climate, overpopulation and mass migrations, religious conflict amid the dregs of moribund empires) are pushing the relationship of the tribes to the whole out of balance, and the temptations are to put up walls, dig trenches, draw up battle lines, and find someone other than ourselves to blame for our dilemmas. A war of all against all is not totally out of the question, and it may be that such a war or wars will eventuate in a classic Darwinian victory for one group over another—but history (rather than evolution) tells us that such a victory is often less Darwinian than Pyrrhic.


Nicholas Wade’s Troublesome Inheritance: A Critical Review

In his latest book, Nicholas Wade, a well-known science journalist, argues three points: 1) that human races are real; 2) that differences in human behavior, and likely cognition, are genetically based; and 3) that there are likely subtle but nonetheless crucial behavioral differences among races, which are also genetically based. Wade is well aware that these are extremely controversial ideas, that they overturn politically correct notions that human behavior and social structures are purely cultural, yet he is confident that developments in genetics support his view.


Marriage vs Mating

Yet Another Just-So Story

What is marriage? Ask an American of strong religious beliefs, and he is likely to say that it is a union between one man and one woman sanctioned by God. Ask more secular individuals, and they are likely to say that it is a civil contract between two individuals, committed to each other by love, but of practical importance in terms of legal and tax benefits, etc. Ask some biologists, and they will say that monogamous marriage is an evolutionary adaptation that increased the survival rate of helpless human infants, guaranteed to the father that the children produced by his wife were indeed his, and/or facilitated the development of human intelligence—or whatever, as long as the explanation can be stated in terms of natural selection. So at least is the impression one receives from a recent article in the New York Times (titled, somewhat misleadingly, since polygamy is discussed, “Monogamy’s Boost to Human Evolution”—but at least the title does neatly summarize the bias).

Ask an historian, a sociologist, or an anthropologist, and any one of them is likely to say that marriage practices vary over time and among cultures, from polygamy to monogamy, and that they also vary by class. In warrior societies, practices ranged from polygamy among the warrior elite (including kings and nobility, whose avocation was warfare and who could have both many wives and concubines) to monogamy among the commoners; polygamy is common in societies with a high mortality rate among young men (war, hunting mishaps, etc.), whereas monogamy is more common in societies in which the balance of adult males to females is more even, as well as in more egalitarian societies. Generally speaking, marriages were contracted for social purposes: to cement alliances, to protect inherited property, or to synchronize labor.

Marrying for love is a rather recent innovation and is characteristic of modern individualistic (and capitalist) countries, although monogamy has long been legitimized by Christianity, in part because of its dread of sexual license. Some people get around the stricture by having separate and unofficial multiple spouses, for example Charles Lindbergh, who had children in long-term relationships with three women other than his wife. Contemporary Americans seem to be practicing serial monogamy (divorce and remarriage) as well as unofficial and often temporary arrangements. In all cases, there has always been a whole lot of cheatin’ goin’ on. Then there is the added element of prostitution, including streetwalkers and courtesans, for which even the cleverest evolutionary biologist would have a hard time providing an evolutionary explanation. All of which suggests that marriage is different from mating. The latter is strictly biological—up until very recent times, there has been only one way to produce children, the sexual congress of a fertile man with a fertile woman, and this one way is unaffected by social customs. That is, socially sanctioned monogamy does not prevent either partner from producing a child with a person other than his/her spouse; eggs and sperm recognize no such boundaries.
It therefore seems both pointless and fruitless to try to concoct explanations for marriage customs and practices from natural selection. At some unknown point in the remote human past, people began creating nonbiological ways of organizing their lives. It’s what our big brains allow us to do. Mating may be in our DNA; marriage, however, is not.

Apart from the waste of time and grant money entailed in the pursuit of these evolutionary Just-So stories, the misguided notion, bordering on an ideology, that everything humans do can be explained solely in biological evolutionary terms, by a module in the brain, by DNA (i.e., instinct), denigrates other modes of knowledge that actually produce better explanations. We can learn more about marriage from historians and anthropologists than we can from biologists.

Empathy Imperiled: A Review

One can to some extent understand the current enthusiasm of conservatives for Darwinian deterministic explanations of human behavior, inasmuch as determinism is compatible with the views of human nature already held by conservatives. Even religious conservatives, those who go so far as to deny evolution per se, subscribe to a deterministic view. The Edenic fall, the apocalyptic view of history, etc., are elements in God’s overarching plan, and human free will is largely limited to submitting to God’s will or facing the dire consequences. Secular conservatives hold that Evolution is the grand plan (even though they usually deny teleology for appearances’ sake) and that we should submit to the inevitabilities of our genes and our Pleistocene natures. But it is puzzling that a considerable number of scholars in the humanities and social sciences submit to Darwinian explanations of art, literature, philosophy, etc.; perhaps they do so in a desperate attempt to retain “relevance” in an age when technology, science, and the MBA have the hegemonic edge.

It is especially surprising when a writer of definitely left-wing political beliefs attempts to recruit biological evolution to the socialist or communitarian cause. Such is the case, sadly, with Gary Olson’s book Empathy Imperiled: Capitalism, Culture, and the Brain (Springer, 2013). Olson is a professor of political science at Moravian College in Bethlehem, Pennsylvania, and active in liberal causes. In this book, he explores a two-part thesis: the first is that mirror neurons in the brain hardwire us for empathy; the second is that the culture of capitalism thwarts this natural empathy in favor of selfishness.

Why is his first point important to his second point? According to Olson, the existence of mirror neurons in us (and at least some other animals) has been proven by science, which in turn supports the idea that human beings are naturally (i.e., biologically) empathetic. It is not biology or our evolutionary history that makes us divisive and driven by selfishness and enmity; rather, culture, particularly capitalist culture, has thwarted this natural trait. However, while the existence of mirror neurons in macaques appears to be well established, their existence in human beings is not. Further, it is not at all certain that mirror neurons are the source of empathy. They seem instead to mirror others’ motor movements, such that when a macaque sees another macaque pick up a peanut and put it in its mouth, the first macaque can imitate that action; but it is a long way from motor imitation to empathy. By means of a non sequitur, Olson evades the problem: “The monkey’s neurons were ‘mirroring’ the activity she was observing, suggesting she was responding to the experience of another, such as when we experience empathy for someone else’s circumstances” (p. 21). As in all non sequiturs, there is some verbal sleight of hand in this sentence: from mirroring an activity (outwardly visible) to mirroring an experience (inward and subjective), and then the leap from a monkey mirroring/responding to another monkey’s actions to a human being actually feeling with another human being (and what “circumstances” are implied here?). No explanation for this leap from an activity to a subjective state is provided.

It is worth pointing out here that complex animals like macaques, chimpanzees, or humans do not consist of a single behavioral trait. Even if mirror neurons do exist in monkeys or humans, and even if we are willing to make the leap of faith that mirror neurons hardwire us for empathy, empathy is not our only behavioral trait and can quite naturally, rather than culturally, be overridden by other traits that might be more appropriate to a particular situation or circumstance. Thus a person might be empathetic one day and jealous the next, or understanding and helpful to one person and belligerent to another. None of us would hurt a fly—until the situation called for a fly swatter.

Perhaps “empathy” is a poor word, anyway. The observant macaque might use its ability to “mirror” another’s actions by stealing the peanuts; a human being who can “feel with” another person might use that to manipulate and outwit. Merely “mirroring” does not guarantee virtuous cooperation.

There are equally damaging inadequacies with Olson’s development of the second part of his thesis, that capitalism thwarts our natural empathy. He writes that “capitalism is by its very nature competitive and exploitive, not communal and empathetic except to the degree that empathy can enhance profitability” (p. 25). Well, true, at least to some extent. But is this true only of capitalism? As a leftist, Olson seems to think that it is. But Olson fails to show that capitalism is more destructive of empathy than other actual (rather than ideal) economic systems. To do so, some comparisons (other than to Cuba) would be necessary. For example, given the endemic slavery of the Roman Empire, which was not capitalist, surely we can say that Rome was destructive of empathy. Indeed, a major motive for official Roman antagonism to early Christianity was precisely its encouragement of empathy, particularly for the poor, the oppressed, and the enslaved. Ancient Greece, despite Athens’ reputation as the birthplace of democracy, also depended on slavery and denied citizen status to everyone except free-born, native-born males (i.e., not “foreigners,” women, slaves, etc.). It is the ancient Greeks who gave us the word barbarian, a pejorative for the “them” of the us vs. them dichotomy. In the Americas, aside from the Aztecs and Maya (human sacrifice, fierce warfare), there were the Iroquois, who made territorial war against their neighbors, and slavery was also practiced by numerous Indian tribes. While many sins have been committed under capitalism, so have they under all other actual economic systems.

On the other hand, some ancient sins withered away under capitalism. Chattel slavery was abolished after capitalism was established, and the vote has been extended to all adults, men and women alike, of whatever class. The various products of industrial/scientific medicine have eliminated or vastly reduced the ravages of infectious diseases, to the point where infant and child mortality has gone from being a commonplace to an exception. Smallpox, which many historians estimate killed as much as 90% of Native Americans after the arrival of Europeans, has been eliminated. These examples are not meant to absolve capitalism of its sins, but to demonstrate that political and economic systems, just like the human beings who create and sustain them, are complex mixtures and degrees of good, bad, and indifferent. Capitalism may have run its course and may, through the usual difficult process that attends major historical shifts, be replaced by something better suited to our new globalized, over-heated world, but I doubt that the new system will be as morally exemplary as many dream.

In my opinion, mirror neurons, neuroscience, genetics, etc., add little of interest or usefulness to issues of morality. In Olson’s book, the best passages are not those which unsuccessfully attempt to recruit mirror neurons to moral purposes but those which explore the profound words of Jesus (e.g., the parable of the Good Samaritan) and Martin Luther King, Jr. (e.g., King’s interpretation and application of that parable). Such wisdom does not require a pseudoscientific gloss.

Boehm’s “Social Selection”

Christopher Boehm’s book Moral Origins: The Evolution of Virtue, Altruism, and Shame (Basic Books, 2012) is yet another sad example of the futility of the widespread hope that Neo-Darwinism, as overextended by evolutionary psychology and sociobiology, can ever be a theory of everything, particularly a theory that explains modern human behavior and values. It is not science. It is an ideology, or perhaps merely a hope, dressed up in a sloppy imitation of science.

Boehm’s thesis is that human moral values, the virtue, altruism, and shame of his subtitle, evolved through a process of what he calls “social selection,” which can be defined as the selecting out of socially uncooperative individuals (whom Boehm equates with psychopaths) and the selecting in of cooperative ones. Lengthy as the book is (at 362 pages of text), with its elaborate arguments and numerous examples, Boehm fails to support his thesis with anything more than supposition and false analogies.

First let’s consider what social selection would have to do in order to affect the evolution of human beings:

1) It would require a concerted effort species-wide over a great swath of time to define, identify, and eliminate socially uncooperative individuals (psychopaths and free riders).

2) In order to affect the gene pool, undesirable individuals would have to be identified very early in life, before they had the chance to reproduce. Killing the parent without killing the child does not eliminate the parent’s genes.

3) The criteria for determining whom to eliminate would not only have to be clear but consistent over many generations. Any change in the standards midstream would ruin the whole scheme. Yet any historian can tell you that standards have changed over time, sometimes quite sharply.

There is no evidence that any of this obtained at any time in human history or prehistory. There is also no evidence that if it did occur it would have had a significant impact on human evolution. Prior to modern medicine and germ theory, infant and child mortality, not to mention plagues and epidemics that affected adults as well, would have had an impact many times that of social selection, effectively diminishing its proportionally infinitesimal effects.

In order to compensate for the serious lack of evidence, Boehm resorts to highly suppositional phrasing and subjunctive grammar. The following examples from pages 80 and 81 are illustrative of far too much of the book:

“prehistoric forager lifestyles could have generated distinctive types of social selection” (Perhaps they could have, but science wants to know if they actually did.)

These types of social selection “could have supported generosity outside the family at the level of genes.” (Again, did they actually do so?)

“were likely to have”
“could have become”
“It’s even possible . . . if”
“may have begun to differ”
“it’s likely that”
“would have been”
“would not have negated”
“they would have”
“were likely to have been”
“what could have happened”
“very likely”

And all these from just two pages! The careless or naïve reader might not notice this suppositional language and therefore mistakenly believe that Boehm is solidly establishing his argument; but the careful reader will find these to be crippling stumbling blocks.

There are also problems of self-contradiction. For example, Boehm seems to be saying that social selection eliminates psychopaths, but then states that psychopaths constitute a significant percentage of modern day populations. He claims that “People very significantly [psychopathic] probably number as high as one or more [vague: how many more?] out of several hundred in our total population,” which may not seem all that many, but perhaps too many if humans began socially selecting these people out thousands of years ago. Other sources put the percentage as low as 2% and as high as 4%, but no doubt problems of definition affect the numbers. Whatever the true number may be, I think Boehm does need at the very least to clarify just how effective social selection really is.

The examples he pulls from contemporary forager societies are also contradictory of his thesis. He cites the example of Cephu, a Mbuti Pygmy who, as recounted by Colin Turnbull, let his greed overcome his responsibility to the rest of his group. His colleagues caught him in the act of helping himself to more game than he was entitled to and subjected him to an intense course of humiliation—but they did not kill him or his progeny, and after he had adequately apologized and humbled himself, he was readmitted to the group. The story of Cephu, meant to illustrate the book’s thesis, actually proves its opposite. Cephu’s behavior was corrected not genetically, but culturally.

Perhaps a comparison would clarify the problems with Boehm’s thesis. There is another form of behavior that one might think would have been socially eliminated fairly early in human evolution: male homosexuality. It is not, after all, conducive to reproductive survival, and it has often been punished, quite horribly in many instances, not only with shunning and shaming techniques but with imprisonment, torture, and execution; yet it has persisted through thousands of years, in part because homosexuals can camouflage themselves, but also because efforts of social selection to eliminate the behavior have proven ineffectual—just as, I would argue, has social selection to eliminate socially uncooperative individuals. This analogy suggests that social selection is a very weak hook on which to hang the hope that biology and genetics can account for all human behavior in terms of “fitness.”

Finally, we should note that throughout history there have been people we would today label as psychopaths who have been quite successful leaders, often revered not only in their own times but long after their deaths. One thinks of Napoleon Bonaparte, killer of millions yet romanticized and admired by other millions, credited with the Napoleonic Code and sympathized with in his exile. One also thinks of Genghis Khan, the great butcher who, far from being selected out of the gene pool, is now thought to be the ancestor of as many as 16 million people living today. Of course, being a psychopathic great leader is no guarantee of reproductive success; Hitler, fortunately, had no children, and though he did have nieces and nephews, none of them has followed his example. While Boehm believes that psychopaths and free riders were (at least to some extent) weeded out of the gene pool through social selection, it may be that such individuals were selected for because, in some ways that we 21st-century Americans may not comprehend, they were in fact socially useful. Perhaps they made good warriors, or maybe they built the great empires that encouraged the arts and sciences, or maybe they made their liege lords great fortunes (perhaps Cortez and Pizarro were useful psychopaths, enriching the Spanish treasury while taking all the risks). What we can say is that they have been, and are, legion.


“Can Horses Feel Shame?”

This was a question posed by a friend who is both a horsewoman and an equine therapist. We discussed the question at some length over lunch, sharing our different perspectives or slants on the topic, while agreeing on the answer.

The short answer is “No.” Or rather, “Probably not.”

Why not?

First, what is “shame”? Is it merely an emotion, or is it something more complex? If it is merely an emotion, then horses and other mammals should be able to feel shame, because all mammals share the basic limbic brain structures that regulate emotions (including the relevant hormones). This includes humans, so at the basic level, humans feel the same emotions as other mammals.

However, humans have something that other mammals lack, language, by which I mean not merely communication but the ability to organize and elaborate abstract thought (concepts), particularly on a cultural rather than individual level. Language itself is a cultural rather than individual trait, which is why children must learn their native language rather than being born able to speak it instinctively. Through language/culture, we abstract and reify all our experiences, including our emotions.

We can consider the emotions as the substrate or foundation of our linguistically/culturally organized feelings. Fear, for example, can be seen as a foundation of shame, as suggested by the body language of shame, which looks much like the body language of fear (head down, tail tucked, a pulling in of the limbs, looking away from, for example, a threatening dominant animal, etc.). But shame elaborates fear, mixing it with other elements, into an abstraction, a concept which can vary markedly among cultures and situations.

Emotion fades with the withdrawal of the stimulus as the hormones clear from the body. An animal confronting a threat feels fear and reacts, but when the threat is removed, the fear abates and the animal returns to normal. Of course, if a threat is continually repeated over time, the animal will either become skittish and wary or, if it learns that the threat is actually not a threat at all, will come to ignore the stimulus. But generally, the animal’s reaction is in response to a concrete, perceived, and present threat.

Shame, however, does not fade with the removal of the (social) situation that triggered it. Because it is a concept rather than purely an emotion, shame can be recollected at a later time, when the threatening situation is long since removed, and trigger an emotional response—including the release of the associated hormones. We mull over the experience, reliving it as if it were in fact unfolding in the present, and re-feeling the emotions we experienced during the actual event. In fact, we can make the experience and the attendant emotions worse by these mental re-enactments. In so doing, we also can turn shame (as well as guilt, joy, love, etc.) from responses to motives for future actions (or inaction—it can prevent us from engaging in actions which we anticipate will bring shame upon us). Shame might cause us to plot revenge, for example, or guilt may prompt us to apologize or make it up to a person we have wronged. And that person may decide to forgive us or to punish us. For humans, all these social emotions are a two-way street, or perhaps more accurately multiple streets of many ways.

Shame can only be experienced when we feel ourselves to be negatively judged by others or by the norms of our society. We feel that we have not measured up to the expectations of others or that we have acted or thought in ways that society would disapprove; we can feel shame for actions or thoughts that no one else has seen. But first, we must have learned what our society considers shameful—this knowledge is not instinctive. Shame also, and crucially, involves empathy or a theory of mind, the ability to recognize the subjectivity of others (akin to “mind reading,” etc.).

It is likely that sociopaths do not feel shame or guilt, no matter how culturally/socially adept they may be. It is the sad situation of the sociopath that points to the positive aspects of shame, his/her pathology being precisely of the social kind—without shame, the sociopath visits misery on everyone he comes into contact with, depriving both them and himself of the joyful experiences of social life. Too much shame, or shame imposed on us by totalitarian persons and regimes, results in neuroses, but too little shame causes a lack of restraint and consideration of others. A person who feels ashamed of having been rude to another is less likely to act rudely in the future; a political leader who does not anticipate feelings of shame (perhaps because he cannot have them) will not hesitate to kill millions to achieve his ambitions. Repressing appropriate feelings of shame to protect one’s reputation or self-image conditions one to continue his/her bad behavior. Thus the popular idea that one should not feel shame (often expressed as an I-don’t-care-what-others-think attitude) has serious drawbacks. We cannot be successful social creatures without caring what others think.

The existentialists recognized the central importance of shame, as a form of self-consciousness. Sartre’s classic example of a man suddenly aware of being looked at, caught picking his nose, underscores the self-consciousness of shame; Sartre held that such self-consciousness was crucial to developing an authentic sense of self. He writes, “Nobody can be vulgar all alone!” We must be seen picking our noses in order for that act to be vulgar and to feel ashamed of our vulgarity (though as mentioned earlier in this article, we can relive and/or anticipate the shameful situation). Sartre also writes, “I am ashamed of myself as I appear to another” and “Shame is by nature recognition,” both as oneself and as an Other to another person—that is, a “self which is not myself”. (The sociopath may feel irritation or anger at being perceived negatively, perhaps precisely because he does not want to be known at all, but he will not feel shame.)

This realization of our own Other-ness to others is particularly important. Not only do we feel subjectively ourselves, and not only do we recognize the subjective existence of another person (empathy, theory of mind), but we also become aware that to that other person we are Other, in the full sense of being a subjective being in our own right. Perhaps it is in this sense that the greatest joy of love is experienced. For if as the Beloved, I am more than merely an object to the Lover (a blank screen perhaps on which he/she projects an image of himself), then I am truly loved, rather than possessed. As Simone de Beauvoir wrote, “to love him genuinely is to love him in his otherness.”

All this takes us a very long way from the simple emotions. Whether or not one agrees with Sartre and de Beauvoir, or any other thinker on the subject of shame, what is clear is that to human beings, through language/culture, shame means a great deal more than a momentary hormonal response of fear.

Animal Intelligence, Or the Cat that Came Home

An article in the New York Times of Sunday, January 20, 2013, tells the amazing story of Holly, a calico cat who found her way home after being lost by her owners almost 200 miles from home. A microchip under her skin confirmed that this was not a case of mistaken identity, of some other very similar looking cat being accepted by bereft owners, not a case of Martin (or Martina) Guerre. There are many stories of animals accomplishing similar and even greater feats of navigation—and there continue to be skeptics of the veracity of such stories, especially, oddly enough, among scientists.

Pet owners are notable, or perhaps notorious, for singing the praises of their pets’ cleverness. I am one of those. I have had multiple cats continuously for the last forty-plus years, and I can attest to their intelligence. They seem as various in smarts as human beings (and occasionally can also be quite stupid, alas). I currently have two cats, Speckles and Blackie, who are bright though not brilliant. Their mother, whom I also owned, was a genius, one of the two or three most intelligent cats I have ever known—if she had had the mouth structure to enable speech, she most definitely would have talked.

Many years ago, I had a tom named Tennessee who was a mechanical genius. For weeks he had me stumped: I would put him outside, only to find him inside a few minutes later; I would shut him inside, only to find him sauntering up to me outside in the garden shortly thereafter. He finally revealed his secret one day when he and I were both out in the driveway. Bored with my activity (I was polishing the car), he decided to go back inside. He leapt up onto the wall of the house, grabbed the wooden sill of a window, braced his back legs on the rough stucco of the exterior wall, inserted the claws of one front paw through the mesh of the window screen and pulled it open, wedged his head through the resulting gap, and pulled himself in.

I figured that I had inadvertently left the screen unhooked, so I went inside, hooked the screen, went back outside to finish my chore, and a few minutes later watched Tennessee reverse the operation and exit the house through the same window. It wasn’t I who had unhooked the screen, it was the cat! I could share many another anecdote of Tenny’s brilliance. But I have my own tale of a cat who found his way home. I once agreed to take care of a friend’s cat at my house while she was on a trip. The cat did not like being away from home and my cats didn’t like him, so one evening he escaped and disappeared. Both my friend and I were certain we had seen the last of him, but several weeks later, he reappeared at her door, somewhat the worse for wear, but alive and home at last. My friend lived five miles away from me, and to get home the cat had to cross many busy streets, a railroad, and two freeways.

Although I have experience only with cats, people with dogs and horses tell me similar stories of the intelligence of their animals. People have written books about the wit and wisdom of quail, chickens, crows, budgerigars and parrots, cows, pigs, mules, and raccoons. The evidence for the intelligence of animals is plentiful. So why, then, are we so amazed at the stories of Holly and other individual animals? Shouldn’t something so widely observed be taken as a matter of fact?

Perhaps the reason we don’t is that, despite our love of animals and the widespread practice of keeping pets, we urbanized, digital moderns don’t live in the same intimacy with animals, both wild and domestic, that our pre-industrial ancestors did. Whereas people who make their living by hunting must know the animals they hunt as a matter of survival, we who shop for meat at a supermarket, where it is pre-killed, pre-sliced, and attractively packaged in plastic, do not need to know the animals who die so that we may live. It’s easy enough to think of an overbred domestic turkey or fattened steer as just a dumb animal, if we think of them at all. It’s quite a different thing if you’re being led on a merry and often unsuccessful chase by a wily wild turkey who would prefer not to end up on your dinner plate.

That’s one reason. Another is likely the hegemony of the scientific mindset, which oftentimes requires that the obvious and commonplace be verified by the scientific method or be pronounced heresy, folklore, and superstition. If you read the New York Times article, notice the many quotations from an assortment of scientists, none of whom, it is worth pointing out, really has a clue as to how Holly accomplished her feat, though they will not admit it. They are dancing around the issue, obfuscating through academic jargon, supposing and doubting. But Holly cannot be refuted. She has the microchip to prove the veracity of her adventure.

Our ancestors did not have modern science. They did not have hypotheses and theories, and they did not have a post-Enlightenment view of living creatures as mechanical things or as (our favorite today) computers preprogrammed or “hard wired” with the “software” otherwise known as instinct. When you explain all animal behavior and abilities by “instinct,” you bias against the possibility of intelligence. That is also a good way to maintain the post-Cartesian distance between animals and humans, a distance which makes it easier to treat farm animals, for example, as so many units of production on factory farms and feedlots. Pre-scientific people did not have our contemporary concept of instinct and thus had no problem in recognizing the intelligence of the animals with whom they shared the world.

But while “instinct,” whatever exactly that is, is no doubt important for understanding the behavior of animals, when we are dealing with intelligent animals, animals who can think, as we are when dealing with most vertebrates, especially mammals, then we must concede that they are capable of doing things best explained by an appeal to intelligence rather than to instinct. After all, animals live in the same world we do. The sun rises in the east and sets in the west in their world just as it does in ours, and they can see that just as readily as we can. To assume that only humans can make intelligent use of their own sense perceptions strikes me as nonsensical, and frankly as contrary to basic evolutionary theory—for surely human intelligence did not spring from nothing by some sort of spontaneous generation, nor arrive in a simple straight line from hominid to hominin to Homo sapiens, but must have been reached by a gradual process of increasing intelligence among species of many lines, just as mammary glands and eyeballs were.

When my father retired, he bought a farm in the Ozark Mountains of northwest Arkansas, very near the homestead where he had grown up. He bought a small herd of Angus cows and a bull, with the notion of selling the calves to the feedlots and thereby making a little extra money in his retirement years. After a few years he gave up on that idea, explaining that he could not stand to see the calves hauled away in tightly packed trucks to their not so distant slaughterhouse fate. He had come to recognize the intelligence of his cows and of their bond with him (quite different from a person recognizing his bonds with a cow, horse, or dog). Just one story will illustrate what he meant: One spring, while my parents were in the house getting dinner ready, my dad heard a cow mooing loudly nearby; he looked out and saw a cow standing at the fence, her nose pointed at the house, frantically vocalizing. When he went out to see what was wrong, the cow turned and headed back into the pasture, still mooing. Dad followed as the cow led him to what he knew was the area of the pasture where the cows liked to bed down their young calves. She called and called, but her calf did not come, so Dad started searching through the tall grass, the cow following close behind. He did find the calf, comfortably curled in the grass—the little brat had simply been ignoring his mother’s summons. The cow showed clear signs of her relief at having located her baby, licking him all over as a human mother might shower kisses on a child who had stayed out after dark. There is no instinct that programs a cow to seek human help in locating her baby.

The Marvelous Learning Animal: A Review

The thesis of Arthur W. Staats’ new book The Marvelous Learning Animal: What Makes Human Nature Unique is that patterns of behavior in humans are all too often attributed to instinct and genes when in fact they are the result of learning, both individual and accumulative.  Well aware that his thesis goes against the currently popular, one might even say faddish, belief that genetics explains all, Staats develops his theory in thorough detail.  He begins the book with an indictment of what he calls “The Great Scientific Error,” i.e., the failure of Darwin and his most fundamentalist descendants, particularly in sociobiology and evolutionary psychology, to “distinguish between physical and behavioral traits” (p. 14).  Physical traits, whether eye color or the anatomy of human speech, are clearly genetic, but behavioral traits, even those which seem most rooted in anatomy, are learned.

In an interesting and insightful discussion of language, Staats argues that the great scientific error persists in large part because it is embedded in our everyday language.  As he rightly points out, “we all grew up absorbing the explanations of human behavior our language provides for us.  That language is part of each of us, rock-hard belief, so natural it is not open to consideration, giving all of us a common core theory” (p. 12).  For example, the word “altruism” is often applied to the behavior of the social insects such as honeybees and ants, in which it appears that the sterile workers sacrifice their individual chance at reproduction in service to the super-fertile queen and the safety of her offspring, thus better ensuring that their gene pool survives into future generations.  But altruism is a moral term, not a biological one, and worker bees and ants are not moral creatures because they do not have minds capable of moral thought.  Jesus was an altruist; Barry B. Benson is not.  (This reference to an animated movie is not intended to be merely clever—language that attributes thoughtful motives to creatures who cannot think, regardless of how seemingly metaphorical and innocuous, creates fantasies just as absurd.)  Likewise, consider all the trouble and misconceptions that have flowed from Dawkins’ misstep in calling genes “selfish” (an error, alas, which Staats himself also falls into:  on page 17 when he writes, “Has anyone ever found a selfish gene in an ant, let alone a human?  Has anyone ever manipulated a selfish gene and changed any behavior thereby?” he is misreading Dawkins in the same way many others have done).

The accumulated learning which Staats so beautifully describes later in his book explains the mess that language often lands us in.  As any amateur etymologist can attest, words may start out with precise denotations, but over time they become larded with multiple and often contradictory connotations.  They change their denotative meanings as well.  The word “intercourse” used simply to mean social interaction, but today refers exclusively to coitus.  I will never forget the student who wrote an entire paper on Henry James’ The Turn of the Screw maintaining that the governess was a child molester purely on the basis of one passage in which she wrote of the improvements in her intercourse with the ten-year-old Miles.

The confusions of an old language are not limited to the amusing missteps of young students.  Our current language, including the ways we use it when explaining or arguing a scientific theory, is laden with the legacy of bygone world views, including the teleological verbiage of that old-time religion.  It may be harmless that we still say that the sun rises and sets, but it is not harmless that we speak of traits evolving in order to achieve a certain survival goal or to adapt to a new environment.  Genes cannot think, so they cannot mutate towards a specific goal or purpose.  Feathers, for example, did not evolve in order to enable birds to fly; new fossil research reveals that many dinosaur species had “feathers” which could not have provided the slightest lift to their possessors—they were mutations which dinosaurs were able to use for other purposes, perhaps mating or territorial displays or insulation.  Flight feathers were a later “accident” (all mutations are accidental rather than intended).  Read any book by just about any well-known evolutionary scientist, and you will find it riddled with teleological language, including gushing over the beautiful perfections of nature that a theist preacher of old would approve of[i]—and don’t get me started on amateur science journalists!  But it must be admitted that ridding one’s language of such encrustations would be a massive and probably thankless task.

But language is not the only problem.  I suspect there are much deeper levels of ideology underlying the linguistic surface, a possibility suggested by the furious atheism of certain icons of the genetic view such as Richard Dawkins, whose book The God Delusion is an embarrassment in the history of thought.  One very serious problem pointed out by Staats is that the human genome simply does not contain enough genes to account for all the behaviors human beings have displayed over time (p. 37).  He cites the old number of approximately 100,000 genes, but the Human Genome Project discovered that in fact we have far fewer, perhaps only 20,000 to 30,000; in comparison, tomatoes have 31,760, and corn has 32,000.  The plant Paris japonica has a genome 50 times larger than the human genome.  One might wonder if the larger genomes of plants are necessary because they are sessile organisms, that is, they do not move about under their own muscle power, and so have no “behaviors.”  Perhaps what genes accomplish in plants, in humans is accomplished by learning.

In Staats’ view, learning begins with anatomy.[ii]  A four-legged creature does not require a specific instinct for walking and running on four legs; having been given those four legs by genetic inheritance, it learns how to use them in a manner appropriate to their construction.  This explains why puppies, kittens, foals and calves do not take off running as soon as they’re born; there is a learning curve, sometimes protracted as in cats and dogs, sometimes very quick, as in calves and foals, which need to be able to escape predators as soon after birth as possible.  In contrast, less brainy creatures such as insects and spiders can run about immediately—there seems to be no learning curve for their behaviors.  Humans are bipedal, but it is months after birth that we take our first hesitant steps, and years before we attain the grace and efficiency of the typical human gait.  With further directed learning we can attain the proficiency of athletes and ballet dancers, gymnasts and marching soldiers; no other creature can achieve such diversity and high levels of locomotive skills.  And while we may think that such things as male aggression are instinctive, Staats asserts that they are learned.  A large muscular male will quickly learn that he can get what he wants (in behavioral terms, his rewards) through physically aggressive action—whereas a smaller, more gracile male will learn that social skills are more effective for him than is aggression.

Language, too, is not an instinct, but a cultural or learned behavior.  Yes, it is dependent on certain anatomical features, such as flexible lips and tongue, and of course a brain capable of managing the complexities of language.  But these features did not evolve in order to bless Homo sapiens with speech.  The structure of the human mouth and throat are in large part the effects of upright posture, of walking on two legs rather than four, and were already in place when our ancient ancestors first experimented with making controlled vocalizations that, through a process of accumulative learning over generations, were eventually developed into symbolic language.  (Staats suggests this occurred as early as the australopithecines, on the grounds that their tool-working abilities suggest accumulative learning, that is learning passed down from one generation to the next and modified and expanded upon by successive generations, i.e., culture, but there is no way of proving that.)  Just as feathers were not at first used for flight, so the anatomical features that enable language were not at first “intended” for it.[iii]

What is the role of the brain in all this?  Staats asserts that the particularly stereotyped and species-specific behavior patterns that we call instincts do not exist in humans, are not “hard-wired” in the brain.  Rather, the human brain is a learning machine that gets most of its “programming” from the environment, including other human beings.  We begin learning from the moment of birth and, potentially at least, continue learning throughout our lifetimes.  I won’t repeat the extensive explanations Staats provides for this view, but I will say that he makes his case.  That learning trumps instinct in human beings is evidenced by our flexibility in a wide variety of environments, not only as manifested today but throughout our history.  Learning certainly would serve a creature such as humans much better than instinct as we move about and interrelate with others in an environment rich in both patterns and surprises.

Although he makes no mention of it, I believe Staats would not be impressed by the naturist just-so story that because humans evolved on the African savannahs we are maladapted to our urban, digital social world of today.  Our culture is the result of learning accumulated over millennia; it’s a world we have made, and lacking instincts, we are perfectly capable of adapting to it.  The challenge is to continue learning and not to be discouraged by theories that say we are driven by instincts over which we have no control.

[i] For a rare exception, see John Gray’s Straw Dogs.

[ii] This idea is not original to Staats.  Lucretius stated the same idea over 2000 years ago.

[iii] See also Fodor and Piattelli-Palmarini, What Darwin Got Wrong, for a discussion of insect wings as originally functioning as temperature regulators rather than for flight (p. 87).  See also R. C. Lewontin’s review of their book in the New York Review of Books, 27 May 2010.

Notes on Language Origin

Regardless of the extent to which the capacity for language is instinctive, the fact is that language per se is an invention—or more accurately, individual languages are inventions.  Whatever the “language instinct” is, it does not provide specific words, grammatical structures, or degrees of concreteness and abstraction.  These vary too widely to be instinctive.  It thus seems to me that the efforts of evolutionary psychologists to pinpoint the “modules” responsible for various aspects of language are, to say the least, unilluminating.

Go to the Notes on Language Origin page to read the full essay.