Tag Archives: evolutionary psychology

“The Elephant in the Brain”: A Review

One thing I’ve learned from years of wide reading is that every text, whether fiction or nonfiction, article or book, consists of two levels: the subject matter and the agenda. The subject matter is, basically, the topic or explicit contents of the text (e.g., the presidency of Andrew Jackson), and the agenda is the real point (sometimes stated, often hidden) of the text (Jackson prefigures the populist nationalism of Donald Trump, with likely similar results). The “elephant in the brain” that constitutes the subject matter of this book is the largely unconscious work we do to deny (to ourselves and others) the hidden motives that drive our everyday behavior, how in fact we “accentuate our higher, purer motives” over our selfish ones. Kevin Simler and Robin Hanson describe the ways in which a number of these motives (competition, status seeking, self-protection, self-esteem, etc.) are expressed in various contexts, such as conversations, consumption, art, education, religion, and so forth.

They are, however, far from being the first to explicate such foibles of the human psyche. There is a vast literature, from ancient times to modern, that has already explored the same phenomena—one thinks of moments in the Old Testament (“the heart is deceitful above all things”), of Sir Walter Scott (“Oh what a tangled web we weave/When first we practice to deceive”), of Erasmus’s “In Praise of Folly” (“No man is wise at all times, or without his blind side.”), of Charles Mackay’s “Extraordinary Popular Delusions and the Madness of Crowds,” or of just about anything by Freud. Nor should we exclude Thorstein Veblen’s concept of “conspicuous consumption” from this short exemplary list, as it anticipates Simler and Hanson’s chapter on consumption.

That Simler and Hanson have little new to say about our propensity for self-deception does not mean they can’t occasionally be interesting and even fun. Their chapter on medicine makes an excellent point: that much (too much) of our consumption of health care exceeds our need for it; that such consumption amounts to “conspicuous caring,” a kind of status seeking or signaling that is analogous to Veblen’s conspicuous consumption, with the same kind of wastefulness of resources. Their chapters on education and charity demonstrate that, as they put it, “many of our most cherished institutions—charities, corporations, hospitals, universities—serve covert agendas alongside their official ones. Because of this, we must take covert agendas into account when thinking about institutions, or risk radically misunderstanding them.”

I can think of an example: At my state university, the football and basketball head coaches make $2,475,000 and $2,200,000 (in base salary) respectively, while their putative boss, the university president, makes a base salary of $800,000. The university’s mission statement makes no mention of either the football or the basketball program; instead it brags about research, learning, career success—the usual academic checkpoints. Yet money speaks louder than words.

But otherwise, the authors have little new to say about our hidden motives, so why then have they written this book, and why has it been received so positively by reviewers and readers? Because it fits smoothly into our contemporary Zeitgeist, in which everything is explained (or re-explained) in terms of 1) digital technologies and 2) evolutionary theory (especially “fitness”). It’s as if these two paradigms supply us with the long-sought theory of everything and thereby relegate all that has come before to the dustbin of error and superstition; consequently, everything, including our propensity for denying our own motives, must be explained as if it had never been explained before, as if for the very first time in all of human history and thought, we have identified and explained the sources of all of our traits and quirks.

Now, while the hyper-reductionist world view of the digerati is not directly mentioned in the book, nor explicitly appropriated as a supportive argument, it is nonetheless fundamental to the thesis of the authors, both of whom are full-fledged members of the geek collective: Hanson is an economist with a degree in physics, a devotee of AI and robotics, and the author of “The Age of Em: Work, Love, and Life When Robots Rule the Earth” (2016); he has arranged to have his brain cryogenically frozen, perhaps in hopes that in the not-so-distant future it will be one of the brains (the brain?) downloaded into all those em robots that will soon take over the world. His co-author Kevin Simler (shown in his blog photo as a typical Silicon Valley boy in tee shirt, hoodie, and little wire-rimmed glasses) has studied philosophy and computer science (not as contradictory as one might assume) and has had an extensive career with start-ups. Both separately and in collusion, they view the world through algorithm-tinted glasses. There are certain background assumptions that, while unspoken in this book at least, nevertheless shape their conclusions.

One of the most important of these is the notion that the brain is a computer that runs on programs and apps (“modules,” etc.)—consciousness is the screen behind which these programs are silently running without our awareness, invisible but determinative. Which brings us to the second of their paradigms, evolution—especially the notion that unconscious instincts, running determinatively behind our conscious minds, are not only sufficient to explain all of human activity but have only one goal, to win the mating game. And while they may be expert in digitization, they are rank amateurs when dealing with supposedly Darwinian explanations of human behavior (neither has a background in neuroscience, evolutionary biology, or anthropology, the fields most relevant to their claims).

This lack is most evident in their chapter on art, which they claim is nothing more than a means of signaling fitness, i.e., that the production of art signals to prospective mates that the artist has the vigor and strength to waste on nonproductive or impractical activities (“survival surplus”). They provide only scenarios (made-up illustrative fairy tales), tired analogies (bowerbirds), and modals (words such as “may” or “may have,” as in “Art may have arisen, originally, as a byproduct of other adaptations”; note the provisional quality here: it “may have” [but maybe it didn’t] and “originally” [but maybe now it’s something else?]). We have almost no evidence of how or why human beings first began making art; there is no evidence that the making of art has enhanced the reproductive success of any artist; indeed, many of our most famous artists had no offspring at all. Maybe some of them had more sex than average, but evolutionarily that doesn’t count if it resulted in no children (and grandchildren, etc.; Shakespeare had children, but he has no living descendants today).

It is true that art, or at least the collecting of art, can enhance a person’s social status, but it’s interesting to note that enhanced social status does not necessarily result in more descendants. Indeed, at least in our current world, poorer people tend to have more children than the rich, despite the fact that the poor do not have the wherewithal to purchase prestige works of art.

There is also the problem of ethnocentrism (including what we can call present-centrism): “One study, for example, found that consumers appreciate the same artwork less when they’re told it was made by multiple artists instead of a single artist . . .” (p. 194). These few words reveal a) that neither author has the faintest acquaintance with the history of art, either Western or world; b) that they are unaware that the idea of the individual artist we’re familiar with today is an historically recent development that even in the West did not exist before the Renaissance; c) that they are unaware that even in the Renaissance, the great artists whom we revere today did not do all the work on a canvas (for example) themselves but rather headed studios in which much of the grunt work (backgrounds, preliminary work, etc.) was done by apprentices (a modern example, of course, is Andy Warhol’s Factory); and d) that they are obviously unaware that in many cultures (including traditional Chinese culture) originality was not valued. Simler and Hanson’s theory ignores the vast majority of artists and arts of the world and is therefore without merit.

Frankly, I don’t believe that the real issue for Simler and Hanson is evolutionary fitness; rather, it’s their bias towards the “practical” and against the “impractical” that’s at play here, as demonstrated by their chapter on education. Consider this passage: “In college we find a similar intolerance for impractical subjects. For example, more than 35 percent of college students major in subjects whose direct application is rare after school: communications, English, liberal arts, interdisciplinary studies, history, psychology, social sciences, and the visual and performing arts” (p. 228). Say what? Thirty-five percent of all students is a sizable chunk and hardly indicative of an “intolerance for impractical subjects.” And given their earlier argument that art enhances the fitness of its practitioners, one wonders why it’s included on this list of impractical subjects. Such an egregious failure to pay attention to the illogic of their own arguments suggests that the authors have an unacknowledged agenda.

It’s an agenda that the authors themselves may not be fully aware of: by reducing the human mind to nothing more than the usually unconscious expression of instincts, they can convince themselves and their naïve readers that the mind can be reduced to a network of programs and applications. Thus they can justify the fantasy that AI can duplicate and eventually replace the human mind altogether; they can envision a future in which minds (particularly their minds) can be uploaded to a computer or to multiple robot machines and thereby defy mortality (the law of entropy) and achieve that Faustian dream (nightmare?) of complete power and knowledge—at least for them, not for the masses. This is nothing but digital-age Social Darwinism. But as Jaron Lanier recently wrote, “Every time you believe in A.I. you are reducing your belief in human agency and value.”


Evolution and Theodicy

“Why is there evil in the world?” This question has been asked by philosophers and theologians and ordinary men and women for millennia. Today scientists, particularly evolutionary biologists, neuroscientists, and evolutionary/neuropsychologists, have joined the effort to explain evil: why do people indulge in violence, cheating, lies, harassment, and so on? There is no need here to itemize all the behaviors that can be labeled evil. What matters is the question of “why?”

The question of “why is there evil in the world?” assumes the premise that evil is abnormal while good (however defined) is normal—the abnorm vs. the norm, if you will. Goodness is the natural state of man, the original condition, and evil is something imposed on or inserted into the world from some external, malevolent source. In Genesis, God created the world and pronounced it good; then Adam and Eve succumbed to the temptations of the Serpent and brought evil and therefore death into the world (thus, death is a manifestation of evil, immortality the natural state of good). Unfortunately, the Bible does not adequately account for the existence of the Serpent or Satan, so it was left to Milton to fill in the story. Gnostics, Manicheans, and others posited the existence of two deities, one good and the other evil, and constructed a vision of a cosmic struggle between light and darkness that would culminate in the triumph of good—a concept that filtered into Christian eschatology. The fact that Christian tradition sees the end times as a restoration to a state of Adamic or Edenic innocence underscores the notion that goodness is the natural, default state of man and the cosmos.

Contemporary secular culture has not escaped this notion of the primeval innocence of man. It has simply relocated Eden to the African savannah. When mankind was still at the hunter-gatherer stage, so the story goes, people lived in naked or near-naked innocence; they lived in egalitarian peace with their fellows and in harmony with nature. Alas, with the invention of agriculture and the consequent development of cities and civilizations, egalitarianism gave way to greed, social hierarchies, war, imperialism, slavery, patriarchy, all the factors that cause people to engage in violence, oppression, materialism, and so on; further, these faults of civilizations caused the oppressed to engage in violence, theft, slovenliness, and other sins. Laws and punishments and other means of control and suppression were instituted to keep the louts in their place. Many people believe that to restore the lost innocence of our hunter-gatherer origins, we must return to the land, re-engage with nature, adopt a paleo diet, restructure society according to matriarchal and/or socialist principles, and so on. Many people (some the same, some different from the back-to-nature theorists) envision a utopian future in which globalization, or digitization, or general good feeling will restore harmony and peace to the whole world.

Not too surprisingly, many scientists join in this vision of a secular peaceable kingdom. Not a few evolutionary biologists maintain that human beings are evolutionarily adapted to life on the savannah, not to life in massive cities, and that the decline in the health, intelligence, and height of our civilized ancestors can be blamed on the negative effects of a change in diet brought on by agriculture (too much grain, not enough wild meat and less variety of plants) and by the opportunities for diseases of various kinds to colonize human beings too closely crowded together in cities and too readily exposed to exotic pathogens spread along burgeoning trade routes. Crowding and competition lead to violent behaviors as well.

Thus, whether religious or secular, the explanations of evil generally boil down to this: that human beings are by nature good, and that evil is externally imposed on otherwise good people; and that if circumstances could be changed (through education, redistribution of wealth, exercise, diet, early childhood interventions, etc.), our natural goodness would reassert itself. Of course, there are some who believe that evil behavior has a genetic component, that certain mutations or genetic defects are to blame for psychopaths, rapists, and so on, but again these genetic defects are seen as abnormalities that could be managed by various eugenic interventions, from gene or hormone therapies to locking up excessively aggressive males to ensure they don’t breed and pass on their defects to future generations.

Thus it is that in general we are unable to shake off the belief that good is the norm and evil is the abnorm, whether we are religious or secular, scientists or philosophers, creationists or Darwinists. But if we take Darwinism seriously, we have to admit that “evil” is the norm and that “good” is the abnorm—nature is red in tooth and claw, and all of the evil that men and women do is also found in other organisms; in fact, we can say that the “evil” done by other organisms long precedes the evil that men do, and we can also say, based on archaeological and anthropological evidence, that men have been doing evil since the very beginning of the human line. In other words, there never was an Eden, never a Noble Savage, never a long-ago Golden Age from which we have fallen or declined—nor, therefore, is there any prospect of an imminent or future Utopia or Millennial Kingdom that will restore mankind to its true nature, because there is nothing to restore.

The evolutionary function of “evil” is summarized in the term “natural selection”: the process by which death winnows out the less fit from the chance to reproduce (natural selection works on the average, meaning of course that some who are fit die before they can reproduce and some of the unfit survive long enough to produce some offspring, but on average fitness is favored). Death, usually by violence (eat, and then be eaten), is necessary to the workings of Darwinian evolution. An example: When a lion or pair of lions defeat an older pride lion and take over his pride, they kill the cubs of the defeated male, which has the effect of bringing the lionesses back into heat so that the new males can mate with them and produce their own offspring; their task is then to keep control of the pride long enough for their own cubs to reach reproductive maturity. Among lions, such infanticide raises no moral questions, whereas among humans it does.

There is no problem of evil but rather a problem of good: not why is there “evil” but rather why is there “good”? Why do human beings consider acts like infanticide to be morally evil while lions do not? Why do we have morality at all? I believe that morality is an invention, a creation of human thought, not an instinct. It is one of the most important creations of the human mind, at least as great as the usually cited examples of human creativity (art, literature, science, etc.), if not greater, considering how much harder it is to achieve than these nearer competitors, and how much harder to maintain. Because “good” is not natural, it is always vulnerable to being overwhelmed by “evil,” which is natural: peace crumbles into war, restraint gives way to impulse, holism to particularism, agape to narcissism, love to lust, truth to lie, tolerance to hate. War, particularism, narcissism, etc., protect the self of the person and the tribe, one’s own gene pool so to speak, just as the lion kills his competitor’s cubs to ensure the survival of his own. We do not need to think very hard about doing evil; we do need to think hard about what is good and how to do it. It is something that every generation must relearn and rethink, especially in times of great stress.

It appears that we are in such a time today. Various stressors (the economy, the climate, overpopulation and mass migrations, religious conflict amid the dregs of moribund empires) are pushing the relationship between the tribes and the whole out of balance, and the temptation is to put up walls, dig trenches, draw up battle lines, and find someone other than ourselves to blame for our dilemmas. A war of all against all is not totally out of the question, and it may be that such a war or wars will eventuate in a classic Darwinian victory of one group over another—but history (rather than evolution) tells us that such a victory is often less Darwinian than Pyrrhic.

Donald Trump: Psychoanalysis vs. Ethics

Is Donald Trump a narcissist? Is he a psychopath? Is he mentally unstable? These questions, and others of the same ilk, have been asked (and often answered in the affirmative) throughout the primary campaign season. To a lesser extent, similar questions have been asked about his followers. There has been, in other words, a lot of psychoanalyzing. It’s as if the DSM-5, the Diagnostic and Statistical Manual of Mental Disorders, Fifth Edition, has become the primary guide to politics and politicians.

Hillary Clinton has also, and for a longer time (at least since the Lewinsky scandal), been subjected to armchair and coffee house analysis (she’s in denial, etc.), even though, given that she is, for a politician, a surprisingly private person (i.e., uptight? Secretive? Not warm?), one wonders how anyone can legitimately diagnose her. Bill Clinton has also, of course, been parsed and dissected (narcissist, sex addict, etc.). Surprisingly, there has been little psychoanalysis of Bernie Sanders, perhaps because, as Hillary’s gadfly, he has dominated the high ground of principle.

Perhaps when a serious candidate actually has principles and stays consistent with them, psychologizing is unnecessary and even irrelevant. Principles have the effect of overriding personal quirks and biases. They are not generated from within this or that individual, and therefore are not reflective only of that individual, but are generated in a long process of shared thought. We come to principles through reason (Hannah Arendt might have said, through reason paired with imagination), not through impulse; indeed, the point of principle is to put a bridle on impulse, to restrain the impetuousness of the moment in favor of the longer, wider view. In Pauline terms, it replaces the natural or carnal man with the spiritual man; in late Protestant terms, it replaces immediate with delayed gratification.

So while Trump may or may not be a psychopath, a narcissist, or mentally unstable or ill, which none of us can really know, he is an unprincipled man. His constant shape-shifting, self-contradictions, denials, and off-the-cuff bluster are the signs of an impulsive man whose thoughts and words are not subjected to the vetting of a set of principles that can tell him whether he is right or wrong. He has at long last no shame, no decency, because he has no principles to tell him what is decent or shameful. In other words, he is typical of human beings, men and women, when they have nothing higher or wider than themselves as guides to behavior. This is not the place to go in depth into the utility of moral principle, but just as an example, something as simple as “do unto others as you would have others do unto you” can restrain the natural selfish impulse to grab as much as you can for yourself.

Anyone who has taken an introductory course in psychology or who has paged through any of the editions of the DSM has found plenty of evidence that they are in some way or another mentally unstable or unhealthy. Just about anyone can look at the list of defining characteristics of, say, narcissistic personality disorder (do you think you are special or unique?), or antisocial personality disorder (are you opinionated and cocky?), or perfectionism, and wonder, in a bit of self-diagnosis, if they should seek help. Welcome to the cuckoo’s nest. Or rather, welcome to humanity.

But for the concept of a disorder to exist, there has to be a concept of an order, i.e., a definition of what being a normal person is. Ironically, psychology is of no help to us here. The DSM-5 is nearly one thousand pages long and, according to its critics, adds ever more previously normal or eccentric behaviors to its exhaustive, not to say fatiguing, list of mental maladies. Its critics also charge that it provides ever more excuses for psychiatrists and physicians to prescribe very profitable drugs to what are really just normal people. After all, they point out, life is not a cakewalk, and people are not churned out like standardized units.

Principle, i.e., morality, ethics, on the other hand, can be of great help here. It is obvious that the followers of Trump have not been dissuaded from supporting him by the amateur psychoanalyses of pundits and opponents. Clearly they like those traits which the alienists are diagnosing. But what if someone started criticizing him on moral grounds, what if someone performed something analogous to “Have you no decency, sir?” This question, posed by Joseph N. Welch to Senator Joe McCarthy during the televised Army–McCarthy hearings of 1954, was a key moment in the demise of one of the worst men in American political history. Welch did not psychoanalyze McCarthy, nor did Edward R. Murrow in his famous television broadcast on McCarthy’s methods, and McCarthy was not taken away in a straitjacket. He was taken down by morally principled men and women who had had enough of his cruelty and recklessness.

Plato’s Cave, Inside Out

The original story of Plato’s cave can be summarized as follows: A group of men are bound inside a cave with a wide entrance, through which the sun streams, projecting shadows on the back wall of the cave. The men’s shackles force them to face that back wall, so that all they can see are the shadows, moving back and forth across the wall. They are watching a kind of shadow play, which, however, they take for reality, as it is the only thing they can see. One day, the men are set free and dragged out of the cave into the sunlight, where they can see for the first time that the shadows they took for reality were cast by other men walking back and forth in front of the cave, carrying various objects as they went about their business. For the first time in their lives, these former prisoners realize that the things they had believed to be real were merely insubstantial silhouettes of the actual objects that cast the shadows.

This parable has traditionally been understood to explain Plato’s philosophical Idealism, that is, that the objects of the world as we perceive them are imperfect embodiments of the ideal Forms, which are the real things of the Cosmos. Thus, for example, that table in the dining room is a representation, so to speak, of the ideal form of “Table,” which, unlike your dining table, is immaterial, perfect, eternal, and the “idea” that informs all tables—dining and kitchen, coffee and end, writing and conference, etc. All specific things of the material world are likewise merely expressions of their ideal forms. Thus, the “idea” of a thing is its truth—the material embodiment of the idea is imperfect, temporary, and therefore in a sense “false.”

The task of philosophy is to contemplate the ideal forms, not the imperfect expressions of them; this puts the “idea” above everything. One can see why Plato’s view has a great deal of appeal to philosophers and other types of intellectual, including, all too often, ideologues, for whom an ideology (“a system of ideas and ideals, especially one that forms the basis of economic or political theory and policy”) trumps practicality (and oftentimes, morality). Whether or not Plato and his legion of descendants believed in a literal heaven of ideal forms, in practice they have behaved as if their ideas were in fact perfect, eternal, “self-evident,” and true, truer than experience and superior to the stubborn resistance of material things to being shaped according to these truths. For these types, reality is a sin against reason.

So let us attempt to correct Plato’s parable: The prisoners in the cave are not trapped in the material world, but in the confines of their own minds; they are contemplating the flickering shadows of their own thoughts, stripping away the particulars of individual objects and constructing vast theories on the basis of these one-dimensional, flat, featureless cutouts. (It is worth noticing that shadows are also dark, i.e., the blocking out or absence of light, as when one stands in the shadow of a tree or building.) Once the prisoners are freed, they can see that what they thought was real (their own thoughts) was not real at all.

It is the material world of particular objects, particular individual persons for example, as well as trees, vases, tables, songs, flowers, dogs, etc., that is filled with real things, the ideas of which are figments piled on figments unto confusion. Ideas uninspired and uncorrected by reality can lead us very far astray.

A relevant quotation:
“I ran out of interest in my own consciousness around 1990, but there’s no reason ever to run out of interest in the world.” –Crispin Sartwell, “Philosophy Returns to the Real World,” The New York Times, April 13, 2015

We Are All Still Animists

“[Children do not] have to be taught to attribute people’s behavior to the mental states they’re in. Children tend, quite naturally, to anthropomorphize whatever moves. What they have to learn is which things don’t have minds, not which things do.”
–Jerry Fodor (“It’s the Thought That Counts,” London Review of Books, November 28, 1996.)

Iconoclastic statements have always appealed to me, particularly because they cause me to look at the iconic statements they are set against in a new and critical light. Sometimes the iconic statements survive the scrutiny; oftentimes they don’t. In this case the iconic statement, that children learn that other people have minds of their own (theory of mind) over time, seems commonsensical until it is re-read in light of Fodor’s statement. Then it appears less evidently true.

Look at the first part of Fodor’s statement, that children “quite naturally . . . anthropomorphize whatever moves.” To anthropomorphize is to attribute human characteristics, in particular a mind with such things as motives, desires, feelings, etc., to nonhuman things. But, in my experience, not just to things that move (pets, for example), but also to things that don’t move: Dolls and figurines don’t move, though they look like they could, but small children also attribute feelings to objects that, to an adult, clearly are inanimate, such as blankies and other favored possessions; hence their sense of tragedy when the blankie disappears into the laundry hamper, or the favorite rubber ball deflates.

To read the full article, click here.

Nicholas Wade’s Troublesome Inheritance: A Critical Review

In his latest book, Nicholas Wade, a well-known science journalist, argues three points: 1) that human races are real, 2) that differences in human behavior, and likely cognition, are genetically based, and 3) that there are likely subtle but nonetheless crucial behavioral differences among races which are also genetically based. Wade is well aware that these are extremely controversial ideas, that they overturn politically correct notions that human behavior and social structures are purely cultural, yet he is confident that developments in genetics support his view.

Click here to read the full article.

Ethics and Human Nature

It is an unhappy characteristic of our age that certain ignoramuses have been elevated to the ranks of “public intellectual,” a category which seems to consist of men and women who provide sweeping theories of everything, especially of everything they know nothing about. Into this category fall certain writers whose sweeping theory is that, prior to the Enlightenment, everyone lived in abject superstition and physical misery. With the Enlightenment, reason and science began the process of sweeping away misery and ignorance, clearing the field for the flowers of prosperity and knowledge. Such a sophomoric view of human history and thought has the virtue (in their minds only) of rendering it unnecessary for them to acquaint themselves with a deep and nuanced knowledge of the past, an error which permits them to attribute all that is good in human accomplishment to the age of science and all that is bad to a dark past best forgotten.

Nowhere is this more evident than in the recent fad for publishing books and articles claiming that science, particularly evolutionary science, provides the necessary and sufficient basis for ethics.

To read the article, click here.

The Mismeasure of All Things

Some 2,500 years ago, Protagoras said that man is the measure of all things. By this he meant something like this: mankind can know only that which it is capable of knowing, which in effect is a recognition that the human mind does have its limits; but Protagoras’ statement has often been taken to mean that man is the standard by which all other things are to be measured, i.e., that mankind is the standard of comparison for judging the worth of everything else. This meaning may have been colored by the Christian concept of man as the object of divine history, of man as just a little lower than the angels. The Christian concept, in its turn, derives from a common interpretation of the creation story in Genesis, in which God gives man dominion over the rest of earthly creation.

However, while both Protagoras’ saying and the Genesis story carry the concept forward through history, neither explains how the idea actually originated. It may have been Giambattista Vico (1668-1744) who first recognized that it is ignorance rather than knowledge that makes man the measure of all things: “When men are ignorant of natural causes producing things, and cannot even explain them by analogy with similar things, they attribute their own nature to them.” That is, when primitive men and women surveyed the world and sought explanations of phenomena, they had nothing to go by other than what they knew about themselves, so that, for example, a terrible destructive storm could be explained as the anger of the gods, since when human beings became angry they too engaged in destructive behavior; or when a gentle rain caused plants to grow, the gods were in a good mood, perhaps pleased by some human act of worship, because when humans were in a good mood, they engaged in benevolent acts. After all, the earliest humans could not have had any knowledge of the material causes of storms, droughts, etc., nor of course of animal behavior, which they attributed to motives much like their own. As Stephen Toulmin and June Goodfield summarize Vico’s views, in primitive mythologies people “could measure the world of Nature only by that which they already knew—namely themselves” (The Discovery of Time).

Both Protagoras and Genesis simply give more sophisticated glosses on this primitive impulse. They reflect the increasing body and complexity of knowledge developed by ancient civilizations, particularly those that had developed writing systems, which in turn enabled them to impose order on what had been a plethora of local myths and their variants. Simply by creating relatively coherent pantheons containing gods with discrete attributes, roles, and positions in a divine hierarchy, ancient civilizations were able to organize their intellectual world and provide authoritative explanations. Monotheism carried this further, by providing an even more unified world view, but it also somewhat depersonalized the concept of God, making him more abstract and less personal (e.g., no images or idols, no household god or genie of the local spring, etc.). This was an important achievement in the ongoing development of knowledge, a necessary step in the process that led to the state of knowledge we enjoy today, in large part because it put more emphasis on cerebral and intellectual rather than personal and experiential modes of understanding—in a sense, creating theory to replace myth. Thus we see the Greek philosophers creating the first science and the Jews creating the first inklings of theology and, importantly, teleology (a sense of history with a goal towards which it was moving). Nevertheless, the Judeo-Christian god retained strong anthropomorphic features, especially in the popular imagination and in visual arts, in which, for example, God the Father was usually depicted as a white-haired old man. Perhaps as long as most people were illiterate and dependent on visual media for their abstract knowledge, anthropomorphism was to be expected.

The Western European, Christian intellectual (literate) tradition combined these two strands of ancient thought, the scientific/philosophical with the historic/teleological, setting the stage for a modern world view that sees the world as making coherent sense and as operating according to consistent, universal laws, which then can be exploited by human beings for their own betterment. As scientific knowledge expanded and material explanations could be provided for phenomena that once were viewed as signs of divine intervention, God receded to the back of men’s minds as less necessary to explain the world—at best, perhaps, He became little more than the Prime Mover, the one who got it all started or the one who established the universal laws which continue to operate without His immediate intervention. But if the Age of Reason or the Enlightenment put God into retirement, it did not give up the belief in coherent laws and the quest for universal theories, nor did it give up the teleological view of history.

It is important to note that the teleological view is always a human-centered view; history, whether of cosmos, nature, or society, was still about man; very few thinkers hazarded to speculate that man might be merely one among many creatures and phenomena rather than the point of the whole enterprise. In this sense, at least, the early modern era retained the primitive impulse to both anthropomorphism and anthropocentrism. The widespread acceptance of Darwin’s theory of evolution by means of natural selection did little, indeed perhaps nothing, to change that for most people. It was not difficult to switch from believing that God had created man for dominion over nature and as the center of the historical story of fall and redemption, to believing that evolution is teleological, both in the sense of inevitably leading to the emergence of Homo sapiens as the crowning outcome of the evolutionary process and in the sense of evolution as a progressive process. And it was easy enough, in the context of nineteenth-century capitalism, to believe that modern industrial culture was the natural continuation of progressive evolution—indeed was its goal.

It took a generation or more for it to dawn on people that Darwinism, along with the geological discoveries regarding the great age of the earth and the astronomers’ and physicists’ discoveries of the even greater age of the universe, implied there is no god at all, not even the reticent god of the Deists. One would think that once this implication struck home, both the teleological and the anthropocentric views would fade away. But, perhaps due to human vanity, neither has done so.

In a supremely ironic twist, both teleology and anthropocentrism have been inverted. Whereas the theological age measured other creatures in human terms, the evolutionary age measures humans in animal terms. We are no longer a little lower than the angels but only a little bit higher than the other animals—or maybe not even that. We are naked apes, talking apes, singing apes. We are like social insects, we are vertebrates, we are aggressive because we are animals seeking to maximize our survival, we are merely transportation for the real biological players, selfish genes. We are not rational or conscious, we do not have free will, we operate by instinct, each of our seemingly advanced traits is hard-wired. Our morality is nothing more than an adaptation. We take a word like altruism, which originally meant a certain kind of human behavior, apply it to ants, where it becomes a description of instinctive eusocial behavior, and then re-apply that meaning back onto humans, thus making us just like all the other animals. Therefore, we study them in order to understand ourselves. We focus on the similarities (often slim) and ignore the differences (often radical).

This continues the old habit of anthropomorphism in new guise and fails to recognize the independent existence of other creatures—their independent lines of evolution as well as their ontological separateness from us. We unthinkingly repeat that humans and chimps share 96 percent of their genes (or is it 98 percent?), as if that meant something—but then, it’s said we share 97 percent of our genes with rats. We neglect to mention that apes and humans diverged from each other some 7 to 8 million years ago and have followed independent lines of evolution ever since. We are not apes after all.

Consider the fruit fly, that ubiquitous laboratory subject which has yielded so much knowledge of how genes work. It is often cited as a model of human genetics and evolution. But consider what Michael Dickinson, a scientist (he calls himself a neuroethologist) at the University of Washington (Seattle), has to say about fruit flies: “I don’t think they’re a simple model of anything. If flies are a great model, they’re a great model for flies.” To me, this is a great insight, for it recognizes that fruit flies (and, frankly, insects in general) are so other than like us that to study them as if they were a model of anything other than themselves, as a model of us, is in a sense not to study them at all. It is rather to look into their compound eyes as if they were mirrors showing our own reflections. It is a form of narcissism, which perhaps contains our own demise.

Our demise, because in continuing to look at nature as being about ourselves we continue the gross error of believing we can manipulate nature, other organisms, the entire world, to our own narrow purposes without consequences. This view turns other organisms into harbingers of Homo sapiens, narrows research to that which will “benefit” mankind, and misses the very strangeness of life in all its diversity and complexity. It continues the age-old world view of human dominion and fails to recognize that our “dominion” is neither a biological necessity nor a feature of the natural world. Dominion is a dangerous form of narcissism which a mature scientific age should discard.

Marriage vs. Mating

Yet Another Just-So Story

What is marriage? Ask an American of strong religious beliefs, and he is likely to say that it is a union between one man and one woman sanctioned by God. Ask more secular individuals, and they are likely to say that it is a civil contract between two individuals, committed to each other by love, but of practical importance in terms of legal and tax benefits, etc. Ask some biologists, and they will say that monogamous marriage is an evolutionary adaptation that increased the survival rate of helpless human infants, guaranteed to the father that the children produced by his wife were indeed his, and/or facilitated the development of human intelligence—or whatever, as long as the explanation can be stated in terms of natural selection. So at least is the impression one receives from a recent article in the New York Times (titled, somewhat misleadingly, since polygamy is discussed, “Monogamy’s Boost to Human Evolution”—but at least the title does neatly summarize the bias).

Ask an historian, a sociologist, or an anthropologist, and any one of them is likely to say that marriage practices vary over time and among cultures, from polygamy to monogamy, and they are also likely to mention that they vary by class. In warrior societies, marriage ranged from polygamy among the warrior elite (including kings and nobility, whose vocation was warfare, and who could have many wives as well as concubines) to monogamy among the commoners; polygamy is common in societies in which there is a high mortality rate among young men (war, hunting mishaps, etc.), whereas monogamy is more common in societies in which the balance of adult males to females is more even, as well as in more egalitarian societies. Generally speaking, marriages were contracted for social purposes: to cement alliances, to protect inherited property, or to synchronize labor.

Marrying for love is a rather recent innovation and is characteristic of modern individualistic (and capitalist) countries, although monogamy has long been legitimized by Christianity, in part because of its dread of sexual license. Some people get around the stricture by maintaining separate, unofficial relationships, for example Charles Lindbergh, who had children in long-term relationships with three women other than his wife. Contemporary Americans seem to be practicing serial monogamy (divorce and remarriage) as well as unofficial and often temporary arrangements. In all cases, there has always been a whole lot of cheatin’ goin’ on. Then there is the added element of prostitution, including streetwalkers and courtesans, for which even the cleverest evolutionary biologist would have a hard time providing an evolutionary explanation. All of which suggests that marriage is different from mating. The latter is strictly biological—up until very recent times, there has been only one way to produce children, the sexual congress of a fertile man with a fertile woman, and this one way is unaffected by social customs. That is, socially sanctioned monogamy does not prevent either partner from producing a child with a person other than his/her spouse; eggs and sperm recognize no such boundaries.
It therefore seems both pointless and fruitless to try to concoct explanations for marriage customs and practices from natural selection. At some unknown point in the remote human past, people began creating nonbiological ways of organizing their lives. It’s what our big brains allow us to do. Mating may be in our DNA; marriage, however, is not.

Apart from the waste of time and grant money entailed in the pursuit of these evolutionary Just-So stories, the misguided notion, bordering on an ideology, that everything humans do can be explained solely in biological evolutionary terms, by a module in the brain, by DNA (i.e., instinct), denigrates other modes of knowledge that actually produce better explanations. We can learn more about marriage from historians and anthropologists than we can from biologists.

Why Determinism?

The eternal debate between determinism and free will has lately taken a new form. Determinism has been reincarnated in the shape of neuroscience, with attendant metaphors of computers, chemistry, machines, and Darwinism. Meanwhile, defenders of free will seem to have run out of arguments, particularly since, if they wish to be taken seriously, they dare not resort to a religious argument. That the debate is virtually eternal suggests that it is not finally resolvable; it could be said in fact that the two sides are arguing about different things, even though they often use the same terminology.

Determinism’s popularity is most clearly suggested by the sales figures for books on the subject and by the dominance of the view in popular science writing. Such books are widely reviewed, while those arguing for free will are neglected, especially by the mainstream press.

The question, then, is not whether we have free will, or whether we are wholly determined in all our thoughts and actions, but rather why, at this point in time and particularly in this country, determinism is so much more popular than free will.

Today’s determinism is not the same as the ancient concept of fate. Fatalism was not so much about determinism or, as the Calvinists posited, predestination; fatalism did not pretend to know what would happen, but rather held that fate was a matter of unpredictability, of whim (on the part of the universe or of the gods, etc.), and in fact left some room for free will, in a what-will-be-will-be sort of way; i.e., because outcomes were unpredictable, one had to choose, one had to act, and let the dice fall where they may. The tragic flaw of hubris is exactly what is wrong with any determinism: the delusion that one can stop the wheel of fate from turning past its apex, i.e., that through prediction one can control.

Determinists worship predictability and control. I once read somewhere the idea that, if everything that has already happened were known, everything that will happen could be accurately predicted. Extreme as this statement is, it accurately summarizes the mindset of the determinists. It also suggests why determinism is so attractive in a scientific age such as ours, for science is not only about the gathering of facts and the formulation of theories but also about using those theories to make predictions.

Given the apparent power of science to predict accurately, and given that prediction is predicated on a deterministic stance, it is not surprising that scientists should turn their attention to the human condition, nor that scientists, being what they are, tend to look for, and find, evidence that human thoughts and behavior are determined by genes, neurons, modules, adaptations, what have you, and are therefore predictable. Nor is it surprising that, in a restless and rapidly changing world, laymen are attracted to these ideas. Certainty is an antidote to powerlessness.

If we are religiously minded, we find certainty in religion; hence the rise of politically and socially powerful fundamentalist movements today. If we are not religious, we may find certainty in New Age nostrums, ideologies, art, bottom lines, celebrity worship, or even skepticism (no one is more certain of his or her own wisdom than the skeptic). If we are politicians, we look for certainty and security in megabytes of data. If we are scientifically minded, we find certainty in science. But certainty is not science. It is a common psychological need in an age of uncertainty.

In satisfying this need for certainty, determinism often leads to excessive self-confidence and egotism—which in turn leads to simplifications and dismissal of complexity, ambivalence, and randomness. Determinism is teleology. Today’s determinists may have discarded God, but they still believe that He does not play dice. They are, in short, utopians. We all know where utopias end up. That much at least we can confidently predict.