
“The Elephant in the Brain”: A Review

One thing I’ve learned from years of wide reading is that every text, whether fiction or nonfiction, article or book, consists of two levels: the subject matter and the agenda. The subject matter is, basically, the topic or explicit contents of the text (e.g., the presidency of Andrew Jackson), and the agenda is the real point (sometimes stated, often hidden) of the text (Jackson prefigures the populist nationalism of Donald Trump, with likely similar results). The “elephant in the brain” that constitutes the subject matter of this book is the largely unconscious work we do to deny (to ourselves and others) the hidden motives that drive our behaviors in our everyday lives, how in fact we “accentuate our higher, purer motives” over our selfish ones. Kevin Simler and Robin Hanson describe the ways in which a number of these motives (competition, status seeking, self-protection, self-esteem, etc.) are expressed in various contexts, such as conversations, consumption, art, education, religion, and so forth.

They are, however, far from the first to explicate such foibles of the human psyche. A vast literature, from ancient times to modern, has already explored the same phenomena—one thinks of moments in the Old Testament (“the heart is deceitful above all things”), of Sir Walter Scott (“Oh what a tangled web we weave/When first we practice to deceive”), of Erasmus’s “In Praise of Folly” (“No man is wise at all times, or without his blind side.”), of Charles Mackay’s “Extraordinary Popular Delusions and the Madness of Crowds,” or of just about anything by Freud. Nor should we exclude Thorstein Veblen’s notion of “conspicuous consumption” from this short exemplary list, as his theory anticipates Simler and Hanson’s chapter on consumption.

That Simler and Hanson have little new to say about our propensity for self-deception does not mean they can’t occasionally be interesting and even fun. Their chapter on medicine makes an excellent point: that much (too much) of our consumption of health care exceeds our need for it; that such consumption amounts to “conspicuous caring,” a kind of status seeking or signaling that is analogous to Veblen’s conspicuous consumption, with the same kind of wastefulness of resources. Their chapters on education and charity demonstrate that, as they put it, “many of our most cherished institutions—charities, corporations, hospitals, universities—serve covert agendas alongside their official ones. Because of this, we must take covert agendas into account when thinking about institutions, or risk radically misunderstanding them.”

I can think of an example: at my state university, the football and basketball head coaches make $2,475,000 and $2,200,000 in base salary, respectively, while their putative boss, the university president, makes a base salary of $800,000. The university’s mission statement makes no mention of either the football or the basketball program; instead it brags about research, learning, career success—the usual academic checkpoints. Yet money speaks louder than words.

But otherwise the authors have little new to say about our hidden motives, so why have they written this book, and why has it been received so positively by reviewers and readers? Because it fits smoothly into our contemporary Zeitgeist, in which everything is explained (or re-explained) in terms of 1) digital technologies and 2) evolutionary theory (especially “fitness”). It’s as if these two paradigms supply us with the long-sought theory of everything and thereby relegate all that came before to the dustbin of error and superstition; consequently, everything, including our propensity for denying our own motives, must be explained as if it had never been explained before, as if for the very first time in all of human history and thought we have identified the sources of all of our traits and quirks.

Now, while the hyper-reductionist worldview of the digirati is not directly mentioned in the book, nor explicitly appropriated as a supporting argument, it is nonetheless fundamental to the thesis of its authors, both of whom are full-fledged members of the geek collective: Hanson is an economist with a degree in physics, a devotee of AI and robotics, and the author of “The Age of Em: Work, Love, and Life When Robots Rule the Earth” (2016); he has arranged to have his brain cryogenically frozen, perhaps in hopes that in the not-so-distant future it will be one of the brains (the brain?) downloaded into all those em robots that will soon take over the world. His co-author Kevin Simler (shown in his blog photo as a typical Silicon Valley boy in tee shirt, hoodie, and little wire-rimmed glasses) has studied philosophy and computer science (not as contradictory a pairing as one might assume) and has had an extensive career with start-ups. Both separately and in collusion, they view the world through algorithm-tinted glasses, and certain background assumptions, while unspoken in this book, nevertheless shape their conclusions.

One of the most important of these is the notion that the brain is a computer that runs on programs and apps (“modules,” etc.)—consciousness is the screen behind which these programs are silently running without our awareness, invisible but determinative. Which brings us to the second of their paradigms, evolution—especially the notion that unconscious instincts, running determinatively behind our conscious minds, are not only sufficient to explain all of human activity but have only one goal, to win the mating game. And while they may be expert in digitization, they are rank amateurs when dealing with supposedly Darwinian explanations of human behavior (neither has a background in neuroscience, evolutionary biology, or anthropology, the fields most relevant to their claims).

This lack is most evident in their chapter on art, which they claim is nothing more than a means of signaling fitness: the production of art signals to prospective mates that the artist has vigor and strength to waste on nonproductive or impractical activities (“survival surplus”). They provide only scenarios (made-up illustrative fairy tales), tired analogies (bowerbirds), and modals (words such as “may” or “may have,” as in “Art may have arisen, originally, as a byproduct of other adaptations.” Note the provisional quality here: it “may have” arisen [but maybe it didn’t], and “originally” [but maybe now it’s something else?]). We have almost no evidence of how or why human beings first began making art; there is no evidence that the making of art has enhanced the reproductive success of any artist; indeed, many of our most famous artists had no offspring at all. Maybe some of them had more sex than average, but evolutionarily that doesn’t count if it resulted in no children (and grandchildren, etc.; Shakespeare had children, but he has no living descendants today).

It is true that art, or at least the collecting of art, can enhance a person’s social status, but it’s interesting to note that enhanced social status does not necessarily result in more descendants. Indeed, at least in our current world, poorer people tend to have more children than the rich, despite the fact that the poor do not have the wherewithal to purchase prestige works of art.

There is also the problem of ethnocentrism (including what we might call present-centrism): “One study, for example, found that consumers appreciate the same artwork less when they’re told it was made by multiple artists instead of a single artist . . .” (p. 194). These few words reveal a) that neither author has the faintest acquaintance with the history of art, Western or world; b) that they are unaware that the idea of the individual artist familiar to us today is a historically recent development that even in the West did not exist before the Renaissance; c) that they are unaware that even in the Renaissance, the great artists we revere today did not do all the work on a canvas (for example) themselves but rather headed studios in which much of the grunt work (backgrounds, preliminary work, etc.) was done by apprentices (a modern example, of course, is Andy Warhol’s Factory); and d) that they are also obviously unaware that in many cultures (including traditional China) originality was not valued. Simler and Hanson’s theory ignores the vast majority of the world’s artists and arts and is therefore without merit.

Frankly, I don’t believe that the real issue for Simler and Hanson is evolutionary fitness; rather, it’s their bias toward the “practical” and against the “impractical” that’s at play here, as demonstrated by their chapter on education. Consider this passage: “In college we find a similar intolerance for impractical subjects. For example, more than 35 percent of college students major in subjects whose direct application is rare after school: communications, English, liberal arts, interdisciplinary studies, history, psychology, social sciences, and the visual and performing arts” (p. 228). Say what? Thirty-five percent of all students is a sizable chunk and hardly indicative of an “intolerance for impractical subjects.” And given their earlier argument that art enhances the fitness of its practitioners, one wonders why it’s included on this list of impractical subjects. Such an egregious failure to notice the illogic of their own arguments suggests that the authors have an unacknowledged agenda.

It’s an agenda that the authors themselves may not be fully aware of: by reducing the human mind to nothing more than the usually unconscious expression of instincts, they can convince themselves and their naïve readers that the mind can be reduced to a network of programs and applications. Thus they can justify the fantasy that AI can duplicate and eventually replace the human mind altogether; they can envision a future in which minds (particularly their minds) can be uploaded to a computer or to multiple robot machines and thereby defy mortality (the law of entropy) and achieve that Faustian dream (nightmare?) of complete power and knowledge—at least for them, not for the masses. This is nothing but digital-age Social Darwinism. But as Jaron Lanier recently wrote, “Every time you believe in A.I. you are reducing your belief in human agency and value.”


Evolutionary Just-So Story, Again!

So yet again we have a story of evolution that seems to say that evolution works like God, i.e., that it indulges in design. I am referring to an article recently published in the New York Times reporting on research into why the squid lost its shell. The phrasing of the article will, in the minds of the naive, create the impression that the squid lost its shell in order to move faster to escape its predators (shells being rather heavy and cumbersome). “The evolutionary pressures favored being nimble over being armored, and cephalopods started to lose their shells.” This seems to be an innocent enough statement, but its construction implies that the pressure to become nimble preceded and caused the loss of the shells.

That is design. It may not be God design, though one could easily make that leap, but it is design nonetheless.

Oh, if only they would read Lucretius!

Here’s what really happened: originally, “squids” were shelled creatures; generation after generation was shelled. Occasionally, a genetic mutation or defect (call it what you will) resulted in progeny lacking shells. No doubt most of these shell-less individuals quickly died or were eaten and left no progeny; but at some point some of them survived (perhaps thanks to another mutation that enabled them to move more quickly than their shelled relatives) and reproduced, eventually giving rise to a new class of creatures: squids, octopuses, and their kin. In other words, the change occurred first, without intention or purpose, and the benefit followed. The change did not occur in order to confer the benefit. It just happened.

Of course, such changes often occur gradually, say by shrinking the shell over many generations, in what some have called “path dependency” (i.e., evolution follows an already established path and does not go backwards, in other words it doesn’t restore the shell to creatures who have lost it). But the principle remains the same: first the change, and then, if it happens to have an advantage, it sticks.

As Lucretius said, humans did not develop opposable thumbs in order to grasp tools; we can grasp tools because we have opposable thumbs.

Death of a Bug

The other day I squashed a bug. It was quite small, rather rounded in shape, and making its way slowly across the surface of my nightstand. I am usually not insecticidal, but having a bug of any conformation so proximate to my bed brings out my squeamishness. And recently my condo association had sent out a newsletter with an article about bedbugs. This was probably not a bedbug, but nonetheless, it had to die.

I regretted my brutality immediately. The poor thing had as much right to its life as I have to mine. In the great scheme of things, the life of a human is really of no more importance than the life of any other creature. We got here through the same process of evolution as they did, and since I do not subscribe to any form of teleology, I do not consider Homo sapiens to be any more perfect, or any more the apex and fulfillment of some great cosmic plan, than that poor bug and his cohorts. It is the attitude that we do count for more that has led to so much environmental destruction and so much cruelty, not only to other animals but also to other people. For as eugenics exposed, the idea that humans are the perfection of evolution leads all too easily to the notion that my humans, the people of my group, are more fully perfect than yours. Hence, genocide.

It is therefore not surprising that good souls who reject cruelty to other people also reject cruelty to animals; nor is it surprising what psychologists tell us of serial killers: that they tortured and killed animals in their childhoods. Many children, especially boys, do mistreat animals, at least of the insect kind (remember watching ants burst into flame under the magnifying glass?), but most children, even boys, soon outgrow that tendency. Serial killers apparently do not, which suggests that there is an element of immaturity, even of that primitivism that can be both so charming and occasionally so alarming in children, in the serial killer’s makeup—something having to do with the child’s sense of himself or herself as the center of the world, the world being that which was designed for one’s gratification.

There are other ways in which this juvenile belief that the world owes us gratification can be manifest. The despoiling of the natural world for profit, so that we may live in an abundance that exceeds what the world actually can supply to us, fits this bill. We take not only what is our natural due but also that which is the natural due of all the other creatures with which evolution has populated this planet, which is why so many are being driven into extinction (why so many already have been), and why, when we know perfectly well that our “lifestyles” are warming the planet, we continue to pillage as if there were no tomorrow—until one day perhaps there literally will not be.

Perhaps I am making too much of the squashing of a mere bug. I mentioned that we are the product of the same process of evolution that led to all other creatures, and that process is anything but benign. The process of life is the process of death. Virtually everything that lives does so by killing and eating some other living thing. Even a vegan lives by killing carrots and broccoli and mushrooms (do carrots scream in pain and terror when we yank them from the ground?). There is no escape from this round of death and life. The vegan may not eat any animal product, but his or her efforts make little difference in the great scheme of things—there are predators enough to override the effects of the vegan. That is how evolution works its mighty wonders.

Which is why I am not persuaded by those good souls who imagine that we can end suffering and wars and crime and all the other means and ways that we wreak havoc on each other and the world. I am not hopeful that we who live in the so-called developed world will rein in our greed for money and things for the sake of the planet or even for the sake of the starving and terrorized millions of so much of the rest of the world, or even for those who live within our own borders. Like all other creatures, we kill to live. Unlike other creatures, we can overkill. All too often we do, both literally and metaphorically.

That little bug on my nightstand was most likely harmless, at least to me, and maybe it even had some important function in the ecology of my apartment. Or maybe it was just quietly living its own life. I killed it anyway.

See also my “Requiem for a Tree” at this site.

What Is a Species?

That science is a human enterprise and not some pure and perfect object independent of culture is highlighted by a recent investigation into the DNA of American wolves—the gray wolf, the Eastern wolf, and the red wolf. An article in the New York Times (7/27/16) reports that analysis of the DNA of these three wolf species reveals that in fact “there is only one species [of wolf] on the continent: the gray wolf.” The other two are hybrids of coyotes and wolves—Eastern wolves are 50/50, red wolves are 75 percent coyote and 25 percent wolf. The investigators also concluded that the wolf and coyote species shared a common ancestor only 50,000 years ago, which is very recent in evolutionary terms.

Now, anyone comfortable with the fact that nature goes its own way without regard to the human need for tidy intellectual categories is not likely to be much disturbed by these findings. But such people are relatively rare, especially in academic and political circles, and so certain people do find it disturbing that Eastern and red wolves are hybrids. That is, they are not “pure” and therefore may not be entitled to the protection of such laws as the Endangered Species Act against, say, extermination. In a sense, they are not “natural” because—well, because they violate the notion of the purity of species; they don’t fit neatly into our conceptual categories. As one scientist was quoted (in dissent from the worrywarts), “We put things in categories, but it doesn’t work that way in nature.”

Indeed it doesn’t. In fact, it couldn’t. If species really were neatly distinct forms of life, immune to crossings of the so-called “species barrier” (among other common myths of the “logic” of evolution), evolution would grind to a halt. Evolution requires messiness, contingency, happenstance, the unexpected, in order to work. For example, genetic mutations do not magically appear on cue in response to environmental pressures, just in time to save a species from extinction. Instead, a mutation lies quietly in the background, sometimes for many generations, to emerge as the crucial factor of salvation (for those individuals who carry it, and their descendants) when and if a factor in the environment calls it forth.

I am reminded of a startling discovery during the height of the AIDS epidemic in America, that some individuals, despite a particularly risky lifestyle, were immune to the disease. Turns out, they carried a mutation that had first manifested itself centuries earlier, during an epidemic of an entirely different disease, bubonic plague. One could describe how this mutation protects against both diseases, but one could not explain why—why this gene mutation occurred in the first place, why it just happened to confer immunity or resistance to these two quite different diseases (one caused by a bacterium, the other by a retrovirus), and why it resided silently in the genomes of its fortunate carriers for so many generations before it could prove its usefulness.

A fundamental goal of all human endeavors is to reduce the entangled complexities of life, including our own, to a simple set of principles that fit the limitations of the computational power of our little brains, a mere three pounds of meat, of which only a relatively small portion engages in the tasks of reasoning. Not surprisingly, it is difficult to wrap our heads around the genuine complexity of the earth we inhabit, let alone of the cosmos. Being the limited creatures that we are, we need our categories—but let’s not worship them. Let’s not condemn the Eastern wolf and the red wolf to extermination just because they mess up our laws.

Evolution and Theodicy

“Why is there evil in the world?” This question has been asked by philosophers, theologians, and ordinary men and women for millennia. Today scientists, particularly evolutionary biologists, neuroscientists, and evolutionary/neuropsychologists, have joined the effort to explain evil: why do people indulge in violence, cheating, lies, harassment, and so on? There is no need here to itemize all the behaviors that can be labeled evil. What matters is the question “why?”

The question of “why is there evil in the world?” assumes the premise that evil is abnormal while good (however defined) is normal—the abnorm vs. the norm, if you will. Goodness is the natural state of man, the original condition, and evil is something imposed on or inserted into the world from some external, malevolent source. In Genesis, God created the world and pronounced it good; then Adam and Eve succumbed to the temptations of the Serpent and brought evil and therefore death into the world (thus death is a manifestation of evil, immortality the natural state of good). Unfortunately, the Bible does not adequately account for the existence of the Serpent or Satan, so it was left to Milton to fill in the story. Gnostics, Manicheans, and others posited the existence of two deities, one good and the other evil, and constructed a vision of a cosmic struggle between light and darkness that would culminate in the triumph of good—a concept that filtered into Christian eschatology. The fact that Christian tradition sees the end times as a restoration to a state of Adamic or Edenic innocence underscores the notion that goodness is the natural, default state of man and the cosmos.

Contemporary secular culture has not escaped this notion of the primeval innocence of man. It has simply relocated Eden to the African savannah. When mankind was still at the hunter-gatherer stage, so the story goes, people lived in naked or near-naked innocence; they lived in egalitarian peace with their fellows and in harmony with nature. Alas, with the invention of agriculture and the consequent development of cities and civilizations, egalitarianism gave way to greed, social hierarchies, war, imperialism, slavery, patriarchy, all the factors that cause people to engage in violence, oppression, materialism, and so on; further, these faults of civilizations caused the oppressed to engage in violence, theft, slovenliness, and other sins. Laws and punishments and other means of control and suppression were instituted to keep the louts in their place. Many people believe that to restore the lost innocence of our hunter-gatherer origins, we must return to the land, re-engage with nature, adopt a paleo diet, restructure society according to matriarchal and/or socialist principles, and so on. Many people (some the same, some different from the back-to-nature theorists) envision a utopian future in which globalization, or digitization, or general good feeling will restore harmony and peace to the whole world.

Not too surprisingly, many scientists join in this vision of a secular peaceable kingdom. Not a few evolutionary biologists maintain that human beings are evolutionarily adapted to life on the savannah, not to life in massive cities, and that the decline in the health, intelligence, and height of our civilized ancestors can be blamed on the negative effects of a change in diet brought on by agriculture (too much grain, not enough wild meat and less variety of plants) and by the opportunities for diseases of various kinds to colonize human beings too closely crowded together in cities and too readily exposed to exotic pathogens spread along burgeoning trade routes. Crowding and competition lead to violent behaviors as well.

Thus, whether religious or secular, the explanations of evil generally boil down to this: that human beings are by nature good, and that evil is externally imposed on otherwise good people; and that if circumstances could be changed (through education, redistribution of wealth, exercise, diet, early childhood interventions, etc.), our natural goodness would reassert itself. Of course, there are some who believe that evil behavior has a genetic component, that certain mutations or genetic defects are to blame for psychopaths, rapists, and so on, but again these genetic defects are seen as abnormalities that could be managed by various eugenic interventions, from gene or hormone therapies to locking up excessively aggressive males to ensure they don’t breed and pass on their defects to future generations.

Thus it is that in general we are unable to shake off the belief that good is the norm and evil is the abnorm, whether we are religious or secular, scientists or philosophers, creationists or Darwinists. But if we take Darwinism seriously, we have to admit that “evil” is the norm and “good” the abnorm—nature is red in tooth and claw, and all of the evil that men and women do is also found in other organisms; in fact, we can say that the “evil” done by other organisms long precedes the evil that men do, and we can also say, based on archaeological and anthropological evidence, that men have been doing evil since the very beginning of the human line. In other words, there never was an Eden, never a Noble Savage, never a long-ago Golden Age from which we have fallen or declined—nor, therefore, is there any prospect of an imminent or future Utopia or Millennial Kingdom that will restore mankind to its true nature, because there is nothing to restore.

The evolutionary function of “evil” is summarized in the term “natural selection”: the process by which death winnows out the less fit from the chance to reproduce (natural selection works on the average, meaning of course that some who are fit die before they can reproduce and some of the unfit survive long enough to produce some offspring, but on average fitness is favored). Death, usually by violence (eat, and then be eaten), is necessary to the workings of Darwinian evolution. An example: When a lion or pair of lions defeat an older pride lion and take over his pride, they kill the cubs of the defeated male, which has the effect of bringing the lionesses back into heat so that the new males can mate with them and produce their own offspring; their task is then to keep control of the pride long enough for their own cubs to reach reproductive maturity. Among lions, such infanticide raises no moral questions, whereas among humans it does.

There is no problem of evil but rather the problem of good: not why is there “evil” but rather why is there “good”? Why do human beings consider acts like infanticide to be morally evil while lions do not? Why do we have morality at all? I believe that morality is an invention, a creation of human thought, not an instinct. It is one of the most important creations of the human mind, at least as great as the usually cited examples of human creativity (art, literature, science, etc.), if not greater, considering how much harder won it is than those competitors, and how much harder it is to maintain. Because “good” is not natural, it is always vulnerable to being overwhelmed by “evil,” which is natural: peace crumbles into war; restraint gives way to impulse; holism to particularism; agape to narcissism; love to lust; truth to lie; tolerance to hate. War, particularism, narcissism, etc., protect the self of the person and the tribe, one’s own gene pool so to speak, just as the lion kills his competitor’s cubs to ensure the survival of his own. We do not need to think very hard about doing evil; we do need to think hard about what is good and how to do it. It is something that every generation must relearn and rethink, especially in times of great stress.

It appears that we are in such a time today. Various stressors, the economy, the climate, overpopulation and mass migrations, religious conflict amid the dregs of moribund empires, are pushing the relationship of the tribes versus the whole out of balance, and the temptations are to put up walls, dig trenches, draw up battle lines, and find someone other than ourselves to blame for our dilemmas. A war of all against all is not totally out of the question, and it may be that such a war or wars will eventuate in a classic Darwinian victory for one group over another—but history (rather than evolution) tells us that such a victory is often less Darwinian than Pyrrhic.

We Are All Still Animists

“[Children do not] have to be taught to attribute people’s behavior to the mental states they’re in. Children tend, quite naturally, to anthropomorphize whatever moves. What they have to learn is which things don’t have minds, not which things do.”
–Jerry Fodor (“It’s the Thought That Counts,” London Review of Books, November 28, 1996)

Iconoclastic statements have always appealed to me, particularly because they cause me to look at the iconic statements they are set against in a new and critical light. Sometimes the iconic statements survive the scrutiny; oftentimes they don’t. In this case the iconic statement, that children learn that other people have minds of their own (theory of mind) over time, seems commonsensical until it is re-read in light of Fodor’s statement. Then it appears less evidently true.

Look at the first part of Fodor’s statement, that children “quite naturally . . . anthropomorphize whatever moves.” To anthropomorphize is to attribute human characteristics, in particular a mind with such things as motives, desires, feelings, etc., to nonhuman things. But, in my experience, not just to things that move (pets, for example), but also to things that don’t move: Dolls and figurines don’t move, though they look like they could, but small children also attribute feelings to objects that, to an adult, clearly are inanimate, such as blankies and other favored possessions; hence their sense of tragedy when the blankie disappears into the laundry hamper, or the favorite rubber ball deflates.


How We Think About Nature

It is so commonplace to think of nature as that which is free of human presence or interference that few people ever pause to consider how unnatural such a concept is. If human beings and their activities are not natural and do not occur in nature, where do they take place? If the answer is “in cities,” then we can further ask, “Where do cities exist?” And what are cities made of? And so on.

But in truth, human beings are as natural as grizzly bears and dandelions. We are animals; our bodies are composed of the same stuff as the bodies of all other mammals, of all other vertebrates as well, and at the microscopic level our cells are very like any other living cells. We reproduce as other animals do, we have DNA just as they do, and our brains, while noticeably more complex and capable than those of other animals, are otherwise pretty much the same as theirs. Like other animals, we must eat, breathe, and drink; we seek and/or build shelters; like beavers, we divert watercourses to our own benefit; and like many if not most creatures, we create niches conducive to our well-being.

It may also be said that we do not do anything that other animals don’t do, although we may do those things on a far greater scale. For example, we occasionally have carried companion organisms, whether deliberately as domesticated or useful, or inadvertently as parasites, to environments where they have not been present before and where they prosper in the absence of their traditional enemies: pigs, rats, and cats on islands, buffel grass in the American Southwest and Mexico, roses in South Africa, rabbits in Australia, foxes in New Zealand—there is a very long list. Those of us who are concerned about such things (apparently, not everyone is) call these creatures “invasive species” and would like to eliminate them from their colonial possessions. I agree: it is awful that Guam no longer has any native birds because of the introduction of the brown tree snake, and terrible that rabbits and mice wreak such destruction in Australia—an object lesson in how tragic it can be for a species to be without its predators, even for that species itself.

On the other hand, few of us would consider wheat an “invasive” species, yet with our help it has invaded every habitable continent and has taken over much of the American landscape from the native grasses; tomatoes, potatoes, and maize have spread from the Americas to the rest of the world, taking over vast tracts of land. But because we consider these species to be our allies, we do not call them invasive.

While island ecosystems can be disrupted by the introduction of new species, it is worth remembering that island ecosystems would not exist in the first place if islands were not invaded by organisms that had not previously existed there. When a new island forms, for example from a volcanic eruption from the ocean’s depths, there is no life on it, yet a few hundred or thousand years later it will be as verdant as the islands of Hawaii: green with flowering trees and shrubs, busy with the doings of birds and insects—all of which are, in a sense, invasive, their ancestors blown there by storms and winds or carried there on rafts of driftwood and debris. Organisms, even nonmobile ones like plants, do not stay in place—they wander, they spread, they invade, they take over, they flee, they die out, creating new species and new wonders in the process—a very long process, generally speaking. Surely some birds, blown off course to a less than ideal island in a storm, have carried the seeds of some mainland plant in their digestive systems, seeds which, whether or not the bird survived, managed to sprout and struggle and propagate, just as the seeds of exotic plants have ridden on human vessels and found themselves a hospitable new home. Kudzu, for example (intended), or Russian thistle (unintended).

What, then, is the difference between a seed carried in the gut of a bird and a seed carried in the pocket of a farmer, or an animal floating in on a raft of driftwood and seaweed and an animal floating in on the deck of a Polynesian canoe? They both accomplish the same thing, dispersing organisms to new ecosystems and keeping the evolutionary process churning. And the process of evolution over the billions of years to date has been marked by as much extinction as innovation. Human beings, themselves products of the same processes, are not engaged in an unprecedented activity—though we do seem, especially in the last 500 years or so (at least since 1492), to have accelerated the process to, comparatively, lightning speed. But aside from that, we are not actually doing something unusual, or even unnatural, in the annals of evolutionary time.

The difference is in ourselves, and is, broadly speaking, a moral difference. We are as capable of regret as we are of hope, of looking backward as of looking forward; and while we often plan for the future and work to improve our lot in life, we as often look to the past, itemizing our mistakes along with our triumphs. We can regret the passing of the dodo or the passenger pigeon, although none of us living has ever seen either; we can regret in foresight the impending extinction of the monarch butterfly or the African elephant—some of us can. Perhaps the moral sense arises from this ability to anticipate and retrospect, rather than (as some evolutionary psychologists are unduly prone to believe) from a moral molecule or altruistic gene. It is unlikely that our concern for the fate of other creatures is, or is entirely, out of concern for our own survival; we may have to adjust to a changing climate and a less “natural” world, but adjustment is not the same as extinction. From a practical point of view, i.e., from the view of human material needs, we probably have less to fear than the prophets would lead us to believe—that is, if we learn to live without greed, that parasite that makes us want more than we need.

My concern, at least, is not with human survival, but rather with the survival of the many other creatures who also live on this planet, and insofar as aesthetics is a component of a moral vision, with the survival of the beautiful—I cannot see that human life is worthwhile without the beautiful. In one of his essays, Montaigne opined that voluptuousness is the equivalent of penitence; in religious terms, sin is its own punishment. One can also say that greed is its own punishment, for it destroys its object without gaining satisfaction. Of all the creatures on this earth, only human beings are greedy. Perhaps that is what makes us unnatural.

Nicholas Wade’s Troublesome Inheritance: A Critical Review

In his latest book, Nicholas Wade, a well-known science journalist, argues three points: 1) that human races are real; 2) that differences in human behavior, and likely cognition, are genetically based; and 3) that there are likely subtle but nonetheless crucial behavioral differences among races, which are also genetically based. Wade is well aware that these are extremely controversial ideas, that they overturn politically correct notions that human behavior and social structures are purely cultural, yet he is confident that developments in genetics support his view.

Click here to read the full article.

The Mismeasure of All Things

Some 2500 years ago, Protagoras said that man is the measure of all things. By this he meant something like, mankind can know only that which it is capable of knowing, which in effect is a recognition that the human mind does have its limits; but Protagoras’ statement has often been taken to mean that man is the standard by which all other things are to be measured, i.e., that mankind is the standard of comparison for judging the worth of everything else. This meaning may have been colored by the Christian concept of man as the object of divine history, of man as just a little lower than the angels. The Christian concept, in its turn, derives from a common interpretation of the creation story in Genesis, in which God gives man dominion over the rest of earthly creation.

However, while both Protagoras’ saying and the Genesis story carry the concept forward through history, neither explains how the idea actually originated. It may have been Giambattista Vico (1668-1744) who first recognized that it is ignorance rather than knowledge that makes man the measure of all things: “When men are ignorant of natural causes producing things, and cannot even explain them by analogy with similar things, they attribute their own nature to them.” That is, when primitive men and women surveyed the world and sought explanations of phenomena, they had nothing to go by other than what they knew about themselves, so that, for example, a terrible destructive storm could be explained as the anger of the gods, since when human beings became angry they too engaged in destructive behavior; or when a gentle rain caused plants to grow, the gods were in a good mood, perhaps pleased by some human act of worship, because when humans were in a good mood, they engaged in benevolent acts. After all, the earliest humans could not have had any knowledge of the material causes of storms, droughts, etc., nor of course of animal behavior, which they attributed to motives much like their own. As Stephen Toulmin and June Goodfield summarize Vico’s views, in primitive mythologies people “could measure the world of Nature only by that which they already knew—namely themselves” (The Discovery of Time).

Both Protagoras and Genesis simply give more sophisticated glosses on this primitive impulse. They reflect the increasing body and complexity of knowledge developed by ancient civilizations, particularly those that had developed writing systems, which in turn enabled them to impose order on what had been a plethora of local myths and their variants. Simply by creating relatively coherent pantheons containing gods with discrete attributes, roles, and positions in a divine hierarchy, ancient civilizations were able to organize their intellectual world and provide authoritative explanations. Monotheism carried this further, providing an even more unified world view, but it also somewhat depersonalized the concept of God, making him more abstract and remote (e.g., no images or idols, no household god or genie of the local spring, etc.). This was an important achievement in the ongoing development of knowledge, a necessary step in the process that led to the state of knowledge we enjoy today, in large part because it put more emphasis on cerebral, intellectual modes of understanding than on personal, experiential ones—in a sense, creating theory to replace myth. Thus we see the Greek philosophers creating the first science and the Jews creating the first inklings of theology and, importantly, teleology (a sense of history with a goal towards which it was moving). Nevertheless, the Judeo-Christian god retained strong anthropomorphic features, especially in the popular imagination and in the visual arts, in which, for example, God the Father was usually depicted as a white-haired old man. Perhaps as long as most people were illiterate and dependent on visual media for their abstract knowledge, anthropomorphism was to be expected.

The Western European, Christian intellectual (literate) tradition combined these two strands of ancient thought, the scientific/philosophical with the historic/teleological, setting the stage for a modern world view that sees the world as making coherent sense and as operating according to consistent, universal laws, which then can be exploited by human beings for their own betterment. As scientific knowledge expanded and material explanations could be provided for phenomena that once were viewed as signs of divine intervention, God receded to the back of men’s minds as less necessary to explain the world—at best, perhaps, He became little more than the Prime Mover, the one who got it all started or the one who established the universal laws which continue to operate without His immediate intervention. But if the Age of Reason or the Enlightenment put God into retirement, it did not give up the belief in coherent laws and the quest for universal theories, nor did it give up the teleological view of history.

It is important to note that the teleological view is always a human-centered view; history, whether of cosmos, nature, or society, was still about man; very few thinkers hazarded to speculate that man might be merely one among many creatures and phenomena rather than the point of the whole enterprise. In this sense, at least, the early modern era retained the primitive impulse to both anthropomorphism and anthropocentrism. The widespread acceptance of Darwin’s theory of evolution by means of natural selection did little, indeed perhaps nothing, to change that for most people. It was not difficult to switch from believing that God had created man for dominion over nature and as the center of the historical story of fall and redemption, to believing that evolution is teleological, both in the sense of inevitably leading to the emergence of homo sapiens as the crowning outcome of the evolutionary process and in the sense of evolution as a progressive process. And it was easy enough, in the context of nineteenth-century capitalism, to believe that modern industrial culture was the natural continuation of progressive evolution—indeed was its goal.

It took a generation or more for it to dawn on people that Darwinism, along with the geological discoveries regarding the great age of the earth and the astronomers’ and physicists’ discoveries of the even greater age of the universe, implied there is no god at all, not even the reticent god of the Deists. One would think that once this implication struck home, both the teleological and the anthropocentric views would fade away. But, perhaps due to human vanity, neither has done so.

In a supremely ironic twist, both teleology and anthropocentrism have been inverted. Whereas the theological age measured other creatures in human terms, the evolutionary age measures humans in animal terms. We are no longer a little lower than the angels but only a little bit higher than the other animals—or maybe not even that. We are naked apes, talking apes, singing apes. We are like social insects, we are vertebrates, we are aggressive because we are animals seeking to maximize our survival, we are merely transportation for the real biological players, selfish genes. We are not rational or conscious, we do not have free will, we operate by instinct, each of our seemingly advanced traits is hard-wired. Our morality is nothing more than an adaptation. We take a word like altruism, which originally meant a certain kind of human behavior, apply it to ants, where it becomes a description of instinctive eusocial behavior, and then re-apply that meaning back onto humans, thus making us just like all the other animals. Therefore, we study them in order to understand ourselves. We focus on the similarities (often slim) and ignore the differences (often radical).

This continues the old habit of anthropomorphism in new guise and fails to recognize the independent existence of other creatures—their independent lines of evolution as well as their ontological separateness from us. We unthinkingly repeat that humans and chimps share 96 percent of their genes (or is it 98 percent?), as if that meant something—but then, it’s said we share 97 percent of our genes with rats. We neglect to mention that apes and humans diverged from each other some 7 to 8 million years ago and have followed independent lines of evolution ever since. We are not apes after all.

Consider the fruit fly, that ubiquitous laboratory subject which has yielded so much knowledge of how genes work. It is often cited as a model of human genetics and evolution. But consider what Michael Dickinson, a scientist (he calls himself a neuroethologist) at the University of Washington (Seattle), has to say about fruit flies: “I don’t think they’re a simple model of anything. If flies are a great model, they’re a great model for flies.” To me, this is a great insight, for it recognizes that fruit flies (and, frankly, insects in general) are so unlike us that to study them as if they were a model of anything other than themselves—as a model of us—is in a sense not to study them at all. It is rather to look into their compound eyes as if they were mirrors showing our own reflections. It is a form of narcissism, one which perhaps contains our own demise.

Our demise, because in continuing to look at nature as being about ourselves we continue the gross error of believing we can manipulate nature, other organisms, the entire world, to our own narrow purposes without consequences. It turns other organisms into stand-ins for homo sapiens, narrows research to that which will “benefit” mankind, and misses the very strangeness of life in all its diversity and complexity. It continues the age-old world view of human dominion and fails to recognize that our “dominion” is neither a biological necessity nor a feature of the natural world. Dominion is a dangerous form of narcissism which a maturely scientific age should discard.

Against System

As many people know, the world’s domestic honeybees are seriously threatened by a condition called colony collapse disorder, in which once healthy colonies suddenly disappear altogether, virtually overnight. Many possible causes are cited, including mites, malnutrition, viral diseases, pesticides, and so forth, and it is routine to call for “more research.” Don’t ya just love it? Whenever we don’t want to face the truth, we call for “more research.” Yet the film demonstrates that more research is not needed. The reason for colony collapse is crystal clear: we have turned living organisms, the bees, into industrial units, mere nuts and bolts moving along a factory line, without regard to the fact that they are living organisms—much as we have done with other animals, such as cattle, chickens, hogs, and fish.

Click here to read entire article