Category Archives: My Topics

Robotics and Immigration

One of our most cherished myths is that America is a land of immigrants. In point of fact, we are—many millions of people migrated from the “Old World” to the “New” in the centuries following the first voyage of Columbus; what is now the United States became a favored destination of people from the British Isles, from Northern Europe (especially Germany), and later from Eastern and Southern Europe. These are the facts, the statistics. The mythic element, however, tells a story of people seeking freedom of various kinds—religious freedom, freedom from ethnic oppression, freedom from monarchs and oppressive class systems. No doubt these constituted the personal motives of many of the immigrants themselves.

Liberal and progressive arguments favoring continued unimpeded immigration are often couched in moral and mythic terms: that we have always been a nation of immigrants and should therefore continue to be (but: one definition of insanity is to repeat the same action over and over again despite not getting the hoped-for results); that we should forever continue to welcome “the huddled masses yearning to be free.”

What is not often considered are the motives of those already here (and of some who never set foot on American soil) in encouraging and enabling these mass migrations. From the very beginning, those motives have been all about profit: more specifically, about cheap labor as a means of exploiting the resources of this so-called “virgin land,” resources such as lumber, furs, gold and silver, and most especially agricultural commodities: tobacco, sugar (mostly on the Caribbean Islands), cotton, and wheat. With a few exceptions, early British colonies were chartered by London investors and were stocked with men and women from the desperate and criminal classes (people whom the British authorities were glad to be rid of), many of whom died shortly after arrival. As the colonies took hold, increasing numbers of the poor, the indebted, the jobless without prospects, the desperate, came here as indentured servants. Indentured servitude was little better than slavery: many died before their term of service was up, and others were cheated of their promised rewards. Then there was slavery itself, which brought millions of Africans here (and elsewhere in the New World) as chattel labor, valuable not only for free labor but as commodities in themselves.

Later, as the industrial revolution took hold, millions of Europe’s impoverished were allowed in to supply the labor for the factories as well as for the piecework that still occurred in crowded tenements and hazardous sweatshops (as exemplified by the Triangle Shirtwaist Factory fire of 1911). Cheap labor was also obtained from China, particularly in the building of the transcontinental railroads; migrant (usually Mexican) labor is still crucial for harvesting fruits and vegetables.

This thumbnail sketch illustrates that cheap labor is the primary reason that business and political leaders have favored mass immigration, and why most business leaders still do. Economists today argue that we need to continue mass immigration (despite the fact that we already have a population of over 320 million people) because an aging population needs an influx of young workers to support (through taxes) the retired elderly—though how immigrants who live below the poverty line and, if legal, receive more in government benefits than they will ever pay in taxes, could perform that function is never explained. And this despite the fact that behind the sunny employment figures of recent weeks are the huge numbers of potential workers who have given up looking for a job and who are therefore no longer counted as “unemployed.”

Now comes another reason why mass immigration may no longer be a good thing: Artificial Intelligence (AI) is rendering many jobs, especially those traditionally occupied by the less skilled worker, obsolete. Factories now use more robotics than human beings and will do so even more as time goes on; many lower-skilled white-collar jobs are being replaced by digital substitutes; retailing jobs are disappearing as more and more consumers purchase goods online (as is illustrated by the emptying out of shopping malls and the closure of brick-and-mortar department stores). In other words, in the near future (if not already), our economy will require far fewer human workers per unit of output than it once did, and therefore demand for human labor (with certain exceptions) will drop considerably. Starkly put, we will not need the labor of our current population, let alone the labor of new immigrants.

What we will need instead is a new way of distributing the wealth that AI will generate. While it is too early in this transformation to specify how the new wealth should be distributed, it is time to begin considering the problem. The gains from AI are now accruing to corporations in the form of profits, and to the managerial classes who run the companies and make the big decisions as to how AI will be used. Yet again, the people with no voice in the process are the working classes (including the middle class). In fact, the political system is set up in such a way as to divide and conquer working people (e.g., the breaking up of unions and collective bargaining, the federal trade agreements that ignore the consequences to working people, etc.). The election of Donald Trump, who defeated all the establishment contenders of the Republican party before going on to (barely) defeat Hillary Clinton, is symptomatic of the anger of many citizens—that anger is likely to grow as the very rich get richer and the rest get much poorer and more desperate.


“The Evangelicals” and the Genealogy of Ideas

Frances FitzGerald’s new book The Evangelicals: The Struggle to Shape America is explicitly an exhaustive history of the “evangelical” movement in the United States, from its earliest manifestations in the first Great Awakening of the 18th century to the present day. It is a story of rise and fall, ebb and flow, of charismatic leaders and thoughtful academics, of politics and class, materialism vs. the spiritual life (however that may be defined).

More interesting to me, however, is the insight implied above by putting the term evangelical in quotation marks, for it indicates the shifting definitions of what it means to be evangelical and the way that movements, whether intellectual, religious, political or artistic, develop. I can use a metaphor to express this idea: “the genealogy of ideas.” Like an organism, a movement or ideology is the descendant of a multitude of ancestral ideas or sources, and just as the genetics of an organism are shaped by the epigenetics of its environment, ideologies are shaped by the circumstances of their times.

In the case of evangelicalism, as FitzGerald’s chronicle demonstrates, the multitude of sources include Wesleyanism, Calvinism, pentecostalism (both lower case and upper case), German higher criticism (more in reaction than acceptance), capitalism, Protestantism in general (especially individual conscience and reading the Bible for oneself), subjectivism, and so forth, mixed together and quoted more or less as the individual leader or thinker is inclined. Not to forget inerrancy, premillennialism, postmillennialism, dispensationalism, etc. Added to the mix more recently are pop culture notions, exemplified by the trend to self-help books in both secular and religious literature, as well as those twins, the prosperity gospel and the law of attraction. Oftentimes, the thinkers themselves have no idea where their ideas originate, have in other words no awareness of the genealogy of those ideas. And when they do, they (like all of us) pick and choose those sources and quotations which are most compatible with their presuppositions. Or their personal frustrations. Or their ambitions. (I may be mostly of peasant heritage, but I would like to point out that one of my ancestors was the bastard son of a 14th-century king of France—so the throne is mine!)

Perhaps you’ve seen those television commercials for genealogy services in which (supposed) customers begin by stating that they had always assumed they were German or Hispanic or whatever, only to discover, upon researching their family trees or sending in a DNA sample, that they were really Scottish or a mixture of virtually every race on the planet. Americans especially have mixed ancestries, but even elsewhere there are relatively few countries whose populations are genetically homogeneous. Europe, for example, has long been a landscape of migrations, displacements, and mixed heritages (think of the Moorish, Roman, and Germanic influences in Spain, or the Celtic, Anglo-Saxon, and Norman influences in Great Britain).

The point is that most of us don’t know whence our many ancestors hail, and likewise most of us don’t know the sources of our most cherished and taken-for-granted ideas and beliefs. Do your ideas about ethics derive from Kant, Nietzsche, Aquinas, Aristotle, Thoreau? Do your ideas of the Self (your Self) derive from Nietzsche, Emerson, Protestantism, St. Paul, capitalism, Freud, or Montaigne? Or all of the above, and more?

The fact is that despite our efforts to construct coherent and definitive “world views” or philosophies, we always end up with a set of ideas that are mongrels, a bit of everything we have read or been taught, 57 varieties and more, and we in our turn will pass on this mixed DNA to future generations. There are, of course, professionals (theologians, philosophers, political scientists) who spend their entire lives attempting to impose coherence on this mess, who attempt to create thoroughbreds out of the chaotic DNA of thought, but their efforts are doomed: some other professional will soon dissect his predecessors’ magna opera and reveal their inherent weaknesses and impurities in order to assure the ascendancy of his own new, improved, and purified system. And so on.

But: “When it was announced that the library contained all books, the first reaction was unbounded joy.” (Borges, “The Library of Babel”) But the people soon realized that this infinitude of books was not a blessing but a curse—there was no hope of reading everything, so they turned to rifling through the shelves to find the books that vindicated their presuppositions, though they were nowhere to be found. So the people turned against each other, quarreling, fighting, disputing, and eliminating, in the vain hope that their own answers to the mysteries of the universe would prevail, that their own books would be what everyone read. Hence, censorship, orthodoxy, systematizing, anathemas, excommunications, splinterings. But the infinity of books keeps multiplying.

Evolutionary Just-So Story, Again!

So yet again we have a story of evolution that seems to say that evolution works like God, i.e., that it indulges in design. I am referring to an article recently published in the New York Times reporting on research into why the squid lost its shell. The phrasing of the article will, in the minds of the naive, create the impression that the squid lost its shell in order to move faster to escape its predators (shells being rather heavy and cumbersome). “The evolutionary pressures favored being nimble over being armored, and cephalopods started to lose their shells.” This seems to be an innocent enough statement, but its construction implies that the pressure to become nimble preceded and caused the loss of the shells.

That is design. It may not be God design, though one could easily make that leap, but it is design nonetheless.

Oh, if only they would read Lucretius!

Here’s what really happened: Originally, “squids” were shelled creatures; generation after generation was shelled. Occasionally, a genetic mutation or defect (call it what you will) resulted in progeny lacking shells. No doubt, most of these shell-less individuals quickly died or were eaten and left no progeny; but at some point, some of them survived (perhaps thanks to another mutation that enabled them to move more quickly than their shelled relatives) and reproduced, eventually giving rise to a new class of creatures, squids and octopuses, etc. In other words, the change occurred first, without intention or purpose, and the benefit followed. The change did not occur in order to confer the benefit. It just happened.

Of course, such changes often occur gradually, say by shrinking the shell over many generations, in what some have called “path dependency” (i.e., evolution follows an already established path and does not go backwards, in other words it doesn’t restore the shell to creatures who have lost it). But the principle remains the same: first the change, and then, if it happens to have an advantage, it sticks.
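The mutation-first, selection-after logic of the preceding paragraphs can be made concrete with a toy simulation. Every number here (population size, mutation rate, survival odds) is hypothetical, and reducing "shelled vs. shell-less" to a coin-flip trait is a drastic simplification; this is a sketch of the principle, not a model of cephalopod history:

```python
import random

def shell_loss_frequency(generations=200, pop_size=200, mutation_rate=0.01,
                         shelled_survival=0.5, shellless_survival=0.6, seed=1):
    """Toy illustration that mutation is blind: the change (losing the
    shell) happens first, at random; selection acts only afterwards.
    Returns the final fraction of shell-less individuals."""
    rng = random.Random(seed)
    population = [True] * pop_size  # True = shelled, False = shell-less
    for _ in range(generations):
        # Reproduction: each offspring may mutate, with no regard to benefit.
        offspring = [(not parent) if rng.random() < mutation_rate else parent
                     for parent in population]
        # Selection: survival odds differ only once the variant already exists.
        survivors = [ind for ind in offspring
                     if rng.random() < (shelled_survival if ind
                                        else shellless_survival)]
        if not survivors:
            break
        # Refill the population by sampling from the survivors.
        population = [rng.choice(survivors) for _ in range(pop_size)]
    return sum(1 for ind in population if not ind) / len(population)
```

Give the shell-less variant a survival edge and it spreads through the population; flip the advantage and it remains a rare mutant trickle. Either way, the mutation step never “knows” which variant will profit: first the change, then, if it happens to carry an advantage, it sticks.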

As Lucretius said, humans did not develop opposable thumbs in order to grasp tools; we can grasp tools because we have opposable thumbs.

The Liberal Illusion

Nothing focuses the mind like losing, and in this election the Democrats lost not just the presidency but both houses of Congress and the governments of most of the states. That latter fact is important, because even if Hillary Clinton had won the presidency, she would have faced the same obstacles as Obama did during his eight years in the White House. One might wonder, then, if it isn’t better to have a Republican government that at least might do something rather than a split government that can do virtually nothing.

But it is not my purpose here to parse the minutiae of the power plays likely to erupt in Washington nor to dissect the flawed strategies of the Democratic establishment during the campaign. That’s what the media pundits are paid to do. I am more interested in what the triumph of Trump reveals about the true nature of our country (and perhaps of human nature itself) and the illusions with which liberals have been living for the last half century.
As a teenager in the 1960s, I was enamored of the Kennedys and the whole Camelot thing. JFK’s eloquent calls to “ask what you can do for your country” and his declaration that his election signaled the rise of a new generation ready to sweep away the cobwebs of the past were appealing to youthful idealism. So too was Johnson’s Great Society. I’m sure many of my contemporaries shared this attitude of hope for a better present as well as future, and we carried that hope forward to the election of Barack Obama, which was presented to us as symbolic of a post-racial America. In so many ways, Obama (or at least his rhetoric) was the apotheosis of the liberal myth: that the old attitudes, not only toward race but also toward gender, the environment, war, and tribalism, had finally expired.

But as this election has shown, all the old prejudices and worldviews have not disappeared; in fact, around the world they seem to be rearing up again, like weeds that, once chopped down, spring forth again from the roots. Or perhaps the weeds were never chopped down in the first place, but simply obscured from view by the colorful flowers of liberalism, for if we look back at those last fifty years or so, when we all believed that we were progressing to a better world, we see not only all the good fruits but the continued proliferation of the bad seeds. Johnson fought for the Great Society while bombing Vietnam; Nixon succeeded Johnson; Carter was driven from Washington by Reagan; Clinton abandoned the liberal agenda for a more “centrist” politics (“the end of welfare as we know it”); Bush mired us in Iraq and the Middle East and presided over the worst financial disaster since the Great Depression; and Obama, despite his audacity of hope, proved to be an ineffective president, perhaps, as Ta-Nehisi Coates has observed, too cerebral and naïve in his hopes, and what he did accomplish may be rolled back within the first 100 days of the Trump administration.

Trump’s appeal is not entirely to the old prejudices. The economic factor, the many blue-collar workers who have seen their jobs and their hopes plowed under by globalization, automation, and the greed of CEOs, financiers, and shareholders, played an enormous role in his victory. But old prejudices tend to emerge when people feel most vulnerable and displaced from their accustomed worlds. People whose jobs are about to be shipped to Mexico are not as inclined to view Mexican immigrants with favor as those whose jobs are secure and unaffected by outsourcing. The virtues of multiculturalism and diversity are luxuries that the abandoned worker, he or she, may not feel able to afford; the vote count did not break down into the obvious gender disparities because it is, after all, “the economy, stupid” that trumps cultural and social issues. (Frankly, no one has accounted to my satisfaction for the large number of women who, for example, are pro-life—I do not buy the notion that they are acting in bad faith or have internalized male oppression, etc.)

It is forbidden to make comparisons to Germany in the years leading up to Hitler’s rise to power, but I am going to make the comparison anyway—with the caveat that it should not be pressed too far; we are not Germany, of course, and the world situation today is quite different from that of the two world wars, but there is enough consistency in human nature that some lessons can be drawn from a comparison. In the decades after the unification of Germany, Germans had in many ways prospered, and had constructed at least the semblance of a modern and in many ways cosmopolitan culture (much the same can be said of Austria, by the way). One of the effects of this flowering was the assimilation of German Jews. Yet after the defeat of Germany at the end of WWI, Germans looked around for scapegoats and focused on the Jews as the avatar of their defeat and humiliation. It mattered not at all that Jews had contributed so much to the culture of Germany; in fact, those contributions were held against them, as indicative of the extent to which true German (Aryan) culture had been mongrelized by “foreign” and “cosmopolitan” elements. As the economic situation worsened, the polemics against Jews coarsened, culminating in the Holocaust.

Relevant to my thesis is the extent to which the Bildungsbürgertum, the educated classes, both Gentile and Jewish, failed to recognize what was happening to their precious culture and would soon happen more brutally to them. After all, they reasoned, how could the Germany of die Aufklärung (the Enlightenment), of “Schiller, Lessing, Goethe, Kant and Herder” (Bolkosky 8), succumb to the crude blandishments of such a man as Hitler? What they failed to see was that most Germans were unacquainted with all this Kultur, that their apartments and cottages were not lined with books and sheets of classical music. Likewise, I’m afraid that American liberals have mistaken their own culture for the culture of the whole, assuming that the books they treasure are treasured by everyone, that their ideas are obvious to everyone, and that everyone reveres literature and higher education as much as they do—and thus they have been blind to the real culture of the majority of their fellow citizens. They have not seen that their universal values are not universally shared.

Fermenting alongside the progress that we thought was being made were the old prejudices and worldviews that have haunted our history since the very beginning. This is the same country that displaced and slaughtered the Indians, whose economy was founded on the enslavement of millions of Africans, and which has exploited the white working class for, as Nancy Isenberg shows, 400 years. Over the last fifty years we have lurched from left to right, from one pole to the other, from progress to regression—indeed, we have often traveled both rails at the same time. But we liberals seem to have ignored the continued strength of the regressive strain in our politics and culture, dismissing it as inevitably doomed. We have thought that the combination of globalization and technology would erase the differences among people and bring about universal peace, reason, and tolerance. At the same time we have forgotten about the bottom line, and that people without jobs or economic hope will care little for, or will be hostile to, peace, reason, and tolerance.

Bolkosky, Sidney. The Distorted Image: German Jewish Perceptions of Germany, 1918-1935. Elsevier, 1975.

Isenberg, Nancy. White Trash: The 400-Year Untold History of Class in America. Viking, 2016.

Death of a Bug

The other day I squashed a bug. It was quite small, rather rounded in shape, and making its way slowly across the surface of my nightstand. I am usually not insecticidal, but having a bug of any conformation so proximate to my bed brings out my squeamishness. And recently my condo association had sent out a newsletter with an article about bedbugs. This was probably not a bedbug, but nonetheless, it had to die.

I regretted my brutality immediately. The poor thing had as much right to its life as I have to mine. In the great scheme of things, the life of a human is really of no more importance than the life of any other creature. We got here through the same process of evolution as they did, and since I do not subscribe to any form of teleology, I do not consider Homo sapiens to be any more perfect, or any more the apex and fulfillment of some great cosmic plan, than that poor bug and his cohorts. It is the attitude that we do count for more that has led to so much environmental destruction and so much cruelty, not only to other animals but also to other people. For as eugenics exposed, the idea that humans are the perfection of evolution leads all too easily to the notion that my humans, the people of my group, are more fully perfect than yours. Hence, genocide.

It is therefore not surprising that good souls who reject cruelty to other people also reject cruelty to animals; nor is it surprising what psychologists tell us of serial killers, that they tortured and killed animals in their childhoods. Many children, especially boys, do mistreat animals, at least of the insect kind (remember watching ants burst into flame under the magnifying glass?), but most children, even boys, soon outgrow that tendency. Serial killers apparently do not, which suggests that there is an element of immaturity, even of that primitivism that can be both so charming and occasionally so alarming in children, in the serial killer’s makeup. Something having to do with the child’s sense of himself or herself as the center of the world, the world being that which was designed for one’s gratification.

There are other ways in which this juvenile belief that the world owes us gratification can be manifest. The despoiling of the natural world for profit, so that we may live in an abundance that exceeds what the world actually can supply to us, fits this bill. We take not only what is our natural due but also that which is the natural due of all the other creatures with which evolution has populated this planet, which is why so many are being driven into extinction (why so many already have been), and why, when we know perfectly well that our “lifestyles” are warming the planet, we continue to pillage as if there were no tomorrow—until one day perhaps there literally will not be.

Perhaps I am making too much of the squashing of a mere bug. I mentioned that we are the product of the same process of evolution that led to all other creatures, and that process is anything but benign. The process of life is the process of death. Virtually everything that lives does so by killing and eating some other living thing. Even a vegan lives by killing carrots and broccoli and mushrooms (do carrots scream in pain and terror when we yank them from the ground?). There is no escape from this round of death and life. The vegan may not eat any animal product, but his or her efforts make little difference in the great scheme of things—there are predators enough to override the effects of the vegan. That is how evolution works its mighty wonders.

Which is why I am not persuaded by those good souls who imagine that we can end suffering and wars and crime and all the other means and ways that we wreak havoc on each other and the world. I am not hopeful that we who live in the so-called developed world will rein in our greed for money and things for the sake of the planet or even for the sake of the starving and terrorized millions of so much of the rest of the world, or even for those who live within our own borders. Like all other creatures, we kill to live. Unlike other creatures, we can overkill. All too often we do, both literally and metaphorically.

That little bug on my nightstand was most likely harmless, at least to me, and maybe it even had some important function in the ecology of my apartment. Or maybe it was just quietly living its own life. I killed it anyway.

See also my “Requiem for a Tree” at this site.

Sins of Our Fathers: Matthew Karp’s “This Vast Southern Empire”

The subtitle of Matthew Karp’s important new book neatly summarizes its thesis: “Slaveholders at the Helm of American Foreign Policy” during the period from the nation’s founding to the start of the Civil War. Slavery was not just a domestic issue but a global one, and the men who ran the federal government for most of the antebellum period, slave-owning Southern plantation elites, were well aware that the future of slavery, i.e., their future, would be determined as much by what happened abroad, particularly in the Caribbean and Latin America, as by what happened within the borders of the United States itself. In fact, the expansion of those borders during the antebellum period was largely driven by the desire of the Southern elite to protect and expand slavery at a time when it was being seriously challenged by both British and Northern abolitionists, as well as by slave rebellions in Haiti and elsewhere.

Karp lays out in ample detail, and in often elegant and occasionally sardonic prose, the policies and motivations of the major political and intellectual figures of the time, including Calhoun, the successive presidents, journalists and writers such as Louisa McCord (who could be called the Phyllis Schlafly of her time) and James De Bow. Since Karp does such a good job of narrating the history, it is not necessary for me to summarize it here—just take my word for it and read the book.

What really struck me as I read was the extent to which the ideologies of the slave-owning elites have persisted in the political DNA of the United States, down to the present day. Virtually every one of the political excuses for slavery, and of the domestic and foreign policy positions the slaveholders adopted, has survived to today, though for most of us their original forms are obscured by subsequent layers of circumstance and party politics.

Let us begin with a prime example: Today we are all familiar with the phrase “states’ rights,” and probably have at least an inkling of what it means, and are aware of how it occasionally crops up in current political controversies, such as when the federal government overrides state laws on immigration or gender discrimination.

But what most of us probably don’t realize is that the notion of states’ rights emerged in the very early days of the nation and was taken up by Southern slave owners as a rationale for preventing the federal government (and thus the increasingly abolitionist North) from interfering in the South’s peculiar institution, as well as to support the expansion of slavery into new territories. Today states’ rights are usually invoked in support of conservative causes, most notably in the persistent calls for reducing federal spending on domestic issues, whether infrastructure or health care, even as the same voices press for increased spending on the military in order to ensure America’s rightful place on the world scene and to guarantee national security (in the antebellum era, the threat was Britain): peace through strength, as it is often said. Both antebellum Southern elites and our contemporary conservatives want a decentralized government when it comes to domestic issues and a strong central government when it comes to foreign policy. They are strict constructionists domestically and liberal constructionists globally.

Intertwined with this view was Manifest Destiny, the notion that America is a special nation in the history of mankind, with a special mission not only to expand westward across the entire North American continent but to redeem the world. Manifest Destiny was widely popular throughout the United States, North as well as South, but it was especially appealing to the Southern elites as they looked to the southern hemisphere as a new source of commodities which, they believed, could only be exploited by bound labor, preferably African slave labor (which they hoped to supply from their own slave breeding programs). It is worth mentioning here that the four most important commodities for international trade of that day were cotton, tobacco, sugar, and coffee, all of which at that time were “tropical” products largely grown under deeply exploitative labor conditions. Those four commodities continue to be important in trade today, though their predominant role has been superseded by oil, wheat, and corn; sugar and cotton continue to benefit from federal subsidies (as did tobacco until quite recently).

The special mission of the United States led to the Mexican-American War and the acquisition of what is today the American Southwest and California, as well as the Indian wars that cleared the West for white settlement. It led to the Spanish-American War and the colonization of the Philippines; and in the twentieth century to our interventions in other countries too numerous to mention, ostensibly to extend democracy and peace but all too often in fact to protect and expand our economic and geopolitical clout. This has led to our situation today, in which we find ourselves in a state of cognitive dissonance between our fine rhetoric and our actions. American hegemonic ambitions have always been encircled by a decorative hedge of beautiful rhetoric—call it the aesthetic of imperialism.

Much of that dissonance originates in that dark shadow cast by American history, race. Although racial prejudice had existed in American thought since the colonial period and was present even in the Northern states (and even, it must be said, among abolitionists), it was Southern writers who articulated the most sophisticated and virulent racial theories of the pre-Civil War era. Just one writer of the many that Karp cites will serve as an example: Louisa McCord wrote that “God’s will formed the weaker race so that they dwindle and die out by contact with the stronger . . . Slavery, then, or extermination seems to be the fate of the dark races” (qtd. on pages 159-160). As Conrad’s Kurtz would later say, “Exterminate all the brutes!” And this slavery-or-extermination racism was not limited to Africans but applied equally to Native Americans and other “colored” races (and note that the very term “colored” makes a classificatory distinction between them and “whites”).

This so-called scientific racism, this notion that the strong must inevitably exterminate the weak, predated the publication of Darwin’s Origin of Species (1859) and Spencer’s coining of the phrase “survival of the fittest” (1864), yet it eerily anticipates the uses to which evolution would be put, in the form of Social Darwinism and eugenics, at least up to the Nazi racial theorists of the mid-twentieth century (I am not confident that it has disappeared even today). In the minds of McCord and others of her ilk, civilization itself depended on racial discrimination, particularly on bound labor—slavery was not only good for true civilization (the high arts and all that) but it was good for the slaves (slavery or extermination). Even “liberty,” of all things, depended on slavery (see page 67). Which raises the interesting question: What did the Founders mean, exactly, when they wrote so eloquently about “liberty”? What Southern theorists of race clearly did not mean was the freedom of the individual laborer to be worthy of his hire; indeed, they argued that “free labor” was less efficient and less orderly than slave labor, and they pointed to the declining Haitian exports of sugar after the Haitian revolution as proof—neglecting to note that sugar production for export may have been good for the bourgeoisie of Europe but not good for the Haitians themselves nor for the natural environment of the island.

The end of slavery after the Civil War did not mean the end of exploited labor and racial theory. The sharecropper system was part and parcel of Jim Crow racism, as was the doctrine of “separate but equal,” which indeed kept the races separate but by no means equal. Although significant and necessary changes came with the civil rights movement, race theory continues to infect social and political discourse today, however superficially camouflaged it may be. Likewise, the ideology of states’ rights continues to shape political thought and rhetoric, even within certain states whose political classes are reluctant to tax and budget for policies that would enhance the well-being of their citizens even as they provide tax breaks and sweetheart deals for corporations and sports teams. Meanwhile, federal military and surveillance budgets continue to climb, and a candidate for president from one of the major parties brags about bombing ISIS into oblivion.

The persistence of Manifest Destiny is best illustrated by the last sixteen years of federal foreign policy. President George W. Bush said of the invasion of Iraq that it was the “latest front in the global democratic revolution led by the United States,” though others saw that war as being more about oil than democracy. President Obama also wanted to promote democracy and advance our values in the Middle East and thought, wrongly as it turned out, that the so-called “Arab Spring” heralded the beginning of a new era in that region. Since Obama ran his first campaign as the not-Bush, it is ironic that both presidents spoke idealistically while pursuing less than ideal policies, as if they (and their advisors, and perhaps also the American citizenry in general) were unable to disentangle their idealism from the realities of the American imperial project. Perhaps that is because from the very beginning, American imperial ambitions have been couched in the rhetoric of liberty, civilization, and wealth—which makes us not so different from our antebellum Southern elite politicians, after all.

Narcissism and Individualism

This is the age of the selfie. Never before have so many people taken so many pictures of themselves and posted them on so many different media. They’re all the same: goofy smiles, funny faces, sexy poses. For something that is supposed to be self-expressive, selfies are pathetically predictable and unoriginal.

Some social critics say that this is an age of individualism run amok. But the selfie suggests quite otherwise. This is the age of rampant narcissism.

The narcissist craves the attention of others because she or he otherwise has no self, no means of affirming his or her existence other than through the attention of others. Narcissists love mirrors, whether of the conventional glass kind or the gaze of others looking at them. The selfie is a kind of mirror: it is focused on oneself rather than the environment in which one is situated (on me rather than the Bridge of Sighs barely visible in the background) and it is “shared” with others with the purpose of receiving a reaction. It not only announces one’s existence, it asks for confirmation of that existence. The narcissist depends on the confirmation of others. Today, narcissism is a feedback system–I will friend you if you friend me, I will “LOL” your selfie if you will do the same for mine. Narcissists are not so much antisocial as hypersocial, in the sense that they need social recognition as much as they need oxygen.

Narcissists are selfish, but they are not individuals. Individuals do not require constant affirmation to support their sense of self. They do not need to attract attention to themselves always and everywhere. They have private lives. They enjoy privacy as much as they enjoy socializing. They do not need to take pictures of themselves to verify they have been to Venice, nor to prove to anyone else that they have. They will take a picture of the Bridge of Sighs without their funny faces obscuring it, or they might not take a photo at all–they have the personal experience, an experience that does not need to be “socialized” to be real.

Narcissists will go on a talk show and reveal all to millions. Individuals can’t imagine doing such a thing. They do not watch such shows. Individuals believe in privacy; they believe that they have a right to their secrets, and that other people have a right to their own secrets as well. They do not conform, nor do they make a fetish of not conforming (which is a backhanded way of conforming, by drawing attention to oneself, as a kind of hypersocializing–see me, see me!).

A society of narcissists is a society of conformists, with everyone vying for everyone else’s attention. Such attention is not always positive–often it is negative, bringing down the opprobrium of the masses for transgressing the rules of proper self-revelation. This is why selfies all look the same–there is a right way of doing it. It is why talk shows follow the same format, whether it’s Dr. Phil or Jerry Springer. There is a right way to be a “selfie”–there is no right way to be a self.

Why Cursive Still Matters

According to Anne Trubek, author of a forthcoming book on the history and future of handwriting, and an advance selfie-blurb in the New York Times, “handwriting just doesn’t matter” anymore and should not be taught in elementary schools. Instead, students should be given a short course in printing and then quickly move on to typing skills. I beg to differ.

But before I do, I should be fair and mention that I am of an older, pre-digital generation and have been writing in cursive since I was eight years old. In fact, I am so habituated to cursive that I find it awkward and slow to hand print; when I’m filling out all those redundant forms in a doctor’s waiting room, I soon switch from printing to script because my hands get tired doing what they’re not accustomed to doing—and too bad for the smart young things at the desk who can’t read cursive.

Thus I am admittedly taking a traditionalist position here, consciously and deliberately counter to the futurist stance of Trubek and others who agree with her denigration of cursive. Being a traditionalist, however, does not—I repeat, DOES NOT—delegitimize my argument, any more than being a futurist legitimates any argument against cursive.

So, what are Trubek’s arguments against the teaching of cursive (also called script, longhand, etc.)? As already noted, one is that handwriting is old-fashioned, outdated, and therefore as irrelevant to today’s world as Grandma’s old icebox (well, I guess it’s great-Grandma’s). It is time, therefore, to consign handwriting to the same rubbish heap or museum as that icebox, and as those old ways of writing Trubek lists—carving on stone (which was never used for day-to-day writing, anyway), quill pens, and typewriters. But fountain pens are still widely used (I had a student once who had bought a cheap one and loved it—though I did have to demonstrate to him how to use it correctly), and typewriters are something of a fad among the young (like vinyl records). Stone cutters are still doing what they’ve always done: carving letters on headstones and monuments. Nothing is superseded entirely.

Trubek’s primary argument is a utilitarian one—in the digital age, handwriting is impractical and therefore no time should be wasted on teaching it. It is “superannuated.” One can write faster, and therefore more, by typing than by handwriting; and, glory of glories, one can write better! She asserts that “there is evidence that college students are writing more rhetorically complex essays, and at greater length, than they did a generation ago.” One hopes she will cite the “evidence” for this assertion in her forthcoming book; until then I will continue to wonder why American students do so much more poorly than students in other countries on language skills and why college graduates appear to have serious deficits in writing skills. My own experiences as a college English instructor confirm the findings of large-scale tests: students today do not write better than they did in the past, nor have I noticed that all the social-media writing young people engage in has improved their writing skills.

Now, I am not asserting that teaching handwriting, in and of itself, will have any effect on the more global aspects of writing (organization, development of thought, etc.), but neither can one assert that teaching handwriting diminishes those skills. One need only look at the diaries and letters of, say, nineteenth-century Civil War soldiers, virtually none of whom attended school past the age of fourteen, to see that. I have in my possession a letter my paternal grandmother wrote to one of her sisters during the Great Depression; neither woman attended college—in fact, what formal education they received occurred in a one-room schoolhouse in a small town near their family homestead in the Ozarks of southern Missouri—yet Grandma obviously could write, well and thoughtfully (on the political issues of the day), and, lordamercy, in a clear, readable cursive!

Frankly, to argue for the superior cognitive effects of computer typing is as bogus as arguing for the superior cognitive effects of cursive—after all, neither manual skill is about content, but only about means. Of course, I would not today compose this essay by hand on a yellow legal pad—I would never want to go back to the pre-word-processor days—all that White Out and carbon paper and retyping entire pages to correct one or two sentences is not for me! But I don’t want to give up handwriting either—in fact, my outline for this essay, and my margin comments on Trubek’s article, were handwritten in cursive on paper. The differing writing technologies available to us today are complementary, not mutually exclusive.

There is, however, one very good reason for knowing how to write in longhand: privacy. The digital world today is massively intrusive—cookies trace every move one makes on the Internet, the giant digital corporations make a mockery of web privacy, and hackers and government surveillance agencies sneak around in the background looking for vulnerabilities and suspicious activities. As just one minor but truly exemplary instance: the other day I received yet another email from a major retailer (from whom I had recently purchased a big-ticket item) advertising their goods; rashly, I clicked on one of the items, just to satisfy my curiosity as to what such a thing would cost, and for the rest of the day, every time I went to a news media site, up popped another one of that retailer’s ads for that very item. We are getting very close to the ubiquitous media/advertising environment depicted in the Tom Cruise film “Minority Report.” Maybe in fact we’re already there.

But when I write something down on a slip of paper, or write an entry in a real diary, or otherwise make use of the superannuated skills of pen or pencil on paper, I am engaging in something truly private, totally inaccessible to hackers and algorithms, even these days to the prying eyes of all those who are unable to read cursive. I can express myself (not my social-media-self) without worrying or caring about the necessity of self-censorship. And I can do so anywhere under any conditions—I don’t need an electrical outlet or batteries. I can write by sunlight, or candlelight if need be. And if I don’t like what I wrote, or I want to ensure that no one else can ever read my private thoughts, I can burn the papers or send them through a shredder. There is no eternal cloud for pen-on-paper, no wayback machine to dig up some random and ill-conceived thoughts from the past. In cursive, there is still the privacy of the self. That makes teaching handwriting to students a true and wonderful gift. No reasons of utility or timely relevance are needed.

What Is a Species?

That science is a human enterprise and not some pure and perfect object independent of culture is highlighted by a recent investigation into the DNA of American wolves—the gray wolf, the Eastern wolf, and the red wolf. An article in the New York Times (7/27/16) reports that analysis of the DNA of these three wolf species reveals that in fact “there is only one species [of wolf] on the continent: the gray wolf.” The other two are hybrids of coyotes and wolves—Eastern wolves are 50/50, red wolves are 75 percent coyote and 25 percent wolf. The investigators also concluded that the wolf and coyote species shared a common ancestor only 50,000 years ago, which is very recent in evolutionary terms.

Now, anyone comfortable with the fact that nature goes its own way without regard to the human need for tidy intellectual categories is not likely to be much disturbed by these findings. But such people are relatively rare, especially in academic and political circles, and so certain people do find it disturbing that Eastern and red wolves are hybrids. That is, the hybrids are not “pure” and therefore, the worry goes, may not be entitled to protection from, say, extermination under such laws as the Endangered Species Act. In a sense, they are not “natural” because—well, because they violate the notion of the purity of species; they don’t fit neatly into our conceptual categories. As one scientist, dissenting from the worrywarts, put it: “We put things in categories, but it doesn’t work that way in nature.”

Indeed it doesn’t. In fact, it couldn’t. If species really were neatly distinct forms of life, immune to crossings of the so-called “species barrier”—one of the common myths about the “logic” of evolution—evolution would grind to a halt. Evolution requires messiness, contingency, happenstance, the unexpected, in order to work. For example, genetic mutations do not magically appear in response to environmental pressures, just in time to save a species from extinction. Instead, a mutation lies quietly in the background, sometimes for many generations, to emerge as the crucial factor of salvation (for those individuals who carry it, and their descendants) when and if some factor in the environment calls it forth.

I am reminded of a startling discovery during the height of the AIDS epidemic in America, that some individuals, despite a particularly risky lifestyle, were immune to the disease. Turns out, they carried a mutation that had first manifested itself centuries earlier, during an epidemic of an entirely different disease, bubonic plague. One could describe how this mutation protects against both diseases, but one could not explain why—why this gene mutation occurred in the first place, why it just happened to confer immunity or resistance to these two quite different diseases (one caused by a bacterium, the other by a retrovirus), and why it resided silently in the genomes of its fortunate carriers for so many generations before it could prove its usefulness.

A fundamental goal of all human endeavors is to reduce the entangled complexities of life, including our own, to a simple set of principles that fit the limitations of the computational power of our little brains, a mere three pounds of meat, of which only a relatively small portion engages in the tasks of reasoning. Not surprisingly, it is difficult to wrap our heads around the genuine complexity of the earth we inhabit, let alone of the cosmos. Being the limited creatures that we are, we need our categories—but let’s not worship them. Let’s not condemn the Eastern wolf and the red wolf to extermination just because they mess up our laws.

Evolution and Theodicy

“Why is there evil in the world?” This question has been asked by philosophers and theologians and ordinary men and women for millennia. Today scientists—particularly evolutionary biologists, neuroscientists, and evolutionary and neuropsychologists—have joined the effort to explain evil: why do people indulge in violence, cheating, lies, harassment, and so on? There is no need here to itemize all the behaviors that can be labeled evil. What matters is the question of “why?”

The question “why is there evil in the world?” assumes the premise that evil is abnormal while good (however defined) is normal—the abnorm vs. the norm, if you will. Goodness is the natural state of man, the original condition, and evil is something imposed on or inserted into the world from some external, malevolent source. In Genesis, God created the world and pronounced it good; then Adam and Eve succumbed to the temptations of the Serpent and brought evil and therefore death into the world (thus, death is a manifestation of evil, immortality the natural state of good). Unfortunately, the Bible does not adequately account for the existence of the Serpent or Satan, so it was left to Milton to fill in the story. Gnostics, Manicheans, and others posited the existence of two deities, one good and the other evil, and constructed a vision of a cosmic struggle between light and darkness that would culminate in the triumph of good—a concept that filtered into Christian eschatology. The fact that Christian tradition sees the end times as a restoration to a state of Adamic or Edenic innocence underscores the notion that goodness is the natural, default state of man and the cosmos.

Contemporary secular culture has not escaped this notion of the primeval innocence of man. It has simply relocated Eden to the African savannah. When mankind was still at the hunter-gatherer stage, so the story goes, people lived in naked or near-naked innocence; they lived in egalitarian peace with their fellows and in harmony with nature. Alas, with the invention of agriculture and the consequent development of cities and civilizations, egalitarianism gave way to greed, social hierarchies, war, imperialism, slavery, patriarchy, all the factors that cause people to engage in violence, oppression, materialism, and so on; further, these faults of civilizations caused the oppressed to engage in violence, theft, slovenliness, and other sins. Laws and punishments and other means of control and suppression were instituted to keep the louts in their place. Many people believe that to restore the lost innocence of our hunter-gatherer origins, we must return to the land, re-engage with nature, adopt a paleo diet, restructure society according to matriarchal and/or socialist principles, and so on. Many people (some the same, some different from the back-to-nature theorists) envision a utopian future in which globalization, or digitization, or general good feeling will restore harmony and peace to the whole world.

Not too surprisingly, many scientists join in this vision of a secular peaceable kingdom. Not a few evolutionary biologists maintain that human beings are evolutionarily adapted to life on the savannah, not to life in massive cities, and that the decline in the health, intelligence, and height of our civilized ancestors can be blamed on the negative effects of a change in diet brought on by agriculture (too much grain, not enough wild meat and less variety of plants) and by the opportunities for diseases of various kinds to colonize human beings too closely crowded together in cities and too readily exposed to exotic pathogens spread along burgeoning trade routes. Crowding and competition lead to violent behaviors as well.

Thus, whether religious or secular, the explanations of evil generally boil down to this: that human beings are by nature good, and that evil is externally imposed on otherwise good people; and that if circumstances could be changed (through education, redistribution of wealth, exercise, diet, early childhood interventions, etc.), our natural goodness would reassert itself. Of course, there are some who believe that evil behavior has a genetic component, that certain mutations or genetic defects are to blame for psychopaths, rapists, and so on, but again these genetic defects are seen as abnormalities that could be managed by various eugenic interventions, from gene or hormone therapies to locking up excessively aggressive males to ensure they don’t breed and pass on their defects to future generations.

Thus it is that in general we are unable to shake off the belief that good is the norm and evil is the abnorm, whether we are religious or secular, scientists or philosophers, creationists or Darwinists. But if we take Darwinism seriously we have to admit that “evil” is the norm and “good” the abnorm—nature is red in tooth and claw, and all of the evil that men and women do is also found in other organisms; indeed, the “evil” done by other organisms long precedes the evil that men do, and we can also say, based on archaeological and anthropological evidence, that men have been doing evil since the very beginning of the human line. In other words, there never was an Eden, never a Noble Savage, never a long-ago Golden Age from which we have fallen or declined—and therefore there is no prospect of an imminent or future Utopia or Millennial Kingdom that will restore mankind to its true nature, because there is nothing to restore.

The evolutionary function of “evil” is summarized in the term “natural selection”: the process by which death winnows out the less fit from the chance to reproduce (natural selection works on the average, meaning of course that some who are fit die before they can reproduce and some of the unfit survive long enough to produce some offspring, but on average fitness is favored). Death, usually by violence (eat, and then be eaten), is necessary to the workings of Darwinian evolution. An example: when a new male lion or a coalition of males defeats an older pride male and takes over his pride, the newcomers kill the cubs of the defeated male, which has the effect of bringing the lionesses back into heat so that the new males can mate with them and produce their own offspring; their task is then to keep control of the pride long enough for their own cubs to reach reproductive maturity. Among lions, such infanticide raises no moral questions; among humans it does.

There is no problem of evil but rather a problem of good: not why is there “evil” but why is there “good”? Why do human beings consider acts like infanticide to be morally evil while lions do not? Why do we have morality at all? I believe that morality is an invention, a creation of human thought, not an instinct. It is one of the most important creations of the human mind, at least as great as the usually cited examples of human creativity (art, literature, science, etc.), if not greater, considering how much harder won it is than those other creations, and how much harder it is to maintain. Because “good” is not natural, it is always vulnerable to being overwhelmed by “evil,” which is natural: peace crumbles into war; restraint gives way to impulse, holism to particularism, agape to narcissism, love to lust, truth to lie, tolerance to hate. War, particularism, narcissism, and the rest protect the self of the person and the tribe, one’s own gene pool so to speak, just as the lion kills his competitor’s cubs to ensure the survival of his own. We do not need to think very hard about doing evil; we do need to think hard about what is good and how to do it. It is something that every generation must relearn and rethink, especially in times of great stress.

It appears that we are in such a time today. Various stressors, the economy, the climate, overpopulation and mass migrations, religious conflict amid the dregs of moribund empires, are pushing the relationship of the tribes versus the whole out of balance, and the temptations are to put up walls, dig trenches, draw up battle lines, and find someone other than ourselves to blame for our dilemmas. A war of all against all is not totally out of the question, and it may be that such a war or wars will eventuate in a classic Darwinian victory for one group over another—but history (rather than evolution) tells us that such a victory is often less Darwinian than Pyrrhic.