Tag Archives: globalization

Robotics and Immigration

One of our most cherished myths is that America is a land of immigrants. In point of fact, we are—many millions of people migrated from the “Old World” to the “New” in the centuries following the first voyage of Columbus; what is now the United States became a favored destination of people from the British Isles, from Northern Europe (especially Germany), and later from Eastern and Southern Europe. These are the facts, the statistics. The mythic element, however, tells a story of people seeking freedom of various kinds—religious freedom, freedom from ethnic oppression, freedom from monarchs and oppressive class systems. No doubt these constituted the personal motives of many of the immigrants themselves.

Liberal and progressive arguments favoring continued unimpeded immigration are often couched in moral and mythic terms: that we have always been a nation of immigrants and should therefore continue to be (but: one definition of insanity is to repeat the same action over and over again despite not getting the hoped-for results); that we should forever continue to welcome the “huddled masses yearning to breathe free.”

What is not often considered are the motives of those already here (and of some who never set foot on American soil) in encouraging and enabling these mass migrations. From the very beginning, those motives have been all about profit: more specifically, about cheap labor as a means of exploiting the resources of this so-called “virgin land,” resources such as lumber, furs, gold and silver, and most especially agricultural commodities: tobacco, sugar (mostly on the Caribbean islands), cotton, and wheat. With a few exceptions, early British colonies were chartered by London investors and were stocked with men and women from the desperate and criminal classes (people whom the British authorities were glad to be rid of), many of whom died shortly after arrival. As the colonies took hold, increasing numbers of the poor, the indebted, the jobless without prospects, the desperate, came here as indentured servants. Indentured servitude was little better than slavery: many died before their term of service was up, and others were cheated of their promised rewards. Then there was slavery itself, which brought millions of Africans here (and elsewhere in the New World) as chattel labor, valuable not only for their unpaid labor but as commodities in themselves.

Later, as the industrial revolution took hold, millions of Europe’s impoverished were allowed in to supply the labor for the factories as well as for the piecework that still occurred in crowded tenements and hazardous sweatshops (as exemplified by the Triangle Shirtwaist Factory fire of 1911). Cheap labor was also obtained from China, particularly in the building of the transcontinental railroads; migrant (usually Mexican) labor is still crucial for harvesting fruits and vegetables.

This thumbnail sketch illustrates that cheap labor is the primary reason that business and political leaders have favored mass immigration, and why most business leaders still do. Economists today argue that we need to continue mass immigration (despite the fact that we already have a population of over 320 million people) because an aging population needs an influx of young workers to support, through taxes, the retired elderly—though how immigrants who live below the poverty line, and who, if legal, receive more in government benefits than they will ever pay in taxes, could perform that function is never explained. And this despite the fact that behind the sunny employment figures of recent weeks lie the huge numbers of potential workers who have given up looking for a job and who are therefore no longer counted as “unemployed.”

Now comes another reason why mass immigration may no longer be a good thing: Artificial Intelligence (AI) is rendering many jobs, especially those traditionally held by less skilled workers, obsolete. Factories now rely more on robotics than on human beings and will do so even more as time goes on; many lower-skilled white-collar jobs are being replaced by digital substitutes; retailing jobs are disappearing as more and more consumers purchase goods online (as is illustrated by the emptying out of shopping malls and the closure of brick-and-mortar department stores). In other words, in the near future (if not already), our economy will require far fewer human workers per unit of output than it once did, and therefore demand for human labor (with certain exceptions) will drop considerably. Starkly put, we will not need the labor of our current population, let alone the labor of new immigrants.

What we will need instead is a new way of distributing the wealth that AI will generate. While it is too early in this transformation to specify how the new wealth should be distributed, it is time to begin considering the problem. The gains from AI are now accruing to corporations in the form of profits and to the corporate managing classes who run the companies and make the big decisions as to how AI will be used. Yet again, the people with no voice in the process are the working classes (including the middle class). In fact, the political system is set up in such a way as to divide and conquer working people (e.g., the breaking up of unions and collective bargaining, the federal trade agreements that ignore the consequences for working people, etc.). The election of Donald Trump, who defeated all the establishment contenders of the Republican party before going on to (barely) defeat Hillary Clinton, is symptomatic of the anger of many citizens—that anger is likely to grow as the very rich get richer and the rest get much poorer and more desperate.



Sins of Our Fathers: Matthew Karp’s “This Vast Southern Empire”

The subtitle of Matthew Karp’s important new book neatly summarizes its thesis: “Slaveholders at the Helm of American Foreign Policy” during the period from the nation’s founding to the start of the Civil War. Slavery was not just a domestic issue but a global one, and the men who ran the federal government for most of the antebellum period (slave-owning Southern plantation elites) were well aware that the future of slavery, i.e., their future, would be determined as much by what happened abroad, particularly in the Caribbean and Latin America, as by what happened within the borders of the United States itself. In fact, the expansion of those borders during the antebellum period was largely driven by the Southern elite’s desire to protect and expand slavery at a time when it was being seriously challenged by both British and Northern abolitionists, as well as by slave rebellions in Haiti and elsewhere.

Karp lays out in ample detail, and in often elegant and occasionally sardonic prose, the policies and motivations of the major political and intellectual figures of the time, including Calhoun, the successive presidents, journalists and writers such as Louisa McCord (who could be called the Phyllis Schlafly of her time) and James De Bow. Since Karp does such a good job of narrating the history, it is not necessary for me to summarize it here—just take my word for it and read the book.

What really struck me as I read was the extent to which the ideologies of the slave-owning elites have persisted in the political DNA of the United States, down to the present day. Virtually every one of the political excuses for slavery, and for the domestic and foreign policy positions the slaveholding elites adopted, has survived to today, though for most of us their original forms are obscured by subsequent layers of circumstance and party politics.

Let us begin with a prime example: Today we are all familiar with the phrase “states’ rights,” and probably have at least an inkling of what it means, and are aware of how it occasionally crops up in current political controversies, such as when the federal government overrides state laws on immigration or gender discrimination.

But what most of us probably don’t realize is that the notion of states’ rights emerged in the very early days of the nation and was taken up by Southern slave owners as a rationale for preventing the federal government (and thus the increasingly abolitionist North) from interfering in the South’s peculiar institution, as well as to support the expansion of slavery into new territories. Today states’ rights are usually invoked in support of conservative causes, most notably in the persistent calls for reducing federal spending on domestic issues, whether infrastructure or health care, coupled with calls for increased spending on the military to ensure America’s rightful place on the world scene and to guarantee national security (for the antebellum elites, that meant security against the British)—to achieve peace through strength, as it is often said. Both antebellum Southern elites and our contemporary conservatives want a decentralized government when it comes to domestic issues and a strong central government when it comes to foreign policy. They are strict constructionists domestically and liberal constructionists globally.

Intertwined with this view was Manifest Destiny, the notion that America is a special nation in the history of mankind, with a special mission not only to expand westward across the entire North American continent but to redeem the world. Manifest Destiny was widely popular throughout the United States, North as well as South, but it was especially appealing to the Southern elites as they looked to the southern hemisphere as a new source of commodities which, they believed, could only be exploited by bound labor, preferably African slave labor (which they hoped to supply from their own slave breeding programs). It is worth mentioning here that the four most important commodities for international trade of that day were cotton, tobacco, sugar, and coffee, all of which at that time were “tropical” products largely grown under deeply exploitative labor conditions. Those four commodities continue to be important in trade today, though their predominant role has been superseded by oil, wheat, and corn; sugar and cotton continue to benefit from federal subsidies (as did tobacco until quite recently).

The special mission of the United States led to the Mexican-American War and the acquisition of what is today the American Southwest and California, as well as the Indian wars that cleared the West for white settlement. It led to the Spanish-American War and the colonization of the Philippines; and in the twentieth century to our interventions in other countries too numerous to mention, ostensibly to extend democracy and peace but all too often in fact to protect and expand our economic and geopolitical clout. This has led to our situation today, in which we find ourselves in a state of cognitive dissonance between our fine rhetoric and our actions. American hegemonic ambitions have always been encircled by a decorative hedge of beautiful rhetoric—call it the aesthetic of imperialism.

Much of that dissonance originates in the dark shadow cast over American history by race. Although racial prejudice had existed in American thought since the colonial period and was present even in the Northern states (and even, it must be said, among abolitionists), it was Southern writers who articulated the most sophisticated and virulent racial theories of the pre-Civil War era. Just one writer of the many that Karp cites will serve as an example: Louisa McCord wrote that “God’s will formed the weaker race so that they dwindle and die out by contact with the stronger . . . Slavery, then, or extermination seems to be the fate of the dark races” (qtd. on pages 159-160). As Conrad’s Kurtz would later say, “Exterminate all the brutes!” And this slavery-or-extermination racism was not limited to Africans but applied equally to Native Americans and other “colored” races (and note that the very term “colored” makes a classificatory distinction between them and “whites”).

This so-called scientific racism, this notion that the strong must inevitably exterminate the weak, predated the publication of Darwin’s Origin of Species (1859) and Spencer’s coining of the phrase “survival of the fittest” (1864), yet it eerily anticipates the uses to which evolution would be put, in the form of Social Darwinism and eugenics, at least up to the Nazi racial theorists of the mid-twentieth century (I am not confident that it has disappeared even today). In the minds of McCord and others of her ilk, civilization itself depended on racial discrimination, particularly on bound labor—slavery was not only good for true civilization (the high arts and all that) but it was good for the slaves (slavery or extermination). Even “liberty,” of all things, depended on slavery (see page 67). Which raises the interesting question: What did the Founders mean, exactly, when they wrote so eloquently about “liberty”? What Southern theorists of race clearly did not mean was the freedom of the individual laborer to be worthy of his hire; indeed, they argued that “free labor” was less efficient and less orderly than slave labor, and they pointed to the declining Haitian exports of sugar after the Haitian revolution as proof—neglecting to note that sugar production for export may have been good for the bourgeoisie of Europe but not good for the Haitians themselves nor for the natural environment of the island.

The end of slavery after the Civil War did not mean the end of exploited labor and racial theory. The sharecropper system was part and parcel of Jim Crow racism, as was “separate but equal,” which indeed kept the races separate but by no means equal; and although significant and necessary changes came with the civil rights movement, race theory continues to infect social and political discourse today, however superficially camouflaged it may be. Likewise, the ideology of states’ rights continues to shape political thought and rhetoric, even within certain states whose political classes are reluctant to tax and budget for policies that would enhance the well-being of their citizens even as they provide tax breaks and sweetheart deals for corporations and sports teams. Meanwhile, federal military and surveillance budgets continue to climb, and a candidate for president from one of the major parties brags about bombing ISIS into oblivion.

The persistence of Manifest Destiny is best illustrated by the last sixteen years of federal foreign policy. President George W. Bush said of the invasion of Iraq that it was the “latest front in the global democratic revolution led by the United States,” though others saw that war as being more about oil than democracy. President Obama also wanted to promote democracy and advance our values in the Middle East and thought, wrongly as it turned out, that the so-called “Arab Spring” heralded the beginning of a new era in that region. Since Obama ran his first campaign as the not-Bush, it is ironic that both presidents spoke idealistically while pursuing less than ideal policies, as if they (and their advisors, and perhaps also the American citizenry in general) were unable to disentangle their idealism from the realities of the American imperial project. Perhaps that is because from the very beginning, American imperial ambitions have been couched in the rhetoric of liberty, civilization, and wealth—which makes us not so different from our antebellum Southern elite politicians, after all.

Is Brexit the End of the Postwar Era?

Most people with any sense of history know that the European Union came into existence as a consequence of the desire of Europeans to prevent a recurrence of the disputes and national rivalries that had led to the two great world wars, as well as to present a united front against the new threat to Europe, the Soviet Union.  With the fall of the Soviet Union, several countries of Eastern Europe joined the EU, eventually expanding the membership to 28 countries.  It is now 27 countries—and possibly on countdown, as other countries, exasperated by the lack of democracy and the failures of the EU governing classes, contemplate following the UK’s lead.

A faulty system will be tolerated so long as people believe that it is preferable to any other likely system; the EU has been tolerated largely because it was seen as preferable to the many wars that European nations had engaged in previously. But the last great war ended seventy-one years ago; very few people who lived through that war are still alive, and the memory of it and its long aftermath of reconstruction and national reorganization is, for most living Europeans, largely relegated to history books. This may be especially true for the British, whose island geography continues to keep them somewhat apart from events on the Continent. The threat of Russia under Putin feels all too close to, say, Poland and Germany, but rather remote to the UK.

Of course, the United States has not been a disinterested observer of the EU (as suggested by Obama’s remarks when he visited the UK earlier this year). The US fought with the Allies in both World Wars, financed the rebuilding of Western Europe through the Marshall Plan, and was the prime mover behind NATO; one can argue that it is as much a part of the EU as it would be if it were an actual member. One might even argue that the EU is a continuation of empire by means other than outright warfare—perhaps we could even call the European project the “imperial project.” Napoleon tried to unify Europe under the banner of France; the Austro-Hungarian Empire experienced some success in unifying parts of central and eastern Europe; and Prussia unified the disparate German states into Germany. The rise of nation states themselves out of the motley assortment of duchies, kingdoms, free cities, and spheres of influence into the distinct nations we know today—France, Germany, Italy, the United Kingdom, etc.—was itself a long imperial project (each of these examples was initially united under a national king who had defeated his feudal aristocratic competitors). And of course, we know the efforts of the Nazis to impose a unified Europe by brutal force under the swastika flag.

One might say, then, that the EU is a bureaucratic rather than a military empire. Almost by definition, empire attempts to unify national, ethnic, linguistic, and religious “tribes” under one government, but its Achilles’ heel, its genetic defect, is the persistence of those tribes despite the efforts of the imperium to eliminate their differences. It happened to the Roman Empire, which was disassembled by the very tribes it had incorporated into its borders. It also happened to the British Empire, once the most extensive the world has ever seen but now reduced to the island of Great Britain and a small part of Ireland—and which may be further reduced if Scotland and Wales, both bloodily defeated and humiliated by the English long ago (but not forgotten), eventually go their own ways.

The United States, too, has been an empire-that-will-not-speak-its-name (although the Founders were not chary of using the term when describing their continental ambitions). We have seen in the last few decades a diminution in the global power and influence of the US, as various historic threats have been removed, making others, including Europe, less reliant on our power, and as previously backward countries have risen to the world stage, providing alternate centers of power for client states to orient to. Our zenith of power was in the decades immediately following the end of WW2, but for us too, with the passing of the “Greatest Generation,” memory of that triumph has faded, perhaps disastrously so.

So while it cannot yet be definitively confirmed, it does seem that the frustrations and resentments that built up to the Brexit vote could be a signal that the postwar era has come to an end.  If so, then the next question becomes:  Can globalization continue as planned and hoped for by the corporate, digital and government elites, or will tribalism and nationalism reassert themselves?  Will Europe (and the world) revert to its pre-WW1 national conflicts and warlike imperialist ambitions, or will it and the world evolve a totally new type of organization, one that no one has seen before or can as yet predict?  Or will things like global warming make all hope moot?

Stay tuned.

Can Humans Really Cause Climate Change?

I was listening to the Diane Rehm show today, on the topic of President Obama’s new proposals for reducing carbon dioxide emissions, when a man from Texas called in and stated, obviously in deep umbrage, that climate change is junk science. As proof, he pointed out that the last ice age ended with climate warming not caused by humans, so therefore climate change is natural—and only natural. He is not alone in asserting that climate change is junk science, and he is only one among many who point to the numerous instances of natural climate change, both cooling and warming, over the course of geological time.

It is true that the last ice age ended because of natural causes and was not caused by human activity. There weren’t enough humans back then to have much of an effect on climate, if any. But times have changed. Back then world population numbered at most a few million. Today there are more than seven billion of us, and most of that geometric increase has occurred in the last 200 years (there were barely one billion of us in 1800), and we have technologies that far exceed those that ice age man enjoyed, almost all of them powered by fossil fuels: coal, oil, and natural gas. Because of our ability to re-shape the world in our own image, many scientists call the current era the Anthropocene. Whereas in the past our impact on the climate was barely measurable, today it is enormous. That climate change in the past was caused by volcanic eruptions or variations in solar output does not exclude other causes, including human activities. For that reason, comparisons to climate changes of the remote past are irrelevant to the situation we are facing today. Such comparisons are nothing more than red herrings, meant to distract us from the real evidence for global warming, and apparently for some people, such as that man from Texas, the fallacy is working.

Not that pre-industrial humans are completely off the hook. As documented by George Perkins Marsh in 1864 and by Jared Diamond more recently, human societies have degraded their environments through overuse since the beginnings of civilization. Archaeologists believe that the intrusion of the ancestors of Native Americans into the American continents led to the overhunting and extinction of many large mammals, including mammoths and several camel species. In New Zealand, the Maori slaughtered the giant flightless moas, and it is believed that the original inhabitants of Australia killed off many species of large (and often dangerous) vertebrates.

Once Europeans clambered ashore in the Americas, we quickly reshaped the landscape and ecology of what became today’s United States—we felled forests, especially in the east, plowed under vast sweeps of grasslands for farmlands, dammed rivers and ruined estuaries, nearly extinguished the bison, and succeeded in wiping out the passenger pigeon, once the most numerous bird species in the world. (And note that pointing out that extinctions in the remote past, such as that of the dinosaurs, were “natural” does not mean that the extinction of the passenger pigeon was not caused by humans.) The list of animal and plant species that have gone extinct because of human actions is pages long, and it gets longer as more species reach that sad fate every year (will the monarch butterfly be on that list soon?). Now that we live in what is virtually one global civilization rather than many regional ones, our activities have global impacts.

The stance taken by the man from Texas, and by those who agree with him, rests on a disingenuous premise: that human activities are not great enough to have an effect on the global climate, and that therefore we can continue doing whatever we want without consequence. This premise is joined by another, that climate change is natural and therefore that human activity, which by (unstated) definition is not natural, has no effects. We can conquer nature because we are not-nature. But if human beings aren’t natural, what is? We are like all other vertebrate animals: We are bodies with many organs, we eat similar things to what other animals eat, we breathe, we reproduce, and we die. We are as natural, as biological, as any other organism. Therefore, we are included rather than excluded from the natural cycles of life and of the planet. We are not set apart. Even those of us who live in great cities and make our livings while seated in front of a computer screen have to eat animals and plants to survive, just like our hunter-gatherer ancestors and just like all other animals on this planet.

One of the most important atmospheric changes that ever occurred in the history of the planet was oxygenation. Nearly three billion years ago, certain bacteria, the cyanobacteria (blue-green algae), began producing oxygen, which was released into the atmosphere and made life, as we oxygen-breathing humans know it, possible. Cyanobacteria are living things (like us), and they remade the planet (like we are doing)—in a perfectly natural process. Everything that humans do is “natural”—we are incapable of doing anything unnatural. Even the burning of fossil fuels is “natural.” We didn’t invent fire, we simply found a way to use it for our own purposes. The byproducts of burning fossil fuels are also perfectly “natural”—and harmful. “Natural” does not mean benign; nor does it mean beyond human control, since “Nature” has given us the intellects and opposable thumbs to exercise control.

But there are “natural” limits to our control, limits built into our very bodies. We cannot survive and prosper without the natural environment that spawned us and sustains us. At some point the ecosystem on which we depend will snap under our pressure. Scientists warn that other species cannot evolve quickly enough to adapt to the rapid changes we have wrought. Neither can we.

Globalization, Global Warming, and the Null Effect of Personal Responsibility

A recent article in the New York Times, “The Sins of Angelenos,” by Hector Tobar, got me thinking about personal responsibility. In his article Tobar takes the blame, as a representative citizen of Southern California, for the ongoing drought in California and for global warming in general. He writes, “As a native of Los Angeles, I am significantly more responsible for global warming than your average resident of planet Earth. We pioneered an energy-guzzling lifestyle for the masses and taught the world to follow our lead. Now a parched, endless summer is our punishment.” He then goes on to describe some of the typical So Cal things that have led to global warming: “the cars, the sprawl, the pumping of water,” the “energy-hungry homes.” And to compound the sin, California exported its lifestyle around the world in its movies and television shows, creating a desire for imitation that has, he implies, led other nations to try to emulate that lifestyle. The article is an exemplary exercise in taking responsibility.

But not everyone who read the article agreed with its thesis, as revealed in many of the comments posted by Times readers. One comment pointed out that on a per capita basis, Californians use far less energy than residents of most other states. For comparison, according to the United States Energy Information Administration, in 2012 the per capita energy consumption in California was 201 million BTUs, while in Wyoming it was a (seemingly whopping) 949 million BTUs. At first glance, these figures suggest that Californians are admirably thrifty in their energy use while Wyomingites are particularly wasteful.

Certainly California has in recent decades gained a reputation for being the land of eco-conscious vegan Prius owners (it has the second highest rate of Prius ownership in the country). The Sierra Club was founded and has its headquarters in California (San Francisco). But the state also has some big water-guzzling industries, particularly the large corporate farms in the Central Valley, which are irrigated by diverted rivers as well as ground water, a fact that is causing controversies over allocation in the current extended drought. And Southern California is basically one long thirsty urban strip extending from Los Angeles to San Diego. As Tobar points out, Californians were not always eco-conscious, and many of their current problems are the result of past profligacy.

Nevertheless, people there seem to be trying, as their comparatively low per capita usage of BTUs seems to indicate. Would that it were so easy. The fact is that California off-loads much of its energy use and material consumption to other regions. Remember that astonishingly high figure of 949 million BTUs per capita for Wyoming? That state has a population of only 584,153, the lowest population of any state, whereas California has 38,802,500, the highest of all the states. What is Wyoming doing with all those BTUs?

Shipping most of them to California (and other states) in the form of electricity, which is generated in Wyoming but sold out of state. In total energy use, California far outstrips tiny Wyoming. A driver may pat himself on the back for driving a Prius plug-in or all-electric car, but the electricity to charge the batteries comes from a fossil-fueled generating plant somewhere. Los Angeles pollutes the air of Wyoming. The United States has drastically reduced air pollution from manufacturing in large part because we have out-sourced manufacturing to other countries such as China, where air pollution has become a major problem.

Every consumer good we import carries with it an energy cost, not in dollars but in environmental damage. What was once our problem has been globalized, along with what were once our jobs. The sense of personal responsibility for the environment, and the feeling of righteousness each of us has when we recycle our bottles or install energy-efficient appliances in our kitchens, are illusions so long as the real costs are merely displaced to a distance. The costs are still being paid.

And so long as the populations of California, of the United States, and of the world continue to grow and as other countries continue to demand their own version of the California lifestyle, the effect on global warming of any one individual’s efforts to reduce, even to zero, his or her “carbon footprint” will be null. Per capita BTUs in California may be only 201 million, but with a population of 38,802,500, California’s total energy consumption is 7,799,302,500 million BTUs. What would be the total if we included the BTUs required to manufacture the consumer goods imported and sold to Californians? The numbers are literally incomprehensible. What will happen when the people of China (1,357,000,000) and India (1,252,000,000) approach the level of energy consumption per capita of California?
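For readers who want to check the arithmetic, here is a minimal sketch in Python, using the 2012 EIA per-capita figures and the state populations quoted above (the figures are taken from the text as given, not independently re-verified):

    # Back-of-the-envelope totals: per-capita use times population.
    # Figures are the ones quoted in the text above (2012 per-capita
    # consumption in million BTUs; population counts as cited).
    per_capita_mmbtu = {"California": 201, "Wyoming": 949}
    population = {"California": 38_802_500, "Wyoming": 584_153}

    for state in per_capita_mmbtu:
        total = per_capita_mmbtu[state] * population[state]
        print(f"{state}: {total:,} million BTUs in total")

    # Output:
    # California: 7,799,302,500 million BTUs in total
    # Wyoming: 554,361,197 million BTUs in total

Per capita, Wyoming looks profligate; in total, California consumes roughly fourteen times as much energy.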

This is not to say that individuals should not act responsibly, but it is to say that any real solutions are achievable only on the large scale, and that only national governments acting in international concert can have any hope of stopping global climate change. “National Interest” (i.e., national selfishness) has already impeded the necessary action, perhaps to the point that it is now too late.

The Myth of Cheap Oil

Those of us who drive certainly have noticed that we are paying less for a gallon of gasoline, and if we watch the news, we know that the drop in gas prices is the result of increased crude oil production, particularly in the U.S. Because supply is high, the price of a barrel of oil has dropped by 40% since June of this year (2014). This is widely touted as good for consumers, who supposedly now have more pocket money to spend on other things, and therefore good for the economy, which is largely driven by consumer spending. So, lower gasoline prices are good, right?

Maybe. There are hidden costs to low oil prices. If the price is less than the cost of extracting the oil from the ground, producers shut down pumps, which in turn means the oil field workers get laid off, which means that they can no longer be consumers. If the economy were otherwise strong, they could get other jobs, but given that the so-called recovery has been weak and has created new replacement jobs that pay less and are less secure, they may have a tough time finding new employment. Low oil prices are also bad news for oil producing countries that depend on oil exports for much or most of their revenues. They may not be able to meet their citizens’ needs or continue to pay back their debts.

Another hidden cost is the temptation for people to drive more, thus increasing the amount of carbon we put into the atmosphere; it may even cause people to buy bigger, less fuel-efficient cars (on the mistaken theory that today’s low prices will last).

Many of these negative effects (and potentially others as well) are not likely to be visible to most people, including politicians, pundits, and economists. Because we have a consumption-driven economy, we tend to believe that lower consumer prices are always good, and that belief prevents us from seeing the longer-term negative consequences of prices that are too low. In fact, apologists for consumer capitalism tend to see a silver lining in lower prices in the form of pressure on producers to become more efficient. And where does this efficiency come from? Fortune magazine recently put it succinctly: “As oil prices drop, producers will undoubtedly renegotiate their ludicrously expensive oil service contracts, slash wages for their workforce and cut perks to bring their costs in line with the depressed price for crude.” Notice that two of the three steps producers will take are cutting wages and cutting perks (e.g., health insurance, which in current consumer-capitalist terms is a “perk,” not a necessity). In other words, screw the worker.

Lest we think that this is no big deal, simply the cost of doing business in a competitive environment, let’s remember that we have gone through this before.

Remember when most of the goods on retail shelves were made in the United States? Well, some of you may well be too young to remember, but in fact, there was a time when America was a manufacturing nation, when we exported more finished goods than we imported, and when our foreign trade was balanced in our favor (our last trade surplus occurred in 1975, almost 40 years ago). The shift to importing was initially touted as good for American consumers, but since the consumer is also a worker, eventually it hurt consumers—but not to worry, we simply imported more and cheaper (in all senses) stuff, so that poorer American worker-consumers could keep buying. We even changed our ethic from one of savings to one of borrowing.

Consumers unable to keep up with the spending needed to keep the economy “growing” turned to their credit cards and their home equity: the result, most recently, was the implosion of the financial sector in 2008. While it is commonplace and correct to blame the big banks and financial institutions for the 2008 debacle, blame also rests on everyday consumers, who willingly ramped up their borrowing to finance their over-budget spending (not just on food but on expensive electronic gadgets). Irrational exuberance was not limited to Wall Street—it walked on Main Street as well.

But what were worker-consumers to do? From all sides they were pounded with the message to get out and shop—interest rates were low, credit was easy, and housing prices would rise forever. It was our patriotic duty to go out and shop, as President George W. Bush reminded us after 9/11. So we shopped, and shopped, and shopped. Until we dropped.

And in the ruins of the economy, we can see the long-term cost of lower prices. We traded away our industrial birthright for a mess of consumer pottage, only to realize too late how lacking in nutrients such gruel really is.

Actually, I’m not sure we realize that yet.

American Foreign Policy and European Imperialism: Pax or Pox?

What is wrong with United States’ policy in the Middle East?

The pundits and policy wonks argue that it’s complicated. But it’s actually fairly simple: Since World War II, the United States has continued the colonial enterprise abandoned by the European imperial powers, particularly England and France, who were too weakened and bankrupted by the war to continue that enterprise themselves.

We have not done a very good job of it, perhaps largely because we deny that that is what we are doing. There is an element of split personality in our simultaneous orientation towards, and disdain of, Europe, which causes us to cite other factors as motivation for our continued interference in the affairs of that region: during the Cold War, containing the Soviet Union and communism; always, of course, protecting the flow of oil (apparently under the assumption that independent and autonomous Arab states would not be interested in selling to us the one thing they have to sell); and, more recently, fighting terrorism and supporting clearly rickety and not widely popular “democracy” movements.

That we have not done a good job is evident in the mistaken and botched invasion of Iraq, in the rapidity with which Iraq has broken down since our troops were withdrawn, in the persistence of Hamas despite decades of American efforts to broker some kind of lasting agreement between the Palestinians and the Israelis, and in the creeping back of the Taliban as we draw down in Afghanistan. Not to mention the so-called “Arab Spring,” so named in a rush of misplaced optimism and misunderstanding of the real situations in those regions.

Although we tend to view the various insurgent groups as motivated by religious fanaticism, age-old hatreds, and plain old evil, even as enemies of the United States who want to impose Shariah law on us (how in the world would they do that?), what they are “insurging” against are the artificial borders drawn by England and France (with some collusion from Russia) after World War I and the Allied defeat of the Ottoman Empire, through such instruments as the Balfour Declaration of 1917, the Sykes-Picot Agreement of 1916, the British Mandate under the League of Nations, and various policies and actions flowing from them, without regard to the wishes or traditions of the native populations.

After World War II, the old European imperial powers were no longer able to maintain their colonies and areas of “influence,” and the natural course would have been for these artificial entities to quickly break down and sort themselves into new entities more natural to the history, geography, and ethnicities of the region—except that the United States, the new superpower, stepped in to try to continue what England and France had begun. Yet American involvement in the region has always been conflicted, likely in part because the despotic regimes that were necessary to keep artificialities like Iraq intact went against the grain of American ideals, as encapsulated by President Wilson at the end of World War I—but we should perhaps have been forewarned by the defeat of Wilson’s program at the negotiations over the Treaty of Versailles, a defeat which showed just how reluctant Britain and France were to let other nations and peoples alone. And so, here we are, one hundred years later, no longer able to stop the final unraveling of the old imperial structures of the Middle East.

The question is: Why has the United States allowed itself to become entangled in problems created by a defunct European imperialism?

Since its inception, the United States has been oriented towards Europe, more than to any other region of the world. We began as the thirteen British colonies, and since the Revolution we have traded with, negotiated with, and imitated Europe. Culturally, we have looked to Europe for inspiration and models. Our museums are filled with European art, our symphony orchestras perform largely European music, during the 19th century our scholars and intellectuals pursued European university educations, our own artists and writers have traveled and lived in Europe (T. S. Eliot, Hemingway, etc.), and we remain today fascinated to the point of unseemly obsession with such things as the British royal family and Downton Abbey, French fashions and wine culture, and Italian cuisine. More Americans travel to Europe than to any other overseas destination. We are truly Europhilic, likely because, until very recent times, our population has been predominantly of European descent.

Yet at the same time we hold Europe in contempt, as a conflicted teenager both loves and rejects his parents. Donald Rumsfeld was speaking for a lot of Americans when he dismissed our reluctant allies (during the Iraq War) as “old Europe”—worn out, weak, declining, cowardly, impotent, prevaricating, intellectual and effete! And though he spoke crudely, his sentiment was not new. In the 19th century, Ralph Waldo Emerson and others called for a new American literature and culture, and artists and writers since have struggled to define themselves both in terms of and against European models. There is a tint of inferiority complex coloring our attitudes toward the “Old World,” in our politics and foreign policy as much as in our culture, and in much of the self-congratulatory rhetoric of our politicians and pundits, particularly following the collapse of the Soviet Union (remember “The End of History and the Last Man”?). We were the sole remaining superpower, the triumphant heirs of the West that had triumphed over the rest; we could step in and repair and regain what the Europeans had lost and thereby lead the world towards a democratic/capitalist utopia, once and for all. We had finally and definitively replaced Old Europe!

It is therefore with existential chagrin that we now confront the fact that we are as impotent to maintain the old European imperial structure as Old Europe itself. What dangers this humiliation may bring remain to be seen.

Silent Spring: The Reckoning

Rachel Carson’s Silent Spring, a prophetic warning of the deleterious effects of pesticides such as DDT on the environment, was published in 1962. The book warned that the widespread use of pesticides was devastating bird populations, and that if such use were not eliminated or reduced, many species would become extinct. Carson detailed how DDT caused birds to lay eggs with shells so thin and fragile that they broke before the embryos could develop into live chicks; birds of prey were especially affected because, in their role as top predators, DDT became more concentrated in their bodies. At the time of publication, bald eagles had declined to near extinction because of the thin-shelled egg problem. Fortunately, despite heavy criticism from vested interests, Carson’s message was heard, DDT was banned, and the bald eagle has recovered, as have other raptors.

One would hope that the lesson had been learned and that similar mistakes would no longer be made. But nothing of the kind has in fact happened, despite all the earth days, demonstrations, supposed regulations, and lip service. A particularly striking and pertinent example of our failure to practice what we preach is the impending fate of the Monarch butterfly, that wonder of the insect world. This remarkable creature spends the summer months spread out in the northern United States (primarily in the upper Midwest) and southern Canada and winters concentrated in its millions in a small area of central Mexico. Even more amazing, this yearly migration covers multiple generations of the species, so that the butterflies that leave Mexico in the spring are not the same individuals who arrive in the north weeks later (they reproduce on the way), and a new generation leaves the north to return to Mexico in the fall. Yet they return to the same groves that their great-great-grandparents left months earlier!

Alas, the Monarch has one trait that has long served it well but which is now its Achilles’ heel: the butterflies lay their eggs on, and their caterpillars eat, only milkweed. They absorb the milkweed’s nasty-tasting compounds, rendering them unpalatable to insect-eating birds, which protects them from predation on their long, multigenerational migrations. Should the milkweed decline or disappear, so too will the Monarch.

Which is exactly what appears to be happening. Scientists and amateurs alike have noted a steady decline in the numbers of Monarchs gathering each year in Mexico (the best place to get a handle on their numbers), and this year (2013) the population has declined precipitously. According to a recent article in the New York Times, in 2012 the number of butterflies at the Mexican wintering site was approximately 60 million, itself a decline from previous years; but this year, only 2 million have shown up, and they showed up a week late (more on the implications of this fact later). Imagine if the human population had dropped from its current 7 billion to less than 300 million in just one year.
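Incidentally, the analogy holds up arithmetically. Here is a quick sanity check in Python, assuming the 60 million and 2 million counts quoted above:

    # Apply the Monarch crash proportionally to the human population.
    before, after = 60_000_000, 2_000_000      # Mexico wintering counts
    fraction_left = after / before             # about 0.033
    humans_now = 7_000_000_000
    print(f"{fraction_left:.1%} remain")           # 3.3% remain
    print(f"{humans_now * fraction_left:,.0f}")    # 233,333,333 people

Two hundred thirty-three million is indeed “less than 300 million.”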

The most likely cause of this decline is the rapid disappearance of milkweed along the routes followed by the butterflies as they move north and south in their annual journeys. The American Midwest, that famous breadbasket to the world, is increasingly covered with corn and soybean fields, a large percentage of which are planted with so-called “Roundup Ready” varieties, i.e., varieties that have been genetically engineered to resist glyphosate, the active ingredient in Roundup brand herbicide. Milkweed and other native species are not engineered to resist that poison, so they die while the corn and soybeans prosper. With insufficient milkweed available on which to lay their eggs, Monarchs cannot renew their numbers, so they also die.

Likely compounding the problem is global warming. Canadian scientists have observed that many Monarchs are migrating further north than in the past, well past the natural range of the milkweed. While adults can feed on the flowers of other species, they can lay their eggs, and the caterpillars can dine, only on milkweed. Thus those Monarchs who went too far north (probably because of temperature) could not successfully reproduce. Warming may also explain why Monarchs arrived a week late in Mexico.

The phenomenon of crops genetically engineered to resist manmade herbicides is an example of System run amok. System operates on the erroneous belief that a “problem” is singular and that its solution is also singular. So, if “weeds” are “invading” your crops, getting rid of them will take care of that problem. (Note: How can native species “invade” fields of non-native, artificial varieties? Aren’t the corn and soybeans invading the territory of the native species? Aren’t corn and soybeans therefore the true weeds?) How very ironic that our capitalist system seems to be imitating a communist dictator: Chairman Mao once ordered that all sparrows be killed because they stole grain; consequently, crop-eating insects increased in numbers so sharply that he ordered widespread spraying of insecticides. Result: the elimination of pollinating species, particularly honeybees. If the Monarch is in such dire straits, are not other, likely beneficial species along its route also threatened? At the same time, some not-so-beneficial “weed” species are developing resistance to glyphosate, and it is likely that in the not-so-distant future, glyphosate herbicides will be rendered useless while some other pestilence discovers the vulnerabilities of genetically engineered crops. Thus the solution will turn out to be yet another of mankind’s many self-made problems.


Against System

As many people know, the world’s domestic honeybees are seriously threatened by a condition called colony collapse disorder, in which once healthy colonies suddenly disappear altogether, virtually overnight. Many different possible causes are generally cited, including mites, malnutrition, viral diseases, pesticides, and so forth, and it is routine to call for “more research.” Don’t ya just love it? Whenever we don’t want to face the truth, we call for “more research.” Yet, as a recent documentary on the subject demonstrates, more research is not needed. The reason for colony collapse is crystal clear: we have turned living organisms, the bees, into industrial units, mere nuts and bolts moving along a factory line, without regard to the fact that they are living organisms—much as we have other animals, such as cattle, chickens, hogs, and fish.


Don’t Touch My Stuff: Archeologists vs Looters

Looters. Graverobbers. Criminals. Black marketeers. Such are the epithets attached to those who locate, dig, and pillage artifacts from ancient graves, tombs, and assorted ruins, for motives (greed, ignorance, a mania for collecting) imputed to them by others. And it is true that everywhere that antiquities can be found (which is almost anywhere), private persons do dig up and sell objects from ancient sites.

Most countries have strict laws that forbid such private digging and regulate the selling and exporting of antiquities; many items more or less legally (if not morally) acquired in the past have been repatriated, especially since the international agreements of the 1970s codified the standards for the trade. The Getty Museum is just one among many Western museums that have had to return iconic objects to their countries of origin. (Worth mentioning are the very similar laws regarding the poaching and sale of exotic animals.) The rationales for these laws and agreements have much to do with national identity, anticolonialism, tourism dollars, and even scholarship.

When objects are pulled willy-nilly from the ground, with no record of the context in which they were found (at what depth, associated with which king, bundled with what other items, etc.), information as to age and function is lost to archeologists, and therefore to the rest of us who might be interested in the archeologists’ findings. Particularly in today’s globalized world, these objects and sites are part of the heritage of the whole human race and belong to all of us.

Taking these points into account, it seems reasonable and just to protect our human heritage through law and to bring the full force of the justice and police system down on the looters and grave robbers. This seems so obvious.

But that which seems obvious may simply be prejudice or self-interest, so let us consider the motives of the looters themselves. That is, the often poverty-stricken peasants who, upon locating a cache of ancient and often very beautiful objects, remove those objects and carry them away to be sold to dealers, who in turn will sell them to collectors who want them as objects of interest and beauty for their own enjoyment. The collectors may (or may not) be motivated by greed (they could also be motivated by genuine interest or a love of art), but the peasants are motivated by the needs of poverty. It is of little interest to them that somewhere in America, Europe, or Japan there are disinterested scholars eager to excavate these sites properly and to display the prime objects in cosmopolitan museums (the rest of the objects being cataloged and stored in university basements, unlikely ever to see the light of day again), nor does it interest them that said scholars will present their findings at posh conferences and in peer-reviewed journals. What they want is to feed their families, build a better house, or send a child to school.

But such peasants not only have no money, they have no power, yet they are well aware of who does, and they are not incorrect in believing that the academics periodically swarming over their territories represent those who do have power. They might (certainly in their own minds) wonder why they cannot dig and remove while the men and women in khaki can. Why, they might ask, is what we do called looting and grave robbing while what the professors and their students do is not? Have not in both cases the graves of the ancients been opened and emptied of their contents? Have not the professors gained wealth and prestige from this activity?

I remember what a leading mullah said some years ago, well before the United States invaded Afghanistan, about the Western world’s lamentations over the destruction of the Bamiyan Buddhas: that we had never shown such concern for the hunger of the Afghan people. Wrong as it was to destroy those unique and irreplaceable monuments to Afghanistan’s past, I couldn’t help but think he had a point; the United States had used the Afghans to fight a proxy war with the Soviet Union, and once that was concluded to our satisfaction, we abandoned their country. Now we seem to be doing that again. Can it be that loss of life bothers us less than the loss of statues?

“Education” is often used to justify what might otherwise be recognized as exploitation, self-interest, and cruelty. Consider that marine mammal theme parks promote themselves as educational (especially for children, who may be too naïve to notice the hypocrisy) to justify the corralling of whales and porpoises who, in their natural setting (not “the wild,” since it is not the wild to them), range for thousands of miles; or that zoos, which are anything but natural settings, promote themselves as not only educating the public but also preserving endangered animals. Meanwhile, the societies in which these amusement parks and zoos are embedded continue to degrade these creatures’ natural environments, thus hastening their demise and, not so incidentally, making the zoos even more “necessary.”

That college students and, occasionally, the general public are “educated” about ancient peoples through popular books and exhibitions may well be true, but mostly such beneficiaries are the already prosperous urbanites of populous consumer societies who have the time and inclination for such hobbies, and money to spend in museum stores. It is for these privileged ones that the graves and tombs of the ancients are pillaged by the archeologists. Not to mention that there is money to be made by documentary film-makers working for, say, the Discovery or National Geographic channels, as well as scholarly careers, reputations, and tenured professorships.

Basically, fundamentally, opening an ancient tomb, removing the objects and bodies it contains, and transporting them elsewhere is desecration. The person or persons buried there meant to stay there; they did not go to their graves with the hope that hundreds or thousands of years later they would be exhumed by curious or greedy people; they did not think of their graves as “time capsules.” The reasons for disturbing their final resting places would make little difference to them. There is no a priori justification for the greed, the love of beauty, or the thirst for knowledge that motivates the desecrator.

I admit to being fascinated by the discoveries of archeology, especially by what they tell us of the ways ancient people understood themselves and the world, but I also admit I cannot think of a truly objective basis for holding that the satisfaction of such curiosity trumps all other considerations. The priority of such academic curiosity has been established by an elite that has power, money, ideology, ego, and politics on its side.

Perhaps Native Americans have the moral high ground on this issue. In recent decades, Native American tribes have demanded that objects and human remains removed from their ancestral lands and deposited in academic storage rooms and museums be returned to them for reburial according to their own rituals, and they have increasingly won their point.

(Books that document the seamier side of archeology include two by Cathy Gere, The Tomb of Agamemnon and Knossos and the Prophets of Modernism, and Jo Marchant’s recent book The Shadow King: The Bizarre Afterlife of King Tut’s Mummy.)