
The Oil We Eat

Following the food chain back to Iraq

by Richard Manning

Harper’s Magazine (February 2004)

“The secret of great wealth with no obvious source is some forgotten crime, forgotten because it was done neatly”. – Balzac

The journalist’s rule says: follow the money. This rule, however, is not really axiomatic but derivative, in that money, as even our vice president will tell you, is really a way of tracking energy. We’ll follow the energy.

We learn as children that there is no free lunch, that you don’t get something from nothing, that what goes up must come down, and so on. The scientific version of these verities is only slightly more complex. As James Prescott Joule discovered in the nineteenth century, there is only so much energy. You can change it from motion to heat, from heat to light, but there will never be more of it and there will never be less of it. The conservation of energy is not an option, it is a fact. This is the first law of thermodynamics.

Special as we humans are, we get no exemptions from the rules. All animals eat plants or eat animals that eat plants. This is the food chain, and pulling it is the unique ability of plants to turn sunlight into stored energy in the form of carbohydrates, the basic fuel of all animals. Solar-powered photosynthesis is the only way to make this fuel. There is no alternative to plant energy, just as there is no alternative to oxygen. The results of taking away our plant energy may not be as sudden as cutting off oxygen, but they are as sure.

Scientists have a name for the total amount of plant mass created by Earth in a given year, the total budget for life. They call it the planet’s “primary productivity”. There have been two efforts to figure out how that productivity is spent, one by a group at Stanford University, the other an independent accounting by the biologist Stuart Pimm. Both conclude that we humans, a single species among millions, consume about forty percent of Earth’s primary productivity, forty percent of all there is. This simple number may explain why the current extinction rate is 1,000 times that which existed before human domination of the planet. We six billion have simply stolen the food, the rich among us a lot more than others.

Energy cannot be created or canceled, but it can be concentrated. This is the larger and profoundly explanatory context of a national-security memo George Kennan wrote in 1948 as the head of a State Department planning committee, ostensibly about Asian policy but really about how the United States was to deal with its newfound role as the dominant force on Earth. “We have about fifty percent of the world’s wealth but only 6.3 percent of its population”, Kennan wrote. “In this situation, we cannot fail to be the object of envy and resentment. Our real task in the coming period is to devise a pattern of relationships which will permit us to maintain this position of disparity without positive detriment to our national security. To do so, we will have to dispense with all sentimentality and day-dreaming; and our attention will have to be concentrated everywhere on our immediate national objectives. We need not deceive ourselves that we can afford today the luxury of altruism and world-benefaction.”

“The day is not far off”, Kennan concluded, “when we are going to have to deal in straight power concepts”.

If you follow the energy, eventually you will end up in a field somewhere. Humans engage in a dizzying array of artifice and industry. Nonetheless, more than two thirds of humanity’s cut of primary productivity results from agriculture, two thirds of which in turn consists of three plants: rice, wheat, and corn. In the 10,000 years since humans domesticated these grains, their status has remained undiminished, most likely because they are able to store solar energy in uniquely dense, transportable bundles of carbohydrates. They are to the plant world what a barrel of refined oil is to the hydrocarbon world. Indeed, aside from hydrocarbons they are the most concentrated form of true wealth – sun energy – to be found on the planet.

As Kennan recognized, however, the maintenance of such a concentration of wealth often requires violent action. Agriculture is a recent human experiment. For most of human history, we lived by gathering or killing a broad variety of nature’s offerings. Why humans might have traded this approach for the complexities of agriculture is an interesting and long-debated question, especially because the skeletal evidence clearly indicates that early farmers were more poorly nourished, more disease-ridden and deformed, than their hunter-gatherer contemporaries. Farming did not improve most lives. The evidence that best points to the answer, I think, lies in the difference between early agricultural villages and their pre-agricultural counterparts – the presence not just of grain but of granaries and, more tellingly, of just a few houses significantly larger and more ornate than all the others attached to those granaries. Agriculture was not so much about food as it was about the accumulation of wealth. It benefited some humans, and those people have been in charge ever since.

Domestication was also a radical change in the distribution of wealth within the plant world. Plants can spend their solar income in several ways. The dominant and prudent strategy is to allocate most of it to building roots, stem, bark – a conservative portfolio of investments that allows the plant to better gather energy and survive the downturn years. Further, by living in diverse stands (a given chunk of native prairie contains maybe 200 species of plants), these perennials provide services for one another, such as retaining water, protecting one another from wind, and fixing free nitrogen from the air to use as fertilizer. Diversity allows a system to “sponsor its own fertility”, to use visionary agronomist Wes Jackson’s phrase. This is the plant world’s norm.

There is a very narrow group of annuals, however, that grow in patches of a single species and store almost all of their income as seed, a tight bundle of carbohydrates easily exploited by seed eaters such as ourselves. Under normal circumstances, this eggs-in-one-basket strategy is a dumb idea for a plant. But not during catastrophes such as floods, fires, and volcanic eruptions. Such catastrophes strip established plant communities and create opportunities for wind-scattered entrepreneurial seed bearers. It is no accident that no matter where agriculture sprouted on the globe, it always happened near rivers. You might assume, as many have, that this is because the plants needed the water or nutrients. Mostly this is not true. They needed the power of flooding, which scoured landscapes and stripped out competitors. Nor is it an accident, I think, that agriculture arose independently and simultaneously around the globe just as the last ice age ended, a time of enormous upheaval when glacial melt let loose sea-size lakes to create tidal waves of erosion. It was a time of catastrophe.

Corn, rice, and wheat are especially adapted to catastrophe. It is their niche. In the natural scheme of things, a catastrophe would create a blank slate, bare soil, that was good for them. Then, under normal circumstances, succession would quickly close that niche. The annuals would colonize. Their roots would stabilize the soil, accumulate organic matter, provide cover. Eventually the catastrophic niche would close. Farming is the process of ripping that niche open again and again. It is an annual artificial catastrophe, and it requires the equivalent of three or four tons of TNT per acre for a modern American farm. Iowa’s fields require the energy of 4,000 Nagasaki bombs every year.

Iowa is almost all fields now. Little prairie remains, and if you can find what Iowans call a “postage stamp” remnant of some, it most likely will abut a cornfield. This allows an observation. Walk from the prairie to the field, and you probably will step down about six feet, as if the land had been stolen from beneath you. Settlers’ accounts of the prairie conquest mention a sound, a series of pops, like pistol shots, the sound of stout grass roots breaking before a moldboard plow. A robbery was in progress.

When we say the soil is rich, it is not a metaphor. It is as rich in energy as an oil well. A prairie converts that energy to flowers and roots and stems, which in turn pass back into the ground as dead organic matter. The layers of topsoil build up into a rich repository of energy, a bank. A farm field appropriates that energy, puts it into seeds we can eat. Much of the energy moves from the earth to the rings of fat around our necks and waists. And much of the energy is simply wasted, a trail of dollars billowing from the burglar’s satchel.

I’ve already mentioned that we humans take forty percent of the globe’s primary productivity every year. You might have assumed we and our livestock eat our way through that volume, but this is not the case. Part of that total – almost a third of it – is the potential plant mass lost when forests are cleared for farming or when tropical rain forests are cut for grazing or when plows destroy the deep mat of prairie roots that held the whole business together, triggering erosion. The Dust Bowl was no accident of nature. A functioning grassland prairie produces more biomass each year than does even the most technologically advanced wheat field. The problem is, it’s mostly a form of grass and grass roots that humans can’t eat. So we replace the prairie with our own preferred grass, wheat. Never mind that we feed most of our grain to livestock, and that livestock is perfectly content to eat native grass. And never mind that there likely were more bison produced naturally on the Great Plains before farming than beef farming raises in the same area today. Our ancestors found it preferable to pluck the energy from the ground and when it ran out move on.

Today we do the same, only now when the vault is empty we fill it again with new energy in the form of oil-rich fertilizers. Oil is annual primary productivity stored as hydrocarbons, a trust fund of sorts, built up over many thousands of years. On average, it takes 5.5 gallons of fossil energy to restore a year’s worth of lost fertility to an acre of eroded land – in 1997 we burned through more than 400 years’ worth of ancient fossilized productivity, most of it from someplace else. Even as the earth beneath Iowa shrinks, it is being globalized.

Six thousand years before sodbusters broke up Iowa, their Caucasian blood ancestors broke up the Hungarian plain, an area just northwest of the Caucasus Mountains. Archaeologists call this tribe the LBK, short for Linearbandkeramik, the German word that describes the distinctive pottery remnants that mark their occupation of Europe. Anthropologists call them the wheat-beef people, a name that better connects those ancients along the Danube to my fellow Montanans on the Upper Missouri River. These proto-Europeans had a full set of domesticated plants and animals, but wheat and beef dominated. All the domesticates came from an area along what is now the Iraq-Syria-Turkey border at the edges of the Zagros Mountains. This is the center of domestication for the Western world’s main crops and livestock, ground zero of catastrophic agriculture.

Two other types of catastrophic agriculture evolved at roughly the same time, one centered on rice in what is now China and India and one centered on corn and potatoes in Central and South America. Rice, though, is tropical and its expansion depends on water, so it developed only in floodplains, estuaries, and swamps. Corn agriculture was every bit as voracious as wheat; the Aztecs could be as brutal and imperialistic as Romans or Brits, but the corn cultures collapsed with the onslaught of Spanish conquest. Corn itself simply joined the wheat-beef people’s coalition. Wheat was the empire builder; its bare botanical facts dictated the motion and violence that we know as imperialism.

The wheat-beef people swept across the western European plains in less than 300 years, a conquest some archaeologists refer to as a “blitzkrieg”. A different race of humans, the Cro-Magnons – hunter-gatherers, not farmers – lived on those plains at the time. Their cave art at places such as Lascaux testifies to their sophistication and profound connection to wildlife. They probably did most of their hunting and gathering in uplands and river bottoms, places the wheat farmers didn’t need, suggesting the possibility of coexistence. That’s not what happened, however. Both genetic and linguistic evidence say that the farmers killed the hunters. The Basque people are probably the lone remnant descendants of Cro-Magnons, the only trace.

Hunter-gatherer archaeological sites of the period contain spear points that originally belonged to the farmers, and we can guess they weren’t trade goods. One group of anthropologists concludes, “The evidence from the western extension of the LBK leaves little room for any other conclusion but that LBK-Mesolithic interactions were at best chilly and at worst hostile”. The world’s surviving Blackfeet, Assiniboine Sioux, Inca, and Maori probably have the best idea of the nature of these interactions.

Wheat is temperate and prefers plowed-up grasslands. The globe has a limited stock of temperate grasslands, just as it has a limited stock of all other biomes. On average, about ten percent of all other biomes remain in something like their native state today. Only one percent of temperate grasslands remains undestroyed. Wheat takes what it needs.

The supply of temperate grasslands lies in what are today the United States, Canada, the South American pampas, New Zealand, Australia, South Africa, Europe, and the Asiatic extension of the European plain into the sub-Siberian steppes. This area largely describes the First World, the developed world. Temperate grasslands make up not only the habitat of wheat and beef but also the globe’s islands of Caucasians, of European surnames and languages. In 2000 the countries of the temperate grasslands, the neo-Europes, accounted for about eighty percent of all wheat exports in the world, and about 86 percent of all corn. That is to say, the neo-Europes drive the world’s agriculture. The dominance does not stop with grain. These countries, plus the mothership – Europe – accounted for three fourths of all agricultural exports of all crops in the world in 1999.

Plato wrote of his country’s farmlands:

What now remains of the formerly rich land is like the skeleton of a sick man … Formerly, many of the mountains were arable. The plains that were full of rich soil are now marshes. Hills that were once covered with forests and produced abundant pasture now produce only food for bees. Once the land was enriched by yearly rains, which were not lost, as they are now, by flowing from the bare land into the sea. The soil was deep, it absorbed and kept the water in loamy soil, and the water that soaked into the hills fed springs and running streams everywhere. Now the abandoned shrines at spots where formerly there were springs attest that our description of the land is true.

Plato’s lament is rooted in wheat agriculture, which depleted his country’s soil and subsequently caused the series of declines that pushed centers of civilization to Rome, Turkey, and western Europe. By the fifth century, though, wheat’s strategy of depleting and moving on ran up against the Atlantic Ocean. Fenced-in wheat agriculture is like rice agriculture. It balances its equations with famine. In the millennium between 500 and 1500, Britain suffered a major “corrective” famine about every ten years; there were seventy-five in France during the same period. The incidence, however, dropped sharply when colonization brought an influx of new food to Europe.

The new lands had an even greater effect on the colonists themselves. Thomas Jefferson, after enduring a lecture on the rustic nature of Americans from his hosts at a dinner party in Paris, pointed out that all of the Americans present were a good head taller than all of the French. Indeed, colonists in all of the neo-Europes enjoyed greater stature and longevity, as well as a lower infant-mortality rate – all indicators of the better nutrition afforded by the onetime spend-down of the accumulated capital of virgin soil.

The precolonial famines of Europe raised the question: What would happen when the planet’s supply of arable land ran out? We have a clear answer. In about 1960 expansion hit its limits and the supply of unfarmed, arable lands came to an end. There was nothing left to plow. What happened was grain yields tripled.

The accepted term for this strange turn of events is the green revolution, though it would be more properly labeled the amber revolution, because it applied exclusively to grain – wheat, rice, and corn. Plant breeders tinkered with the architecture of these three grains so that they could be hypercharged with irrigation water and chemical fertilizers, especially nitrogen. This innovation meshed nicely with the increased “efficiency” of the industrialized factory-farm system. With the possible exception of the domestication of wheat, the green revolution is the worst thing that has ever happened to the planet.

For openers, it disrupted long-standing patterns of rural life worldwide, moving a lot of no-longer-needed people off the land and into the world’s most severe poverty. The experience in population control in the developing world is by now clear: It is not that people make more people so much as it is that they make more poor people. In the forty-year period beginning about 1960, the world’s population doubled, adding virtually the entire increase of three billion to the world’s poorest classes, the most fecund classes. The way in which the green revolution raised that grain contributed hugely to the population boom, and it is the weight of the population that leaves humanity in its present untenable position.

Discussion of these, the most poor, however, is largely irrelevant to the American situation. We say we have poor people here, but almost no one in this country lives on less than one dollar a day, the global benchmark for poverty. It marks off a class of about 1.3 billion people, the hard core of the larger group of two billion chronically malnourished people – that is, one third of humanity. We may forget about them, as most Americans do.

More relevant here are the methods of the green revolution, which added orders of magnitude to the devastation. By mining the iron for tractors, drilling the new oil to fuel them and to make nitrogen fertilizers, and by taking the water that rain and rivers had meant for other lands, farming had extended its boundaries, its dominion, to lands that were not farmable. At the same time, it extended its boundaries across time, tapping fossil energy, stripping past assets.

The common assumption these days is that we muster our weapons to secure oil, not food. There’s a little joke in this. Ever since we ran out of arable land, food is oil. Every single calorie we eat is backed by at least a calorie of oil, more like ten. In 1940 the average farm in the United States produced 2.3 calories of food energy for every calorie of fossil energy it used. By 1974 (the last year in which anyone looked closely at this issue), that ratio was 1:1. And this understates the problem, because at the same time that there is more oil in our food there is less oil in our oil. A couple of generations ago we spent a lot less energy drilling, pumping, and distributing than we do now. In the 1940s we got about 100 barrels of oil back for every barrel of oil we spent getting it. Today each barrel invested in the process returns only ten, a calculation that no doubt fails to include the fuel burned by the Hummers and Blackhawks we use to maintain access to the oil in Iraq.
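The two ratios in this paragraph compound: the fossil energy in our food and the fossil energy spent getting the oil itself. A minimal sketch of that compounding, using the essay's figures (2.3:1 in 1940, 1:1 by 1974, oil EROI falling from roughly 100:1 to 10:1). The accounting convention – counting the extraction overhead implied by an EROI figure – is my assumption, not the essay's calculation.

```python
# Sketch of how farm-gate energy ratios compound with the falling
# energy return on oil extraction (EROI). Figures from the essay;
# the wellhead accounting is an illustrative assumption.

def wellhead_cost(delivered_calories, eroi):
    """Fossil calories that must be extracted to deliver a given
    net amount, counting the energy spent on extraction itself."""
    # To deliver E net calories at an energy return of eroi:1,
    # you must extract E * eroi / (eroi - 1).
    return delivered_calories * eroi / (eroi - 1)

# 1940: one food calorie needed 1/2.3 fossil calories at the farm
# gate, with ~100:1 return on the oil itself.
total_1940 = wellhead_cost(1 / 2.3, 100)

# 1974: 1:1 at the farm gate, with ~10:1 return.
total_1974 = wellhead_cost(1.0, 10)

print(f"fossil calories extracted per food calorie, 1940: {total_1940:.2f}")
print(f"fossil calories extracted per food calorie, 1974: {total_1974:.2f}")
```

On these assumptions, the true fossil cost of a food calorie rose from under half a calorie to well over one – before counting processing and transport.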

David Pimentel, an expert on food and energy at Cornell University, has estimated that if all of the world ate the way the United States eats, humanity would exhaust all known global fossil-fuel reserves in just over seven years. Pimentel has his detractors. Some have accused him of being off on other calculations by as much as thirty percent. Fine. Make it ten years.

Fertilizer makes a pretty fine bomb right off the shelf, a chemistry lesson Timothy McVeigh taught at Oklahoma City’s Alfred P. Murrah Federal Building in 1995 – not a small matter, in that the green revolution has made nitrogen fertilizers ubiquitous in some of the more violent and desperate corners of the world. Still, there is more to contemplate in nitrogen’s less sensational chemistry.

The chemophobia of modern times excludes fear of the simple elements of chemistry’s periodic table. We circulate petitions, hold hearings, launch websites, and buy and sell legislators in regard to polysyllabic organic compounds – polychlorinated biphenyls, polyvinyls, DDT, 2,4-D, that sort of thing – not simple carbon or nitrogen. Not that agriculture’s use of the more ornate chemistry is benign – an infant born in a rural, wheat-producing county in the United States has about twice the chance of suffering birth defects as one born in a rural place that doesn’t produce wheat, an effect researchers blame on chlorophenoxy herbicides. Focusing on pesticide pollution, though, misses the worst of the pollutants. Forget the polysyllabic organics. It is nitrogen – the wellspring of fertility relied upon by every Eden-obsessed backyard gardener and suburban groundskeeper – that we should fear most.

Those who model our planet as an organism do so on the basis that the earth appears to breathe – it thrives by converting a short list of basic elements from one compound into the next, just as our own bodies cycle oxygen into carbon dioxide and plants cycle carbon dioxide into oxygen. In fact, two of the planet’s most fundamental humors are oxygen and carbon dioxide. Another is nitrogen.

Nitrogen can be released from its “fixed” state as a solid in the soil by natural processes that allow it to circulate freely in the atmosphere. This also can be done artificially. Indeed, humans now contribute more nitrogen to the nitrogen cycle than the planet itself does. That is, humans have doubled the amount of nitrogen in play.

This has led to an imbalance. It is easier to create nitrogen fertilizer than it is to apply it evenly to fields. When farmers dump nitrogen on a crop, much is wasted. It runs into the water and soil, where it either reacts chemically with its surroundings to form new compounds or flows off to fertilize something else, somewhere else.

That chemical reaction, called acidification, is noxious and contributes significantly to acid rain. One of the compounds produced by acidification is nitrous oxide, which aggravates the greenhouse effect. Green growing things normally offset global warming by sucking up carbon dioxide, but nitrogen on farm fields plus methane from decomposing vegetation make every farmed acre, like every acre of Los Angeles freeway, a net contributor to global warming. Fertilization is equally worrisome. Rainfall and irrigation water inevitably wash the nitrogen from fields to creeks and streams, which flow into rivers, which flood into the ocean. This explains why the Mississippi River, which drains the nation’s Corn Belt, is an environmental catastrophe. The nitrogen fertilizes artificially large blooms of algae that in growing suck all the oxygen from the water, a condition biologists call anoxia, which means “oxygen-depleted”. Here there’s no need to calculate long-term effects, because life in such places has no long term: everything dies immediately. The Mississippi River’s heavily fertilized effluvia have created a dead zone in the Gulf of Mexico the size of New Jersey.

America’s biggest crop, grain corn, is completely unpalatable. It is raw material for an industry that manufactures food substitutes. Likewise, you can’t eat unprocessed wheat. You certainly can’t eat hay. You can eat unprocessed soybeans, but mostly we don’t. These four crops cover 82 percent of American cropland. Agriculture in this country is not about food; it’s about commodities that require the outlay of still more energy to become food.

About two thirds of US grain corn is labeled “processed”, meaning it is milled and otherwise refined for food or industrial uses. More than 45 percent of that becomes sugar, especially high-fructose corn sweeteners, the keystone ingredient in three quarters of all processed foods, especially soft drinks, the food of America’s poor and working classes. It is not a coincidence that the American pandemic of obesity tracks rather nicely with the fivefold increase in corn-syrup production since Archer Daniels Midland developed a high-fructose version of the stuff in the early seventies. Nor is it a coincidence that the plague selects the poor, who eat the most processed food.

It began with the industrialization of Victorian England. The empire was then flush with sugar from plantations in the colonies. Meantime the cities were flush with factory workers. There was no good way to feed them. And thus was born the afternoon tea break, the tea consisting primarily of warm water and sugar. If the workers were well off, they could also afford bread with heavily sugared jam – sugar-powered industrialization. There was a 500 percent increase in per capita sugar consumption in Britain between 1860 and 1890, around the time when the life expectancy of a male factory worker was seventeen years. By the end of the century the average Brit was getting about one sixth of his total nutrition from sugar, exactly the same percentage Americans get today – double what nutritionists recommend.

There is another energy matter to consider here, though. The grinding, milling, wetting, drying, and baking of a breakfast cereal requires about four calories of energy for every calorie of food energy it produces. A two-pound bag of breakfast cereal burns the energy of a half-gallon of gasoline in its making. All together the food-processing industry in the United States uses about ten calories of fossil-fuel energy for every calorie of food energy it produces.
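The cereal claim above can be checked on the back of an envelope. The calorie densities below are standard approximations, not figures from the essay; the point is only that the two quantities land in the same range.

```python
# Rough arithmetic check: does processing a 2-lb bag of cereal at
# 4 fossil calories per food calorie really approach the energy in
# a half-gallon of gasoline? Densities are common approximations.

CEREAL_KCAL_PER_GRAM = 3.8        # typical dry breakfast cereal
GASOLINE_KCAL_PER_GALLON = 31_000 # standard figure for motor gasoline
GRAMS_PER_POUND = 453.6

food_kcal = 2 * GRAMS_PER_POUND * CEREAL_KCAL_PER_GRAM  # 2-lb bag
process_kcal = 4 * food_kcal                            # the 4:1 ratio
half_gallon_kcal = GASOLINE_KCAL_PER_GALLON / 2

print(f"food energy in the bag:  {food_kcal:,.0f} kcal")
print(f"energy to process it:    {process_kcal:,.0f} kcal")
print(f"half-gallon of gasoline: {half_gallon_kcal:,.0f} kcal")
```

Roughly 14,000 kcal of processing energy against roughly 15,500 kcal in the gasoline – the comparison holds to within the precision of the inputs.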

That number does not include the fuel used in transporting the food from the factory to a store near you, or the fuel used by millions of people driving to thousands of super discount stores on the edge of town, where the land is cheap. It appears, however, that the corn cycle is about to come full circle. If a bipartisan coalition of farm-state lawmakers has its way – and it appears it will – we will soon buy gasoline containing twice as much fuel alcohol as it does now. Fuel alcohol already ranks second as a use for processed corn in the United States, just behind corn sweeteners. According to one set of calculations, we spend more calories of fossil-fuel energy making ethanol than we gain from it. The Department of Agriculture says the ratio is closer to a gallon and a quart of ethanol for every gallon of fossil fuel we invest. The USDA calls this a bargain, because gasohol is a “clean fuel”. This claim to cleanness is in dispute at the tailpipe level, and it certainly ignores the dead zone in the Gulf of Mexico, pesticide pollution, and the haze of global gases gathering over every farm field. Nor does this claim cover clean conscience; some still might be unsettled knowing that our SUVs’ demands for fuel compete with the poor’s demand for grain.

Green eaters, especially vegetarians, advocate eating low on the food chain, a simple matter of energy flow. Eating a carrot gives the diner all that carrot’s energy, but feeding carrots to a chicken, then eating the chicken, reduces the energy by a factor of ten. The chicken wastes some energy, stores some as feathers, bones, and other inedibles, and uses most of it just to live long enough to be eaten. As a rough rule of thumb, that factor of ten applies to each level up the food chain, which is why some fish, such as tuna, can be a horror in all of this. Tuna is a secondary predator, meaning it not only doesn’t eat plants but eats other fish that themselves eat other fish, adding a zero to the multiplier each notch up, easily a hundred times, more like a thousand times less efficient than eating a plant.
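The factor-of-ten rule in this paragraph is the standard ecological rule of thumb of roughly ten percent energy transfer per trophic level. A minimal sketch, assuming that flat 10 percent efficiency at every step (real food chains vary):

```python
# The trophic-level arithmetic from the paragraph above, as a power
# law. The flat 10% transfer efficiency is the usual rule of thumb,
# not a measured value for any particular food chain.

def plant_calories_needed(levels, efficiency=0.1):
    """Plant calories fixed by photosynthesis to deliver one calorie
    of food at the given trophic level (0 = eating the plant itself)."""
    return (1 / efficiency) ** levels

print(plant_calories_needed(0))  # the carrot itself
print(plant_calories_needed(1))  # a chicken fed on plants
print(plant_calories_needed(2))  # a fish that eats plant-eating fish
print(plant_calories_needed(3))  # a secondary predator such as tuna
```

Each step up the chain adds a zero, which is why the essay puts tuna at a hundred to a thousand times the plant-energy cost of eating the plant directly.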

This is fine as far as it goes, but the vegetarian’s case can break down on some details. On the moral issues, vegetarians claim their habits are kinder to animals, though it is difficult to see how wiping out 99 percent of wildlife’s habitat, as farming has done in Iowa, is a kindness. In rural Michigan, for example, the potato farmers have a peculiar tactic for dealing with the predations of whitetail deer. They gut-shoot them with small-bore rifles, in hopes the deer will limp off to the woods and die where they won’t stink up the potato fields.

Animal rights aside, vegetarians can lose the edge in the energy argument by eating processed food, with its ten calories of fossil energy for every calorie of food energy produced. The question, then, is: Does eating processed food such as soy burger or soy milk cancel the energy benefits of vegetarianism, which is to say, can I eat my lamb chops in peace? Maybe. If I’ve done my due diligence, I will have found out that the particular lamb I am eating was both local and grass-fed, two factors that of course greatly reduce the embedded energy in a meal. I know of ranches here in Montana, for instance, where sheep eat native grass under closely controlled circumstances – no farming, no plows, no corn, no nitrogen. Assets have not been stripped. I can’t eat the grass directly. This can go on. There are little niches like this in the system. Each person’s individual charge is to find such niches.

Chances are, though, any meat eater will come out on the short end of this argument, especially in the United States. Take the case of beef. Cattle are grazers, so in theory could live like the grass-fed lamb. Some cattle cultures – those of South America and Mexico, for example – have perfected wonderful cuisines based on grass-fed beef. This is not our habit in the United States, and it is simply a matter of habit. Eighty percent of the grain the United States produces goes to livestock. Seventy-eight percent of all of our beef comes from feedlots, where the cattle eat grain, mostly corn and wheat. So do most of our hogs and chickens. The cattle spend their adult lives packed shoulder to shoulder in a space not much bigger than their bodies, up to their knees in shit, being stuffed with grain and a constant stream of antibiotics to prevent the disease this sort of confinement invariably engenders. The manure is rich in nitrogen and once provided a farm’s fertilizer. The feedlots, however, are now far removed from farm fields, so it is simply not “efficient” to haul it to cornfields. It is waste. It exhales methane, a global-warming gas. It pollutes streams. It takes thirty-five calories of fossil fuel to make a calorie of beef this way; sixty-eight to make one calorie of pork.

Still, these livestock do something we can’t. They convert grain’s carbohydrates to high-quality protein. All well and good, except that per capita protein production in the United States is about double what an average adult needs per day. Excess cannot be stored as protein in the human body but is simply converted to fat. This is the end result of a factory-farm system that appears as a living, continental-scale monument to Rube Goldberg, a black-mass remake of the loaves-and-fishes miracle. Prairie’s productivity is lost for grain, grain’s productivity is lost in livestock, livestock’s protein is lost to human fat – all federally subsidized for about $15 billion a year, two thirds of which goes directly to only two crops, corn and wheat.

This explains why the energy expert David Pimentel is so worried that the rest of the world will adopt America’s methods. He should be, because the rest of the world is. Mexico now feeds forty-five percent of its grain to livestock, up from five percent in 1960. Egypt went from three percent to thirty-one percent in the same period, and China, with a sixth of the world’s population, has gone from eight percent to twenty-six percent. All of these places have poor people who could use the grain, but they can’t afford it.

I live among elk and have learned to respect them. One moonlit night during the dead of last winter, I looked out my bedroom window to see about twenty of them grazing a plot of grass the size of a living room. Just that small patch among acres of other species of native prairie grass. Why that species and only that species of grass that night in the worst of winter when the threat to their survival was the greatest? What magic nutrient did this species alone contain? What does a wild animal know that we don’t? I think we need this knowledge.

Food is politics. That being the case, I voted twice in 2002. The day after Election Day, in a truly dismal mood, I climbed the mountain behind my house and found a small herd of elk grazing native grasses in the morning sunlight. My respect for these creatures over the years has become great enough that on that morning I did not hesitate but went straight to my job, which was to rack a shell and drop one cow elk, my household’s annual protein supply. I voted with my weapon of choice – an act not all that uncommon in this world, largely, I think, as a result of the way we grow food. I can see why it is catching on. Such a vote has a certain satisfying heft and finality about it. My particular bit of violence, though, is more satisfying, I think, than the rest of the globe’s ordinary political mayhem. I used a rifle to opt out of an insane system. I killed, but then so did you when you bought that package of burger, even when you bought that package of tofu burger. I killed, then the rest of those elk went on, as did the grasses, the birds, the trees, the coyotes, mountain lions, and bugs, the fundamental productivity of an intact natural system, all of it went on.

Richard Manning is the author of Against the Grain: How Agriculture Has Hijacked Civilization, published by North Point Press.

This is The Oil We Eat, a feature, originally from February 2004, published July 23 2004. It is part of Features, which is part of Harpers.org.

http://harpers.org/TheOilWeEat.html

Bill Totten http://www.ashisuto.co.jp/english/index.html

Categories: Uncategorized

>Slander? She Wrote the Book

2006/02/27

>How Ann Coulter Gets Away With Defaming Liberals

by Ted Rall

http://www.tedrall.com (February 21 2006)

My utterances occasionally spark controversy but I’ve got nothing on Ann Coulter. The star Republican pundit, who has spewed more racist, offensive and defamatory slurs in a week than Louis Farrakhan and Pat Robertson have in their whole lives combined, has turned slander and threats of violence into a cottage industry.

Coulter thinks the nation’s top journalists deserve to die. “My only regret with Timothy McVeigh”, Coulter sneered in reference to the Oklahoma City bomber, “is he did not go to the New York Times building”. After 9/11, she validated radical Islamists’ fear and hatred: “We should invade their countries, kill their leaders and convert them to Christianity”.

After he called for the assassination of the president of Venezuela, conservatives pressured Reverend Robertson to apologize. But when Coulter dropped the following three racist slurs and a fatwa on the Iranian president in a single paragraph of her syndicated column last week, no one blinked: “If you don’t want to get shot by the police, Mahmoud Ahmadinejad, then don’t point a toy gun at them. Or, as I believe our motto should be after 9/11: Jihad monkey talks tough; jihad monkey takes the consequences. Sorry, I realize that’s offensive. How about ‘camel jockey’? What? Now what’d I say? Boy, you tent merchants sure are touchy. Grow up, would you?”

Senate Majority Leader Bill Frist was asked about Coulter’s use of the r-word – “ragheads” – to refer to Muslims. “I better not comment”, he said.

Threatening the life of a top government official violates federal law but it’s just another throwaway line for Coulter. “We need somebody to put rat poison in Justice [John Paul] Stevens’ creme brulee”, she said earlier this month. Imagine the outrage if a Democrat said the same thing about Antonin Scalia.

Coulter is the Republican Id, giving voice to ugly sentiments that other conservatives don’t dare express aloud for fear of public censure: Muslims are subhuman, torture is OK, all liberals are traitors. “Liberals hate America, they hate flag-wavers, they hate abortion opponents, they hate all religions except Islam, post 9/11. Even Islamic terrorists don’t hate America like liberals do”, she spat in 2002. For Coulter, there is never censure – only more royalties from her bestselling books.

Coulter is a nasty, foul-mouthed bigot – all of which, like my observation of same, is thankfully protected by the First Amendment. The same cannot be said, however, about her malicious lies about me.

In the same column as her aforementioned anti-Muslim slurs Coulter wrote: “Iran is led by a lunatic who makes a big point of denying the Holocaust. Indeed, in response to the Muhammad cartoons, one Iranian newspaper is soliciting cartoons about the Holocaust. (So far the only submissions have come from Ted Rall, Garry Trudeau and The New York Times.)”

This comment was apparently a rehash of a speech she delivered to the influential Conservative Political Action Committee (CPAC). “Iran is soliciting cartoons on the Holocaust”, she told a thousand-plus audience that included Dick Cheney. “So far, only Ted Rall, Garry Trudeau, and The New York Times have made submissions”. Let’s hope CPAC didn’t pay her for original material.

Coulter is entitled to her opinions, not to lie about the facts. I have not entered, nor do I intend to enter, Iran’s anti-Holocaust cartoon contest. And I don’t take kindly to being associated with the Iranian president’s comments denying Nazi atrocities. Yo, Ann: criticizing Bush doesn’t make me a neo-Nazi anti-Semitic Holocaust denier. In fact, I despise Bush precisely because his rise to power, love of violence and jingoism mirror those of the Third Reich.

Coulter’s defenders say she was “just joking”. Her enemies say they don’t take her seriously. But the content of her column, which references the Egyptian ferry disaster and the Danish Mohammed cartoon hubbub, with a central thesis that advocates “bombing Syria back to the stone age and then permanently disarming Iran”, is deadly serious. Which is exactly the way her readers, who sent me e-mails calling me an anti-Semite and anti-American traitor, took her lie about me.

“Iran’s cartoon contest is the sort of thing you’d expect Ted Rall to enter” would have qualified as (bad) satire. But she didn’t say that.

Canny marketing of Coulter’s sexuality has elevated her to alpha female status among a post-9/11 pack of right-wing attack dogs. These neo-McCarthyites (Sean Hannity, Michael Savage, Andrew Sullivan) think they can get away with saying anything, no matter how factually inaccurate, about their political opponents. And they’ve been right – because wimpy liberals refuse to stand up for themselves.

So far.

During the 1950s, when a delusional alcoholic named Joe McCarthy ruined careers and reputations by smearing liberal Democrats as traitors, Army lawyer Joseph Welch marked the beginning of America’s return to sanity by snapping at the thuggish senator on TV: “You have done enough. Have you no sense of decency, sir, at long last? Have you left no sense of decency?” (Coulter’s book Treason claims that McCarthy was a swell guy who was right all along, and that the Eisenhower Administration was full of Commies.)

We need another Joseph Welch. We must hold the neo-McCarthyite Coulter smear machine that slimed Max Cleland (“He didn’t give his limbs for his country, or leave them on the battlefield”, she said. “There was no bravery involved …”) accountable for its lies. It’s a matter of decency, honor and setting the record straight.

As far as I can tell, no one has ever sued Coulter for slander or libel. That may change. My attorney tells me I have an actionable claim on two counts, for both the CPAC speech and the column. It wouldn’t be an open-and-shut case, but there are precedents in my favor. Readers of my Rallblog have pledged nearly $9000 if I file such a lawsuit, but it would take several times that amount to keep fighting until I get my day in court. A deep-pocketed angel would make the difference, but there, alas, is the root of the trouble with the American left.

Right-wingers, unified and organized, always stand fast and dig deep to protect their ideas and their people. They even run interference for their dead; they bullied ABC into canceling a less than hagiographical film about Ronald Reagan. They celebrate extremists like Ann Coulter, inviting her to speak at conventions attended by their brightest lights. Democrats, on the other hand, keep their most articulate advocates like Howard Dean under wraps. Trivial differences of style and ideology become reasons for lefties not to help each other out when they’re attacked. So it will likely go once more, as Coulter gets away with yet another outrageous smear.

Copyright 2006 Ted Rall

http://www.uexpress.com/tedrall/


>The Paradox Of Pornography

>by Robert Jensen

ZNet Commentary (February 23 2006)

Pornography’s business has always been the exposure of women’s bodies for the pleasure of men, and that was readily evident at the annual Adult Entertainment Expo in Las Vegas last month.

But also exposed at the sex-industry gathering was the paradox of the pornography business at this particular moment: At the same time that the pornography industry and its products are more normalized than ever in the United States, the images they produce are more brutal and degrading toward women than ever. How can it be that a once-underground industry that lived at the margins of society has become mainstream, at precisely the same time that its sexual cruelty toward women is most pronounced?

The resolution of the paradox offers disturbing insights not just into the sexual ethics and gender politics of the United States, but into the underlying values of the entire society.

The AEE – which attracted 350 exhibitors to the Sands Expo Center, one of Las Vegas’ major convention facilities – is part industry-insider gathering and part public spectacle. About 18,000 fans, the vast majority of them men, paid $40 a day to wait in long lines to pick up autographs from their favorite women in pornography and be photographed next to them. While fans indulged their fantasies, pornography producers focused on deal-making, often sounding as if their business were no different from selling shoes. In seminars, industry experts talked about improving marketing and retailing practices to expand market share and increase profits.

On the convention floor, most everyone would have agreed with Paul Fishbein, president of Adult Video News, the trade magazine that sponsors the event: “[T]he industry is ready to serve the needs of adult retailers, as well as consumers that seek to celebrate their sexuality”.

And “celebrate” they do, with no questions asked. In Las Vegas, no one was discussing the social implications of the commodification of sexuality and intimacy in the 13,000 new pornographic videos and DVDs released in 2005. Questions about the effects of sexualizing male dominance in a $12 billion a year business were not on the table. This was a venue for self-indulgence, not self-reflection.

Pornography – though still resisted by some, from either a conservative/religious position or, on very different grounds, from a feminist point of view – has become just one more form of mass entertainment in a culture obsessively dedicated to the pleasure-without-thought-about-the-consequences principle. Not everyone likes it, but few see it as worth debating.

But the paradox remains: At the same time that it is more accepted, pornography’s content is becoming steadily more extreme. In the “gonzo” style (those films with no plot or characters, just straightforward sex on tape) that dominates the market, directors continue to push the edge, filming increasingly rougher sexual practices involving multiple penetrations of women by two or three men at a time, or oral sex designed to make a woman gag, while the language used to insult women during sex grows harsher. Since legal controls on pornography began loosening in the 1970s, pornographers have pushed the limits of sexualizing the denigration of women.

Though the pornography industry loves to talk about growing sales to women and the so-called “couples market”, men are still the vast majority of pornography consumers in the United States. Producers and distributors I interviewed at the convention all estimated their clientele was eighty to ninety percent men.

What do these men want to watch? It turns out they like viewing sexual acts that the majority of women do not want to perform in their lives. While there is no survey data about women’s preferences regarding multiple penetrations or gag-inducing sex, informal investigation suggests such things are not common in the day-to-day lives of most people and not sought after by most women.

So, how can we explain the paradox? People typically do not openly endorse cruelty or the degradation of women. Yet just as those features of pornography are more extensive and intense than ever, graphic sexually explicit material is more widely accepted than ever. How can a culture embrace images that violate its stated values? Wouldn’t a society that purports to be civilized reject sexual material that becomes evermore dismissive of the humanity of women? There are two potential explanations.

First, because of the way pornography works, most of the consumers don’t see the material as being saturated with cruelty or degradation; the sexual pleasure that pornography produces tends to derail critical viewing and thinking. When consumers are focused on the pleasure, the politics drop out of view. So, when fans I interviewed said they didn’t think the material they watched embodied male domination and female subordination, they likely were being honest. They don’t see it, because they are too absorbed in feeling the sexual pleasure to be thinking about such issues.

But some men are quite clear about the gender politics in pornography, and they like it. Most of the advertising for the gonzo style highlights the subordination of women – one company brags it is in the business of “degrading whores for your viewing pleasure” – which suggests that’s exactly what some men are looking for.

The second explanation is a painful reminder that, in fact, the United States is a nation that has no serious objection to cruelty and degradation. After all, there was no sustained, collective outrage over the revelations of systematic torture by US military forces, epitomized by the photos from Abu Ghraib in Iraq. One prominent right-wing commentator compared it favorably to fraternity hazing rituals, which is not entirely misguided – fraternity hazing is routinely cruel and degrading, albeit at a much lower level.

The United States is a society that uses brutal levels of military force, including the illegal targeting of civilian infrastructure (such as in the 1991 Gulf War, when power, sewage, and water facilities were targeted) and the routine use of weapons that military officials know kill large numbers of civilians (such as cluster bombs that continue to kill long after the conflict is over, as unexploded bombs detonate for years). The culture celebrates this as evidence of our benevolence as we “liberate” other countries.

The United States is a society that locks up more than two million people, a higher percentage of its population than any other country, disproportionately non-white. The everyday conditions under which many of those human beings are kept in this prison-industrial complex are so harsh and degrading that leading human-rights groups condemn US prison practices. The culture celebrates this as evidence of the superiority of our system of “justice”.

And the United States is a society that has built thousands of glittering temples to unsustainable levels of consumption – called shopping malls – in this wealthiest nation in history, while nearly half the world’s people live on less than $2 a day. The culture celebrates this state of affairs as the wondrous workings of the magical market.

So, there is no paradox in the mainstreaming of an intensely cruel pornography; pornographers aren’t a deviation from the norm. Their presence in the mainstream shouldn’t be surprising, because they represent mainstream values: The logic of domination and subordination that is central to patriarchy, nationalism, racism, and capitalism.

What pornography says about sexuality, intimacy, and gender politics in the contemporary United States is frightening. What it says about our entire society is even more disturbing.

Robert Jensen is a journalism professor at the University of Texas at Austin and a member of the board of the Third Coast Activist Resource Center http://thirdcoastactivist.org/. He is the author of The Heart of Whiteness: Race, Racism, and White Privilege and Citizens of the Empire: The Struggle to Claim Our Humanity. He can be reached at rjensen@uts.cc.utexas.edu.

http://www.zmag.org/sustainers/content/2006-02/12jensen.cfm


>Eating for Credit

>by Alice Waters

New York Times Op-Ed (February 24 2006)

Berkeley, California

It’s shocking that, because of the rise in Type 2 diabetes, experts say the children we’re raising now will probably die younger than their parents – the result of a disease that is largely preventable by diet and exercise. But in public schools these days, children all too often are learning neither to eat well nor to exercise.

Fifty years ago, we had a preview of today’s obesity crisis: a presidential council told us that America’s children weren’t fit – and we did something about it, at great expense. We built gymnasiums and tracks and playgrounds. We hired and trained teachers. We made physical education part of the curriculum from kindergarten through high school. Students were graded on their performance.

Universal physical education is a start, and it’s a shame that schools have been cutting back on recess and gym. But in a country where nine million children over six are obese, we need the diet part of the equation, too. It’s time for students to start getting credit for eating a good lunch.

I know from experience that teaching children about food changes their lives. I helped establish a gardening and cooking project in the public schools here in Berkeley called the Edible Schoolyard, and I’ve come to believe that lunch should be at the center of every school’s curriculum.

Schools should not just serve food; they should teach it in an interactive, hands-on way, as an academic subject. Children’s eating habits stay with them for the rest of their lives. The best way to defeat the obesity epidemic is to teach children about food – and thereby prevent them from ever becoming obese.

The trouble is that the shared family meal is now a rare experience for most youngsters, with only a third of married couples with children reporting regularly having dinner as a family. We have abdicated our responsibility to these children, placing their well-being in the hands of the fast-food industry, whose products – hamburgers, chicken nuggets, French fries – dominate school lunch programs.

Not only are our children eating this unhealthy food, they’re digesting the values that go with it: the idea that food has to be fast, cheap and easy; that abundance is permanent and effortless; that it doesn’t matter where food actually comes from. These values are changing us. As a nation, we need to take back responsibility for the health of not just our children, but also our culture.

Our program began at Martin Luther King Junior Middle School ten years ago, with a kitchen classroom and a garden full of fruits, vegetables and herbs. A cafeteria where students, faculty and staff members will eat together every day is under construction, and the Edible Schoolyard has become a model for a district-wide school lunch initiative.

At King School today, 1,000 children are involved in growing, preparing and sharing fresh food. These food-related activities are woven into the entire curriculum. Math classes measure garden beds. Science classes study drainage and soil erosion. History classes learn about pre-Columbian civilizations while grinding corn.

We’re not forcing them to eat their vegetables; we’re teaching them about the botany and history of those vegetables. We’re not scaring them with the health consequences of their eating habits; we’re engaging them in interactive education that brings them into a new relationship with food. Nothing less will change their behavior.

We can try to improve diets all we want by making school lunches more nutritious and by getting vending machines out of the hallways, but that gets us only partway there. For example, New York City has just banned whole milk in its public schools. It’s a courageous first step, but how can we be sure students will drink healthier milk just because it’s offered to them, let alone understand what lifelong nourishment is all about?

Indeed, it’s too often the fresh fruit and salad that gets tossed in the garbage at school cafeterias. Even if they weren’t already addicted to salt and sugar, children tend to be wary of unfamiliar foods – and besides, they can always bring packaged junk in for lunch or buy fast food after school. Healthful food that’s offered in a “take it or leave it” way is often, well, left.

But when a healthy lunch is a part of a class that all children have to take, for credit – and when they can follow food from the garden to the kitchen to the table, doing much of the work themselves – something amazing happens. The students want to taste everything. They get lured in by foods that are beautiful, that taste and smell good, that appeal to their senses. When children grow and prepare good, healthy food themselves, they want to eat it, and, what’s more, they like this way of learning.

We need a revolution, a delicious revolution, that will induce children – in a pleasurable way – to think critically about what they eat. The study of food, and school lunch, should become part of the core curriculum for all students from kindergarten through high school. Such a move will take significant investment and the kind of resolve that this country showed a half-century ago. It will be costly, but if we don’t pay now, the health care bill later will be astronomical.

Alice Waters is the owner of Chez Panisse Restaurant and Cafe and the founder of the Chez Panisse Foundation.

http://www.nytimes.com/2006/02/24/opinion/24waters.html


>Lumpen Leisure

>Bread and circuses … and jet-skis.

by James Howard Kunstler

The American Conservative (February 13 2006)

Among the many wonders and marvels of American life in the 20th century, especially after World War II, when our country ruled much of the world economically, was the astounding rise in standards of living among social classes that had hardly known leisure or had a dollar to spare on its accoutrements from time immemorial. The subject of class in America has been so sore that we can barely acknowledge its existence, despite the workings of whole industries devoted to exploiting the envy of the lower orders. The very term – lower orders – would be considered grounds for sacking if I had the misfortune of teaching at a college and will certainly be seized on by critics as evidence of my intellectual unfitness. In short, any discourse on class consciousness is regarded in America these days as an obscenity far worse than stealing $100 million from the shareholders of a telecom corporation.

I write this as someone who does not have a Marxist bone in his body – I am devoid of the impulse to reform the social class system per se precisely because I regard it as an implacable fact of life. The universe is organized hierarchically, and that’s all there is to it. All of the subcategories of things in it tend to be organized hierarchically, too, especially the social life of animals, including human beings. It might be argued that the hunter-gatherers of prehistory enjoyed more pure equality in their little bands and tribes, but that was only because they possessed next to nothing in material wealth. The rest, literally, is history. Once civilization got up and running, the story was nothing but class, since our complex societies required many layers of organization in the making, moving, and caretaking of things, and some persons enjoyed more favorable roles than others.

Industrial civilization enlarged the middle class without necessarily relieving the misery of the lower classes, which also grew, shifting their labors from the farm to the factory. Marxism was, of course, an effort to reform industrial society by inciting the lower orders to make war on all the orders above them. It failed because it eliminated the necessary incentives for producing industrial wealth – namely, the legal right of persons to accumulate it – while it additionally failed to abolish privilege among the politically connected. So privileged persons in places like the Soviet Union simply worked around the artificial impediments to a superior lifestyle, while the masses toiled in squalid and resigned futility.

Now, the high tide of industrial society, the 20th century, also happened to be an era of tremendously destructive industrial warfare. By mid-century, after two World Wars, the industrial nations of Europe had exhausted and bankrupted themselves and lay physically shattered. The same was true of Asia’s only industrial power, Japan. The situation in the United States, on the other hand, was favorable to the extreme. The US continental homeland went unscathed in both World Wars, and at the end of the second, our factories, mines, oil fields, harbors, and railroads stood completely intact while everyone else’s were devastated. We set out immediately to supply the rest of the shattered world with the necessary manufactured goods to resume civilized life and lent them money to buy our stuff. Once this program got underway in earnest, one of the side effects was a fabulous enrichment of America’s laboring classes.

These classes – the assembly-line workers, the road-builders, the house-framers, masons, auto mechanics, truck drivers, et cetera – entered this miraculous new age straight from the lengthy sequential traumas of the Great Depression and the Second World War. Their expectations following the war were modest. Many were glad to have simply made it home alive from the canebrakes of the Solomon Islands and the beaches of Normandy. There was widespread anxiety that without the artificial stimulus of war production, America would sink into economic depression again. This worked out otherwise. The factories easily converted back to car-making from tank-building; William Levitt figured out how to mass produce the suburban house, starting a boom; and the American oil industry got the world’s motors up and running again to get the big cleanup of Europe and Asia underway. As an added benefit, the American managing classes had returned from their stints as officers in the armed forces with equally modest expectations for the rewards of being in charge of things in civilian life. The Army had conditioned them into a subculture assembled by rank but careful in the allocation of privilege, so as to keep up morale through the ranks for the greater good of winning the war. The officers-turned-executives brought these values into postwar corporate life for the greater good of winning a durable prosperity. By the same token, the lower ranks came out of the war with a fund of respect for the authority that had engineered their victory.

And so, by 1956, say, the president of a toaster company might be paid several multiples more than the guy on the assembly line but not obscenely more. In 1956, both would certainly be owners of American cars – a Cadillac versus a Ford Fairlane – and might well have owned their own homes in greater or lesser suburbs. But their standards of living would seem, from today’s standpoint, startlingly similar. Both families would have had TV, perhaps one versus several, but both families also went to the movies at the Loews Theater and democratically took their seats first-come-first-served. Ditto the ballparks and football stadiums in the days before luxury boxes. Both upper and working class families ate the standard supermarket victuals of the day because the gourmet stratification of America had not yet happened. Both families might well have sent their children to public schools. Both fathers may have been Sunday golfers, though on different public and private courses. By the early 1960s, with America at the height of its manufacturing dominance, General Motors assembly-line workers made as much money as tenured college professors.

Now, politically, the situation I describe would seem to be very desirable, perhaps ideal, considering all the unjust systems that had existed before and elsewhere. The American system in those years was fairly equitable and appeared to be stable. But like all good things deriving from industrial civilization, this social-leveling process had some strange diminishing returns. One was that the lower ranks of American society became so affluent by historical terms that they were able to impose their tastes on everybody else, if only because there were so many of them, with so much money to spend. They began to occupy and modify the terrain of America in a way that lower classes never had been able to before – using the prime artifact of industrial civilization to accomplish that takeover, the car. They bought homes in the new subdivisions that were obliterating the rural hinterlands of the cities, and before long all the commercial accessories followed: the strip malls, the department stores, the fried-food huts, the cinemaplexes, the office parks, the Big Box store – an entire alternate infrastructure to the tired, bleak downtowns of the industrial cities, which had begun to sicken in the Great Depression and with a very few special exceptions would never return to health again. The new stuff built all over America in the late 20th century was analogous to the content of the television programming to which the lower classes insidiously became addicted – a cartoon simulacrum of a real world that was systematically being obliterated. Instead of a real countryside outside the hated cities, we now had suburbia, a cartoon of country living. Instead of towns, shopping malls. Eventually the theme park became both the embodiment of the destruction wreaked across the land and paradoxically the last refuge from it. 
Americans would flock to Walt Disney World in Orlando to put themselves in a saccharine replica of the authentic Main Street environments that they had thoroughly trashed in their own home places.

Another diminishing return of the American postwar industrial fiesta was that thanks to our exertions, our salesmanship, and our generosity, the other industrial nations were back on their feet making things again, and before too long they were making things better than we were and less expensively, too. Thus, beginning in the 1970s and coincident with our all-time peak in oil production, America began to hemorrhage blue-collar factory jobs. Families that had grown comfortable in high-paying assembly-line jobs, who had motor boats and second homes on little lakes and took vacations at the Disneyplexes and expected life to get ever better, were clobbered by the stagflation and other economic disorders of the day. Meanwhile, the labor unions that had guarded their interests for decades rapidly lost their power to negotiate for workers whose jobs increasingly no longer even existed.

At this point, a new economy began to replace the old smokestack economy. But the new one was not the one that was advertised in politics or the news media. It was not the information economy based on the spread of computers. Neither information nor computer-aided efficiency had net social value when jobs and standards of living were being destroyed. Nor was this new economy the vaunted service economy, a perpetual-motion fantasy akin to the proverbial village whose denizens supported themselves by taking in each other’s laundry. No, all that was mendacious balderdash. The real new economy was the final blowout of the cheap-oil era: the hypertrophic build-out of suburban sprawl and the furnishing and final accessorizing of it. In other words, our living arrangement essentially became the remaining basis of our economy, in the absence of any other purposeful creation of value or wealth, such as manufacturing things. And because it was a racket devoted to a way of life with no future, it spawned enormous cynicism. Just as the immersive ugliness of the suburban highway strip was economic entropy made visible, so the cynicism of the public was entropy applied to human values, a force propelling things into disorder. When nothing was sacred, everything became profane.

The demoralization of the American public, and especially of the economic lower orders, proceeded remorselessly from the 1980s on and became focused on two very pernicious ideas: first, the belief that it was possible to get something for nothing, and second, the belief that when you wish upon a star, your dreams come true.

The first derived from the fact that Americans still appeared to generate wealth without really producing anything of value. This was achieved through the accumulation of debt represented by the false collateral of suburban real estate – the infrastructure of a living arrangement with no future. Meanwhile, this debt, or credit – hallucinated surplus wealth – was cleverly converted into huge batches of tradable financial instruments and used to drive both bond and derivatives markets. Since finance is ultimately predicated on the expectation that the wealth of societies will ever increase, this economy was the greatest shuck and jive the world had ever seen.

The second idea, that when you wish upon a star your dreams come true, was its perfect accompaniment. It derived from the mental bombardments of advertising and Hollywood movies, and it provoked the American masses to believe that sooner or later the time would come when their individual big payoff would arrive, their ship would come in, their lottery number would hit the jackpot, they would break the house at the blackjack table of the Mirage Hotel.

Now, the trouble with this kind of demoralizing belief system is that most adult human beings realize at some level that it is at odds with the way the universe works, that it is an edifice of lies – just as a maxed-out collection of credit cards is a lie about one's personal finances. Their sensed moral failures aroused in Americans a welter of negative emotions including guilt, shame, unworthiness, powerlessness, terror, and ultimately anger over having to feel these unpleasant emotions, and they expressed their anger by striking out against nature, employing the very machines that defined the terms of their existence, the automobile and its spawn: monster trucks, motorcycles, dune buggies, snowmobiles, all-terrain vehicles, and gigantic motorboats whose chief attractions were their power to negate the scale of the average freshwater lake while making enormous amounts of noise. These were people who no longer felt comfortable or even ontologically present in the world unless engines of some kind were ringing in their ears. Their assault on the landscape of America completed the destruction that suburbia had left unfinished. And as the cheap oil that made the whole exercise possible fades into history, with the global oil-production peak upon us, America is reduced to a nation of tattooed, overfed clowns in paramilitary drag, pretending to be powerful.

The tendency for symbolic behavior in human beings is impressive. We are naturally and unselfconsciously metaphorical beings. By the 1960s, when America’s industrial smokestack economy was at its zenith, cigarette smoking was at its peak, too. Forty percent of the adult population smoked, each smoker behaving like a little factory, expelling the by-products of combustion at all hours of the day and night. It was practically required as a mark of adulthood. It was at least an entitlement. You could smoke on the job and in the college classroom. You could smoke in the doctor’s waiting room. You could smoke in your seat on an airplane – a little ashtray was provided right there in the armrest – and nobody was allowed to complain. In those days, smoking was more central to socializing than sharing food. TV broadcasting was largely supported by tobacco advertising. Smoking defined the character of movie stars: Humphrey Bogart expressed the entire range of human emotion in the way he handled his beloved Chesterfields, and eventually it killed him. In the middle of Times Square, a mechanized billboard with a hole in it blew “smoke rings” of steam out over the masses on the sidewalk. The adult population had plumes of smoke coming out of its collective mouths and nostrils the way that our society had smoke coming out of its cities and mill valleys. Notice how cigarette smoking has waned in lockstep with the decline of American smokestack industry.

Along similar lines today, it’s compelling to see how NASCAR auto racing has risen to the level of a mania in early 21st-century America as the nation has reached its absolute zenith of automobile use. Even as the world approached the all-time global oil-production peak, Americans rallied obliviously to the weekend proving grounds of the stock-car gods. NASCAR eclipsed baseball, football, and basketball in popularity among spectator sports. Of course, in real life, driving automobiles had come to occupy a huge amount of the public’s time. Many adults were spending a good two hours a day commuting to work and back. They were spending more time alone in their cars than with their spouses and children. NASCAR was the apotheosis of the same kind of cars that Americans drove to work. The competition vehicles were called stock cars, after all, because they were, theoretically, just souped-up versions of the same models that anyone could find in stock at an ordinary car dealership – unlike the Formula One racecars favored in Europe. What’s more, the American economy was now mostly based on creating and maintaining the enormous infrastructures of motoring, that is, suburbia, just as it had previously been centered on the infrastructures of industrial production. So the masses had merely shifted their symbolic behavior focus from an emphasis on expelling smoke to an emphasis on watching souped-up ordinary cars move symbolically around in circles.

Or more precisely, ovals, which, from the grandstand, was sort of like sitting on a freeway overpass for five hours watching traffic. The NASCAR racetracks had evolved from county-fair dirt tracks with a few rickety bleachers to gargantuan stadiums accommodating more than 100,000 spectators. It was significant, too, that the NASCAR subculture arose in the South, the old Dixie states, where the automobile had tremendous social transformative power in the previous half century. Prior to the Second World War, Dixie had been an agricultural backwater with few cities of consequence, peopled by, among other groups, a dominant Caucasian peasantry, called “rednecks” because of the effects of the sun on exposed pale skin in the dusty crop rows.

States like Georgia, North Carolina, and Alabama were huge. You could fit eleven Connecticuts in Alabama and have room for Rhode Island and Delaware. Unless they lived right along the railroad line, the folks down on the farm were pretty much stuck in place. The automobile liberated the rednecks from the oppression of geography as emancipation had liberated blacks from the legalities of chattel ownership. In fact, the effect of the car was arguably much greater, since blacks continued to exist in economic quasi-serfdom despite the putative change in their legal status. The car and all its manifold benefits hoisted poor rednecks into a middle-class existence that had previously seemed like a distant fairy tale, something seen only in the magazine pages they had used to wallpaper the rooms of their cracker cottages – their own typological term for such a dwelling. They became truckers and car dealers and car repairmen and the owners of fried-food franchises out on the highway. They made good wages, and some became rich. Once a broad money base was established, they excelled at suburban development because rural land was so cheap and there was so much of it. They worshipped the car more than they worshipped Jesus.

The economy of the South was utterly transformed after the Second World War and the new economy was mostly about the car. Cheap gasoline along with cheap air-conditioning made the South livable for people who had a choice about where to make their homes. Cheap air-conditioning in particular made city life possible in a region that had lagged hopelessly behind the states of the Old Union – to the degree that Dixie had not a single city substantial enough for a major league baseball team prior to the 1960s. But the cities that arose in Dixie after the war were not like cities elsewhere in physical form. Orlando, Houston, Charlotte, and places like them had gone from being smaller than Buffalo to becoming immense crypto-urbations of ring freeways, radial commercial highway strips, and far-flung housing subdivisions around tiny withered peanuts of pre-war traditional downtown cores. Houston by the year 2000 was not a city in the traditional sense of being composed of neighborhoods and districts; rather it was an assemblage of single-use-zoning wastelands: the shopping wasteland, the medical-services wasteland, the university wasteland, the cul-de-sac house wasteland, and so on, dominated by massive overlayments of automobile infrastructure.

The economy of the New South, as it liked to call itself in the late 20th century, was more about the making of suburban sprawl than about the corporations that were lured down from the north to the Carolinas, Tennessee, and Georgia for their cheap labor. After all, the factories themselves eventually closed up shop as globalism made even cheaper labor in distant nations more attractive to corporate enterprise – but the sprawl remained, along with the office parks where obscenely paid top executives now ran things, while the once mighty working classes slid into a new kind of trailer-trash penury. And that is where things stand today, with the region, and the nation it is still attached to, sleepwalking into the early years of a permanent global fossil-fuel crisis that will once again transform the nation in ways we can only sketchily imagine.

Into the first decade of the new century, the New South was viewed as being so successful compared to failing regions like the Midwest rust belt that the behavior emanating from Dixie became paradigmatic for the nation as a whole. It was infectious. These days, the working and sub-working classes from Maine to Minnesota follow country music as avidly as the homefolks down in Spartanburg, South Carolina.

Some lumpen motoring activities obviously have regional characteristics that don’t migrate well. Snowmobile culture arose in the northern states around 1970, when the take-home pay of people performing low-skill jobs reached its all-time high, and a machine formerly used as a rescue vehicle at ski areas and a maintenance tool on ranches was marketed as a winter toy for grown-ups. This was clearly something that was not going to be as popular in Arkansas as in Minnesota. In fact, as this relatively new snowmobile subculture evolved, it became less about the machines themselves and more about drinking with friends in the outdoors – an unfortunate combination as anyone who reads the newspaper in what’s left of small-town America can see in the Monday police blotters when snowmobilers with six Budweisers under their belts decapitate themselves running through fence lines at fifty miles an hour.

All-terrain vehicles, those clumsy three- and four-wheeled motorbikes, were most popular proportionately in the American West, where hunters were able to extend their range to the vast backcountry of federal lands and get their meat home with the assistance of a gasoline engine. Likewise, the dune-buggy originated in California for the simple reason that desert terrain was adjacent to the populous Los Angeles basin. While it has persisted in its limited milieu, dune-buggy culture never quite recovered socially from its association with the murderous doings of Charles Manson and his “family”. The dirt-bike phenomenon also came out of California, but evolved quickly from an off-road work and play vehicle to the dirt-bike tracks of competitive racing, where it gave young men a way to channel surplus testosterone by winning trophies and cash. Ironically, wilderness-trail areas around the suburbs have lately been taken over by non-motorized mountain bikes, which are causing plenty of destruction in their own right.

The jet-ski is perhaps the most baroque and arguably the last in the line of such dedicated leisure vehicles, being in essence a boat with hardly any storage capacity on which one can do little else besides move at great speed over water while soaking wet. Fishing from such craft is awkward. Even drinking on them presents problems, especially where the bulky favored beverage of the sporting masses, beer, is concerned.

The abuse of public lands during this long fiesta of off-roading has led to a crisis of ethics and law. Of the 262 million acres under the federal Bureau of Land Management, 93 percent is open to off-road riding machines. Of 155 national forests, only two are off limits to off-roaders. Regulation of snowmobiles, ATVs, and dirt bikes on public lands has consistently failed in the face of lobbying by corporations who make these toys and of the peremptory claims of rights by those who use them. In a nation of outsourced blue-collar jobs, shrinking incomes, vanishing medical insurance, rising fuel and heating costs, and net-zero personal savings, the anxiety level of the struggling classes has to be appeased politically, and one way to minimize the current cost of that is to charge it off to posterity and the public interest.

Where does this leave us as we enter the post-cheap-oil world and eventually a world altogether without recoverable fossil fuels? You could say up a cul-de-sac in a rusted GMC Denali without a fill-up. Or in a society that will have to get its thrills and satisfactions in other ways, involving fewer prosthetic projections of our will to power. The will to power itself will probably be subdued by something more elemental: a will to stay warm, clean, and well nourished in the era of post-oil-and-gas hardship and turbulence we are entering.

In this new era, coming soon to a 21st-century region near you, the formerly industrial nations will have a great deal of trouble keeping the lights on, getting around, and feeding their people. Vocational niches by the hundreds will vanish, while the need to make up for a failing industrial agriculture, with all its oil-and-gas inputs, will require a revived agricultural working class in substantial numbers. This is in effect a peasantry, and the word itself obviously carries unappetizing overtones, especially among those who used to be certain that the perfection of both human nature and human society was at hand. It all seemed that way, I suppose, in the early 1960s, when the United Auto Workers union was setting up vacation camps along the Michigan lakes, and President Kennedy promised to put a man on the moon before the decade ended, and the doctrine of mutually assured destruction kept a sort of peace among the great military powers, and dad drove home from the Pontiac showroom with a new GTO, which his son, Buddy, used to cruise the strip on Friday nights while “Born to Be Wild” rang out of the radio and out into the warm, soporific San Fernando night.

All over but the keening for our soon-to-be-lost machine world. We’ll have to find new satisfactions now looking inward and reaching out with our limbs to those around us to discover what they are finding inward and outward about themselves. We’ll certainly find music there, and dancing, and perhaps some fighting, and we will still have the means to make bases and balls and sticks for hitting them and gloves for catching them and twilight evenings in the meadow to play in. Amid a great stillness. With the moon rising.
__________________________________________________

James Howard Kunstler is the author, most recently, of The Long Emergency (Atlantic Monthly Press 2005).

Copyright 2006 The American Conservative

http://amconmag.com/2006/2006_02_13/article1.html

Bill Totten http://www.ashisuto.co.jp/english/index.html


>The Anti-Empire Report

2006/02/22 1 comment

>Some things you need to know before the world ends

by William Blum

http://www.killinghope.org (February 14 2006)

How I Spent My Fifteen Minutes of Fame

In case you don’t know, on January 19 the latest audiotape from Osama bin Laden was released, and in it he declared: “If you [Americans] are sincere in your desire for peace and security, we have answered you. And if Bush decides to carry on with his lies and oppression, then it would be useful for you to read the book Rogue State, which states in its introduction …” He then goes on to quote the opening of a paragraph I wrote (which actually appears only in the Foreword of the British edition, which was later translated into Arabic), and which in full reads:

“If I were the president, I could stop terrorist attacks against the United States in a few days. Permanently. I would first apologize – very publicly and very sincerely – to all the widows and the orphans, the impoverished and the tortured, and all the many millions of other victims of American imperialism. I would then announce that America’s global interventions – including the awful bombings – have come to an end. And I would inform Israel that it is no longer the 51st state of the union but – oddly enough – a foreign country. I would then reduce the military budget by at least ninety percent and use the savings to pay reparations to the victims and repair the damage from the many American bombings and invasions. There would be more than enough money. Do you know what one year of the US military budget is equal to? One year. It’s equal to more than $20,000 per hour for every hour since Jesus Christ was born.

“That’s what I’d do on my first three days in the White House. On the fourth day, I’d be assassinated.”

Within hours I was swamped by the media and soon appeared on many of the leading TV shows, dozens of radio programs, with long profiles in the Washington Post, Salon.com and elsewhere. In the previous ten years the Post had declined to print a single one of my letters, most of which had pointed out errors in their foreign news coverage. Now my photo was on page one.

Much of the media wanted me to say that I was repulsed by bin Laden’s “endorsement”. I did not say I was repulsed because I was not. After a couple of days of interviews I got my reply together and it usually went something like this:

“There are two elements involved here: On the one hand, I totally despise any kind of religious fundamentalism and the societies spawned by such, like the Taliban in Afghanistan. On the other hand, I’m a member of a movement which has the very ambitious goal of slowing down, if not stopping, the American Empire, to keep it from continuing to go round the world doing things like bombings, invasions, overthrowing governments, and torture. To have any success, we need to reach the American people with our message. And to reach the American people we need to have access to the mass media. What has just happened has given me the opportunity to reach millions of people I would otherwise never reach. Why should I not be glad about that? How could I let such an opportunity go to waste?”

Celebrity – modern civilization’s highest cultural achievement – is a peculiar phenomenon. It really isn’t worth anything unless you do something with it.

The callers into the programs I was on, and sometimes the host, in addition to numerous emails, repeated two main arguments against me.

(1) Where else but in the United States could I have the freedom to say what I was saying on national media? Besides their profound ignorance in not knowing of scores of countries with at least equal freedom of speech (particularly since September 11), what they are saying in effect is that I should be so grateful for my freedom of speech that I should show my gratitude by not exercising that freedom. If they’re not saying that, they’re not saying anything.

(2) America has always done marvelous things for the world, from the Marshall Plan and defeating communism and the Taliban to rebuilding destroyed countries and freeing Iraq.

I have dealt with these myths and misconceptions previously; like sub-atomic particles, they behave differently when observed. For example, in last month’s report I pointed out in detail that “destroyed countries” were usually destroyed by American bombs; and America did not rebuild them. As to the Taliban, the United States overthrew a secular, women’s-rights government in Afghanistan, which led to the Taliban coming to power; so the US can hardly be honored for ousting the Taliban a decade later, replacing it with an American occupation, an American puppet president, assorted warlords, and women chained. But try to explain all these fine points in the minute or so one has on radio or TV. However, I think I somehow managed to squeeze in a lot of information and thoughts new to the American psyche.

Some hosts and many callers were clearly pained to hear me say that anti-American terrorists are retaliating against the harm done to their countries by US foreign policy, and are not just evil, mindless, madmen from another planet. {1} Many of them assumed, with lots of certainty and no good reason at all, that I was a supporter of the Democratic Party and they proceeded to attack Bill Clinton. When I pointed out that I was no fan at all of the Democrats or Clinton, they were usually confused into silence for a few moments before seamlessly jumping to some other piece of nonsense. They do not know that an entire alternative world exists above and beyond the Republicans and Democrats.

Just recently we have been hearing and reading comments in the American media about how hopelessly backward and violent were those Muslims protesting the Danish cartoons, carrying signs calling for the beheading of those that insult Islam. But a caller to a radio program I was on said I “should be taken care of”, and one of the hundreds of nasty emails I received began: “Death to you and your family”.

One of my personal favorite moments: On an AM radio program in Pennsylvania, discussing the Israeli-Palestinian conflict: The host (with anguish in her voice): “What has Israel ever done to the Palestinians?”

Me: “Have you been in a coma the past twenty years?”

This is a question I could ask many of those who interrogated me the past few weeks. Actually, sixty years would be more appropriate.

Elections My Teacher Never Told Me About

Americans are all taught from childhood on of the significance and sanctity of free elections: You can’t have the thing called “democracy” without the thing called “free elections”. And when you have the thing called free elections it’s virtually synonymous with having the thing called democracy. And who were we taught was the greatest champion of free elections anywhere in the world? Why, our very same teacher, God’s country, the good ol’ US of A.

But what was God’s country actually doing all those years we were absorbing and swearing by this message? God’s country was actually interfering in free elections in every corner of the known world; seriously so. The latest example is the recent elections in Palestine, where the US Agency for International Development (AID) poured in some two million dollars (a huge amount in that impoverished area) to try to tilt the election to the Palestinian Authority and its political wing, Fatah, and prevent the radical Islamic group Hamas from taking power. The money was spent on various social programs and events to increase the popularity of the Palestinian Authority; the projects bore no evidence of US involvement and did not fall within the definitions of traditional development work. In addition, the United States funded many newspaper advertisements publicizing these projects in the name of the Palestinian Authority, with no mention of AID.

“Public outreach is integrated into the design of each project to highlight the role of the Palestinian Authority in meeting citizens’ needs”, said a progress report on the projects. “The plan is to have events running every day of the coming week, beginning 13 January, such that there is a constant stream of announcements and public outreach about positive happenings all over Palestinian areas in the critical week before the elections”.

Under the rules of the Palestinian election system, campaigns and candidates were prohibited from accepting money from foreign sources. {2} American law explicitly forbids the same in US elections. Since Hamas won the election, the United States has made it clear that it does not recognize the election as any kind of victory for democracy and that it has no intention of having normal diplomatic relations with the Hamas government. (Israel has adopted a similar attitude, but it should not be forgotten that Israel funded and supported the emergence of Hamas in Gaza during its early days, hoping that it would challenge the Palestine Liberation Organization as well as Palestinian leftist elements.)

By my count, there have been more than thirty instances of gross Washington interference in foreign elections since the end of World War II – from Italy in 1948 and the Philippines and Lebanon in the 1950s, to Nicaragua, Bolivia and Slovakia in the 2000s – most of them carried out in an even more flagrant manner than the Palestinian example. {3} Some of the techniques employed have been used in the United States itself as our electoral system, once the object of much national and international pride, has slid inexorably from “one person, one vote”, to “one dollar, one vote”.

Coming Soon to a Country (or City) Near You

On January 13 the United States of America, in its shocking and awesome wisdom, saw fit to fly an unmanned Predator aircraft over a remote village in the sovereign nation of Pakistan and fire a Hellfire missile into a residential compound in an attempt to kill some “bad guys”. Several houses were incinerated, eighteen people were killed, including an unknown number of “bad guys”; reports since then give every indication that the unknown number is as low as zero, al Qaeda second-in-command Ayman al-Zawahiri, the principal target, not being amongst them. Outrage is still being expressed in Pakistan. In the United States the reaction in the Senate typified the American outrage:

“We apologize, but I can’t tell you that we wouldn’t do the same thing again”, said Senator John McCain of Arizona.

“It’s a regrettable situation, but what else are we supposed to do?” said Senator Evan Bayh of Indiana.

“My information is that this strike was clearly justified by the intelligence”, said Senator Trent Lott of Mississippi. {4}

Similar US attacks using such drones and missiles have angered citizens and political leaders in Afghanistan, Iraq and Yemen. It has not been uncommon for the destruction to be so complete that it is impossible to establish who was killed, or even how many people. Amnesty International has lodged complaints with the Busheviks following each suspected Predator strike. A UN report in the wake of the 2002 strike in Yemen called it “an alarming precedent [and] a clear case of extrajudicial killing” in violation of international laws and treaties. {5}

Can it be imagined that American officials would fire a missile into a house in Paris or London or Ottawa because they suspected that high-ranking al Qaeda members were present there? Even if the US knew of their presence for an absolute fact, and not just speculation as in the Predator cases mentioned above? Well, most likely not, but can we put anything past Swaggering-Superarrogant-Superpower-Cowboys-on-steroids? After all, they’ve already done it to their own, in Philadelphia, Pennsylvania. On May 13 1985, a bomb dropped by a police helicopter burned down an entire block, some sixty homes destroyed, eleven dead, including several small children. The police, the mayor’s office, and the FBI were all involved in this effort to evict an organization called MOVE from the house they lived in. The victims were all black, of course. So let’s rephrase the question. Can it be imagined that American officials would fire a missile into a residential area of Beverly Hills or the Upper East Side of Manhattan? Stay tuned.

“The struggle of man against tyranny is the struggle of memory against forgetting”. – Milan Kundera

I’m occasionally taken to task for being so negative about the United States’ role in the world. “Why do you keep looking for all the negative stuff and tearing down the positive?” I’m asked.

Well, it’s a nasty job, but someone has to do it. Besides, for each negative piece I’m paid $500 by al Qaeda. And the publicity given to my books by Osama … priceless.

The new documentary film by Eugene Jarecki, “Why We Fight”, which won the Sundance Festival’s Grand Jury prize, relates how the pursuit of profit by arms merchants and other US corporations has fueled America’s post-World War II wars a lot more than any love of freedom and democracy. The unlikely hero of the film is Dwight Eisenhower, whose famous warning about the dangers of the “military-industrial complex” is the film’s principal motif. Here is Jarecki being interviewed by the Washington Post:

Post: Why did you make “Why We Fight?”

Jarecki: The simple answer: Eisenhower. He caught me off-guard. He seemed to have so much to say about our contemporary society and our general tilt towards militarism … The voices in Washington and the media have become so shrill … It seemed important to bring a little gray hair into the mix.

Post: How would you classify your politics? You’ve been accused of being a lefty.

Jarecki: I’m a radical centrist … If Dwight Eisenhower is a lefty, I am too. Then I’ll walk with Ike. {6} [ellipses in original]

Isn’t it nice that a film portraying the seamier side of the military- industrial complex is receiving such popular attention? And that we are able to look fondly upon an American president? How long has that been? Well, here I go again.

Eisenhower, regardless of what he said as he was leaving the presidency, was hardly an obstacle to American militarism or corporate imperialism. During his eight years in office, the United States intervened in every corner of the world, overthrowing the governments of Iran, Guatemala, Laos, the Congo, and British Guiana, and attempting to do the same in Costa Rica, Syria, Egypt, and Indonesia, as well as laying the military and political groundwork for the coming Indochinese holocaust.

Eisenhower’s moralistically overbearing Secretary of State, John Foster Dulles, summed up the administration’s world outlook thusly: “For us there are two sorts of people in the world: there are those who are Christians and support free enterprise and there are the others”. {7}

NOTES

{1} See my essay on this subject at:
http://members.aol.com/essays6/myth.htm

{2} Washington Post, January 22 and 24 2006

{3} Rogue State, chapter 18, includes the text of the US law prohibiting foreign contributions to US elections.

{4} Associated Press, January 15 2006

{5} Los Angeles Times, January 29 2006

{6} Washington Post, February 12 2006, page N3

{7} Roger Morgan, The United States and West Germany, 1945-1973 (Oxford University Press, 1974), page 54

William Blum is the author of:

Killing Hope: US Military and CIA Interventions Since World War 2 (Common Courage Press, 1995)

Rogue State: A Guide to the World’s Only Superpower (Zed Books, 2002)

West-Bloc Dissident: A Cold War Memoir (Soft Skull Press, 2002)

Freeing the World to Death: Essays on the American Empire (Common Courage Press, 2004)

Previous Anti-Empire Reports can be read at this website.

To add yourself to this mailing list simply send an email to bblum6@aol.com with “add” in the subject line. I’d like your name and city in the message, but that’s optional. I ask for your city only in case I’ll be speaking in your area.

Or put “remove” in the subject line to do the opposite.

Any part of this report may be disseminated without permission. I’d appreciate it if the website were mentioned.

www.killinghope.org

Bill Totten http://www.ashisuto.co.jp/english/index.html


>Children of the Machine

>New technological advances could make us susceptible to perpetual surveillance

by George Monbiot

Published in the Guardian (February 21 2006)

It received just a few column inches in a couple of papers, but the story I read last week looks to me like a glimpse of the future. A company in Ohio called CityWatcher has implanted radio transmitters into the arms of two of its workers. The implants ensure that only they can enter the strongroom. Apparently it is “the first known case in which US workers have been tagged electronically as a way of identifying them”.

The transmitters are tiny (about the size of a grain of rice); cheap ($150 and falling fast); safe and stable. Without being maintained or replaced, they can identify someone for many years. They are injected, with a local anaesthetic, into the upper arm. They require no power source, as they become active only when scanned. There are no technical barriers to their wider deployment.

The company which makes these “radio frequency identification tags”, the VeriChip Corporation, says they “combine access control with the location and protection of individuals”. The chips can also be implanted in hospital patients, especially children and people who are mentally incapacitated. When doctors want to know who they are and what their medical history is, they simply scan them in. This, apparently, is “an empowering option to affected individuals”. For a while a school in California toyed with the idea of implanting the chips in all its pupils.

A tag like this has a maximum range of a few metres. But another implantable device emits a signal which allows someone to be found or tracked by satellite. The patent notice says it can be used to locate the victims of kidnapping or people lost in the wilderness.

There are, in other words, plenty of legitimate uses for implanted chips. This is why they bother me. A technology whose widespread deployment, if attempted now, would be greeted with horror, will gradually become unremarkable. As this happens, its purpose will begin to creep.

At first the tags will be more widely used for workers with special security clearance. No one will be forced to wear one; no one will object. Then hospitals – and a few in the US are already doing this – will start scanning their unconscious or incoherent patients to see whether or not they have a tag. Insurance companies might start to demand that vulnerable people are chipped.

The armed forces will discover that they are more useful than dog tags for identifying injured soldiers or for tracking troops who are lost or have been captured by the enemy. Prisons will soon come to the same conclusion. Then sweatshops in developing countries will begin to catch on. Already the overseers seek to control their workers to the second, determining when they clock on, when they visit the toilet, even the number of hand movements they perform. A chip makes all this easier. The workers will not be forced to have them, any more than they are forced to have sex with their bosses; but if they don’t accept the conditions, they don’t get the job. After that, it surely won’t be long before asylum seekers are confronted with a similar choice: you don’t have to accept an implant, but if you refuse, you can’t stay in the country.

I think it will probably stop there. I don’t believe that you or I or most comfortable, mentally competent people will be forced to wear a tag. But it will become an increasingly acceptable means of tracking and identifying people who could be a danger to themselves, or who could be at risk of sudden illness or disappearance, or who are otherwise hard for companies or governments to control. They will, on the whole, be people whose political voice is muted.

As it is with all such intrusions on our privacy, it won’t be easy to put your finger on exactly what’s wrong with this technology. It won’t really amount to a new form of control, as all the people who accept the implants will already be subject to monitoring or tracking of one kind or another. It will always be voluntary, at least to the extent that anything the state or our employers want us to do is voluntary. But there is something utterly revolting about it. It is another means by which the barriers between ourselves and the state, ourselves and the corporation, ourselves and the machine are broken down. In that tiny capsule we find the paradox of 21st century capitalism: a political system which celebrates choice, autonomy and individualism above all other virtues demands that choice, autonomy and individualism are perpetually suppressed.

While implanted chips will not lead to the mass scanning of the population, another use of the same technology quite possibly will. At the end of last month, a leaked letter from Andy Burnham, the Home Office minister, revealed that the identity cards for which we will involuntarily volunteer will contain radio frequency identification chips. This will allow the authorities to read the cards with a scanner. I propose that as the technology improves, the police will be able to scan a crowd and (assuming everyone is carrying his voluntary-compulsory ID card) produce a list of whom it contains. I further propose that it will take only a year or two for this to seem reasonable.

Already we have become used to the police filming demonstrations for the same purpose. When they started doing it, about ten years ago, it caused outrage. It gave us the impression that by protesting we became suspects. But now we don’t even notice them: not even to the extent of waving and shouting “hello Mum”. Like every other intrusion on our privacy, they have become normal.

I also propose that the mass scanning these identification chips will allow will be assisted by another kind of surveillance technology. Last week, campaigners in west Wales obtained a letter sent by the Welsh Development Agency to Ceredigion County Council. It revealed that the agency, with the help of the European Union, is setting up an industrial estate outside Aberystwyth. Its purpose is the “market acceleration” of unmanned aerial vehicles (UAVs). With the help of companies such as BAE Systems, Rolls-Royce and our new friend Qinetiq, the agency hopes to find the best way of encouraging the “routine operation of UAV systems UK-wide”. Ceredigion council’s website lists various functions of the UAVs, of which the first is “law enforcement”.

So the police won’t even have to be there. Someone sitting in a control room could fly a tiny drone (some of them are just a few inches across) equipped with a receiver over the heads of a crowd and, with the help of our new identity cards, determine who’s there. It sounds quite mad, just as the idea of biometric identity cards in the United Kingdom once did. All these new technologies somehow contrive to seem both wildly implausible and entirely likely.

There will be no dramatic developments. We will not step out of our homes one morning to discover that the state, or our boss, or our insurance company, knows everything about us. But, if the muted response to the ID card is anything to go by, we will gradually submit, in the name of our own protection, to the demands of the machine. And it will not then require a tyrannical new government to deprive us of our freedom. Step by voluntary step, we will have given it up already.

www.monbiot.com

References:

1. Richard Waters, 12th February 2006. US group implants electronic tags in workers. Financial Times.

2. Will Weissert, 14th July 2004. Chip Implanted in Mexico Judicial Workers. Associated Press.

3. http://www.verichipcorp.com/content/solutions/1117566047

4. http://www.verichipcorp.com/content/solutions/1117564579

5. The Brittan Elementary School in Sutter. Cited by Susan Kuchinskas, 18th February 2005. Networking. http://www.internetnews.com/infra/article.php/3484351

6. Paul A Gargano et al, 13th May 1997. Personal tracking and recovery system. United States Patent no 5,629,678.

7. Daren Fonda, 24th October 2005. Biochips for Everyone! Time magazine.

8. Philip Johnston, 28th January 2006. ID cards ‘will track where people go’. The Daily Telegraph.

9. Letter from Dr Sue Wolfe, Technology and Innovation Manager, Welsh Development Agency, to Philip Ellis and Allan Lewis, Economic Development Department, Ceredigion County Council, 6th January 2006.

10. ibid.

11. Ceredigion County Council, 14th July 2004. ParcAberporth is leading the way. Press release.

http://www.monbiot.com/archives/2006/02/21/children-of-the-machine/

Bill Totten http://www.ashisuto.co.jp/english/index.html
