Rebecca Gordon – Informed Comment (https://www.juancole.com)

The End of Renting? In Some Cities Affordable Housing is a Thing of the Past (March 17, 2023)
https://www.juancole.com/2023/03/renting-affordable-housing.html

( Tomdispatch.com ) – In 1937, the American folklorist Alan Lomax invited Louisiana folksinger Huddie Ledbetter (better known as Lead Belly) to record some of his songs for the Library of Congress in Washington, D.C. Lead Belly and his wife Martha searched in vain for a place to spend a few nights nearby. But they were Black and no hotel would give them shelter, nor would any Black landlord let them in, because they were accompanied by Lomax, who was white. A white friend of Lomax’s finally agreed to put them up, although his landlord screamed abuse at him and threatened to call the police.

In response to this encounter with D.C.’s Jim Crow laws, Lead Belly wrote a song, “The Bourgeois Blues,” recounting his and Martha’s humiliation and warning Blacks to avoid the capital if they were looking for a place to live. The chorus goes,

“Lord, in a bourgeois town
It’s a bourgeois town
I got the bourgeois blues
Gonna spread the news all around”

And one verse adds,

“I want to tell all the colored people to listen to me
Don’t ever try to get a home in Washington, D.C.
‘Cause it’s a bourgeois town”

Such affronts, Lead Belly sang, occurred in the “home of the brave, land of the free,” where he didn’t want “to be mistreated by no bourgeoisie.”

There are music scholars who believe that Lead Belly didn’t really understand what “bourgeois” meant. They claim Lomax, later accused of being a Communist “fellow traveler,” provided him with that addition to his vocabulary and he simply understood it as a synonym for “racist.” Personally, I think that, in a few deft verses, Lead Belly managed to show how racism and class stratification merged to make it all but impossible to find a home in Washington, as in so many other places in America.

Still a Bourgeois Town

In the late 1970s, after a period of unemployment, my mother got a job for a year in Washington. We’d lived there while I was growing up, but she hadn’t been back for almost a decade. She was a white middle-class professional and it was still hell finding an affordable place to rent. (She’d been without a job for more than a year.) It would be some years before the financial corporation FICO formalized credit ratings into a standardized credit score for everyone. But her prospective landlords had other ways of checking on her creditworthiness. That she was a divorced woman with no rental history and no recent jobs didn’t make things easy.

Still, she had her sense of humor. One day during that search, she mailed me an old 45 rpm recording of Lead Belly’s “Bourgeois Blues.” It seemed to perfectly catch her frustrated efforts to escape a friend’s guest room before she wore out her welcome.

I was reminded of that record recently when I read about the travails of Maxwell Alejandro Frost, a new Democratic congressman from Orlando, Florida. Born in 1996, he’s the youngest member of the House of Representatives. He quit his full-time job to campaign for Congress, supporting himself by driving an Uber. When he tried to find a home in Washington, his application for a studio apartment was rejected because of a bad credit score. As Frost tweeted:

“Just applied to an apartment in DC where I told the guy that my credit was really bad. He said I’d be fine. Got denied, lost the apartment, and the application fee.

This ain’t meant for people who don’t already have money.”

Nor, as Lead Belly might have added, for people like Frost who are Black.

Washington, D.C., it seems, remains a “bourgeois” town.

The True Costs of Renting

Suppose you want to rent a place to live. What will you need to have put aside just to move in? This depends not only on the monthly rent, but on other fees and upfront payments in the place where you plan to live. And, of course, your credit score.

Application fee: One part of Frost’s story caught my attention: he had to forfeit his “application fee” for an apartment he didn’t get. If, like me, you haven’t rented a house or apartment in a while, you might not even know about such fees. They’re meant to cover the cost of a background check on the applicant. You might expect them to be rolled into the rent, but in a seller’s market (here, really, a landlord’s market), there’s no risk to landlords in charging them separately.

Frost’s fee was $50 for one application. (These fees tend to top out around $75.) Not so bad, right? Until you grasp that many potential renters find themselves filing multiple applications — 10 isn’t unheard of — simply to find one place to rent, so you’re potentially talking about hundreds of dollars in fees. California, my own state, is among the few that regulate application fees. The maximum rises to match inflation. In December 2022, that max was $59.67. Some states set a lower maximum, and some don’t regulate the fees at all.

Move-in fees: If you haven’t rented in a while, this one may take you by surprise. Unlike a security deposit, move-in fees are nonrefundable. They’re supposed to cover the costs of preparing a place for a new tenant — everything from installing new locks to replacing appliances and painting. Once subsumed in the monthly rent, today these costs are often passed on directly to renters. Nationally, they average between 30% and 50% of a month’s rent.

In June 2022, the median rent for an apartment in the United States crossed the $2,000 threshold for the first time, which means the median move-in fee now ranges from $600 to $1,000.

First and last months’ rent: This upfront cost should be familiar to anyone who’s ever rented. Landlords almost always require two months’ rent upfront and hold on to the last month’s rent to ensure that a tenant can’t skip out without paying. Because landlords can invest the money they’re holding (and tenants can’t invest what they’ve forked over to landlords), in recent years, most states have required landlords to pay interest on the tenant’s funds.

Security deposit: Unlike the move-in fee, a security deposit — often a month’s rent — is refundable if tenants leave a place in good condition. Its ostensible purpose: to reimburse the landlord for future cleaning and repair costs that exceed normal wear-and-tear. (But wait! Isn’t that what the non-refundable move-in fee should do?)

Other fees: If you’re renting a condo, you may have to cover the owner’s monthly Homeowners Association fees. In some cases, you’ll also pay for utility hookups, like gas or electricity.
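Add those line items up and the total climbs quickly. Here’s a minimal back-of-the-envelope sketch in Python; every number in it is a hypothetical of my own (a $3,000-a-month rental, three $50 applications, a 40% move-in fee), not a figure reported in any of the sources above:

```python
# Back-of-the-envelope tally of the up-front rental costs described above.
# All numbers are hypothetical illustrations, not reported figures.

monthly_rent = 3000.00                 # hypothetical monthly rent
application_fees = 3 * 50.00           # three applications at $50 each
move_in_fee = 0.40 * monthly_rent      # nonrefundable, 30-50% of a month's rent
first_and_last = 2 * monthly_rent      # first and last months' rent up front
security_deposit = monthly_rent        # refundable, often one month's rent

total_up_front = (application_fees + move_in_fee
                  + first_and_last + security_deposit)

print(f"Total needed just to move in: ${total_up_front:,.2f}")
# => Total needed just to move in: $10,350.00
```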

So, how much will you have to pay to set foot in that apartment? Well, if you’re like Nuala Bishari, a San Francisco Chronicle reporter who recently tried to rent a house in nearby Oakland, California, you’ll need to set aside almost $10,000. If you’re not sure how you could possibly put that kind of money together, the credit score company Experian has some advice for you:

First, “calculate your odds.” Find out how many other people are applying for the unit you’re interested in and, if the competition is stiff, “consider looking elsewhere.” (As if you haven’t done that already!)

Then tighten your belt. “Reducing extraneous expenses,” it observes, “is an easy way to save.” Stop going out to eat, for instance, and look for free family activities. If that’s not enough, it’s time to “get serious about cost cutting.” Their brilliant suggestions include:

  • “Cut back on utility use. [Wait! I thought I was supposed to cook more at home. Never mind. I’ll just sit here in the dark.]
  • Carpool to work instead of driving. [I take the bus, but maybe I should start walking.]
  • Switch to a budget grocery store and look for coupons and sales. [Right! No more Whole Paycheck for me!]
  • Join a buy-nothing group.”

Such “advice” to people desperate to find housing would be amusing if it weren’t so desperately insulting.

Rent Is Unaffordable for More Than Half the Country

Suppose you’ve managed to get together your up-front costs. What can you expect to pay each month? The federal Department of Housing and Urban Development considers housing affordable when rent takes no more than 30% of an individual’s or family’s monthly income. Human Rights Watch (!) reported in December 2022 that the Census Bureau’s 2021 American Community Survey revealed that a little over half of all renters were spending more than 30% of their income that way — and in many cases, significantly more.

It tells you something that Human Rights Watch is concerned about housing costs in this country. The National Low Income Housing Coalition (NLIHC) puts such data in perspective through what it calls a “Housing Wage”: the hourly rate you’d need to make working 40 hours a week to afford to rent a place in a specific area. For many Americans, housing, the coalition reports, is simply “out of reach.”

“In 2022, a full-time worker needs to earn an hourly wage of $25.82 on average to afford a modest, two-bedroom rental home in the U.S. This Housing Wage for a two-bedroom home is $18.57 higher than the federal minimum wage of $7.25. In 11 states and the District of Columbia, the two-bedroom Housing Wage is more than $25.00 per hour. A full-time worker needs to earn an hourly wage of $21.25 on average in order to afford a modest one-bedroom rental home in the U.S.”
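For what it’s worth, the arithmetic behind that Housing Wage is easy to reconstruct: it’s the hourly pay at which 30% of a full-time paycheck just covers the rent. Here’s a minimal sketch in Python; the 40-hour week and 52-week year are my assumptions about the standard method, and the sample rent is hypothetical, chosen only to land near NLIHC’s published figure:

```python
# Rough sketch of the "Housing Wage" arithmetic described above.
# Assumes full-time, year-round work (40 hours x 52 weeks = 2,080 hours)
# and HUD's 30%-of-income affordability standard. Illustrative only;
# this is not NLIHC's official calculator.

HOURS_PER_YEAR = 40 * 52
AFFORDABILITY_SHARE = 0.30

def housing_wage(monthly_rent: float) -> float:
    """Hourly wage at which monthly_rent equals 30% of full-time earnings."""
    annual_income_needed = (monthly_rent * 12) / AFFORDABILITY_SHARE
    return annual_income_needed / HOURS_PER_YEAR

def affordable_rent(hourly_wage: float) -> float:
    """Monthly rent that takes 30% of full-time earnings at hourly_wage."""
    return hourly_wage * HOURS_PER_YEAR * AFFORDABILITY_SHARE / 12

# A hypothetical two-bedroom rent of about $1,343 a month implies a housing
# wage near NLIHC's $25.82; the $7.25 federal minimum wage "affords" only
# about $377 a month.
print(f"Housing wage for $1,342.64 rent: ${housing_wage(1342.64):.2f}/hour")
print(f"Rent affordable at $7.25/hour:   ${affordable_rent(7.25):.2f}/month")
```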

Unfortunately, many people don’t earn $21.25 an hour, which is why they hold two or three jobs, or add Uber or Door Dash shifts to their other work. It’s hardest for minimum wage workers. As the NLIHC observes, “In no state can a person working full-time at the prevailing federal, state, or county minimum wage afford a two-bedroom apartment at the [fair market rate].” Furthermore, “in only 274 counties out of more than 3,000 nationwide can a full-time worker earning the minimum wage afford a one-bedroom rental home at the [fair market rate].”

For people living at or below the poverty line, the situation is even more dire, which is why so many end up unhoused, whether couch-surfing among friends and family or pitching a tent on the street.

In the coming months, the situation is only expected to worsen now that pandemic-era eviction moratoriums and the $46.5 billion federal Emergency Rental Assistance Program are expiring. According to the Pew Research Center, those programs prevented more than a million people from being evicted.

It Wasn’t Always This Way

People have always experienced poverty, but in the United States, the poor have not always gone without housing. Yes, they lived in tenements or, if they were men down on their luck, in single-room occupancy hotels. And yes, the conditions were often horrible, but at least they spent their nights indoors.

Indeed, the routine presence of significant populations of the urban unhoused on this country’s city streets goes back only about four decades. When I moved to the San Francisco Bay Area in 1982, there was a community of about 400 people living in or near People’s Park in Berkeley. Known as the Berkeley Beggars, they were considered a complete oddity, a hangover of burnt-out hippies from the 1960s.

During President Ronald Reagan’s administration, however, a number of factors combined to create a semi-permanent class of the unhoused in this country: high interest rates, imposed by the Federal Reserve to fight inflation, drove up the cost of mortgages; a corruption scandal destroyed many savings and loan institutions from which middle-income people had long secured home mortgages; labor unions came under sustained attack, even by the federal government; and real wages (adjusted for inflation) plateaued.

Declaring that government was the problem, not the solution, Reagan began a four-decade-long Republican quest to dismantle the New Deal social-safety net implemented under President Franklin Delano Roosevelt and supplemented under President Lyndon Johnson. Reagan savaged poverty-reduction programs like Food Stamps and Medicaid, while throwing more than 300,000 people with disabilities off Social Security. Democrat Bill Clinton followed up, joining with Republicans to weaken Aid to Families with Dependent Children (“welfare”).

A decade earlier, scandal-ridden state asylums for the mentally ill began to be shut down all over the country. In the late 1960s, Reagan had led that effort in California when he was governor. While hundreds of thousands were freed from a form of incarceration, they also instantly lost their housing. (On a personal note, this is why, in 1990, my mother found herself living in unsupervised subsidized housing for a population of frail elderly and recently de-institutionalized people with mental illnesses. This wasn’t a good combination.)

By the turn of the century, a permanent cohort of people without housing had come to seem a natural part of American life.

And It Doesn’t Have to Be Like This Forever

There is no single solution to the growing problem of unaffordable housing, but with political will and organizing at the local, state, and federal levels, it could be addressed. In addition to the obvious — building more housing — here are a few modest suggestions:

At the state and local level:

  • Raise minimum wages to reflect the prevailing cost of living.
  • Remove zoning restrictions on the construction of multifamily buildings.
  • Pass rent-control ordinances, so rents rise no faster than the consumer price index.
  • Pass limits on up-front rental and move-in fees.
  • Pass legislation to prevent no-cause evictions.
  • Pass legislation, as California has already done, to allow renters to report their on-time rent payments to credit bureaus, so they can boost their credit scores without borrowing money.

At the federal level:

  • Raise the federal minimum wage, which, even in this era of inflation, has been stuck at $7.25 an hour since 2009.
  • Increase funding for SNAP, the present food-stamp program (whose pandemic-era increases have just expired).
  • Increase federal funding for public housing.
  • Provide universal healthcare, ideally in the form of Medicare for all.
  • Increase “Section 8” housing subsidies for low-income renters.
  • Raise taxes on the wealthy to fund such changes.
  • Finally, shift part — say one-third — of the bloated “defense” budget (up $80 billion from last year to $858 billion in 2023) to programs that actually contribute to national security — the daily financial security of the people who live in this nation.

Then maybe the next time we send new people to Congress, all of them will be able to find a home in Washington, D.C.

Via Tomdispatch.com

“Some say the World will End in Fire, some say in Ice”: Life in a Climate-Destabilized California (February 6, 2023)
https://www.juancole.com/2023/02/climate-destabilized-california.html

( Tomdispatch.com ) – It was January 1983 and raining in San Francisco.

The summer before, I’d moved here from Portland, Oregon, a city known for its perpetual gray drizzles and, on the 60-odd days a year when the sun deigns to shine, dazzling displays of greenery. My girlfriend had spent a year convincing me that San Francisco had much more to offer me than Portland did for her.

Every few months, I’d scrape the bottom of my bank account to travel to San Francisco and taste its charms. Once, I even hitched a ride on a private plane. (Those were the days!) In a week’s visit, she’d take me to multiple women’s music concerts — events you’d wait a year for in Portland. We’d visit feminist and leftist bookstores, eat real Mexican food, and walk through Golden Gate Park in brilliant sunshine. The sky would be clear, the city would be sparkling, and she convinced me that San Francisco would indeed be paradise. Or at least drier than Portland.

So, I moved, but I wuz robbed! I knew it that first winter when, from December through March, the rain seemed to come down in rivers — atmospheric rivers, in fact — though none of us knew the term back then. That would be my initial encounter with, as a Mexican-American friend used to call it, “el pinche niño.” El Niño is the term meteorologists give to one-half of an oscillating cyclical weather phenomenon originating in the Pacific Ocean. El Niño usually brings drought to the southern parts of North America, as well as Central America, while deluging northern California and the Pacific Northwest. La Niña is the other half of that cycle, its effects roughly flipping those of El Niño geographically. (As for the meaning of “pinche,” go ahead and Google it.)

San Francisco sits in the sweet spot where, at least until the end of the last century, we would get winter rains at both ends of the cycle. And boy, did it rain that winter! I soon began to wonder whether any amount of love or any number of concerts could make up for the cold and mud. Eventually, I realized that I couldn’t really blame the girlfriend. The only other time I’d lived in San Francisco was during the then-unusual drought year of 1976. Of course, I came to believe then that it never rained here. So, really, if there was a bait-and-switch going on, I had pulled it on myself.

Still, looking back, as much as the rain annoyed me, I couldn’t have imagined how much I’d miss it two decades into the twenty-first century.

But Is It Climate Change? And Would That Actually Be So Bad?

Along with the rest of the western United States, my city has now been in the grip of a two-decade-long megadrought that has persisted through a number of El Niño/La Niña cycles. Scientists tell us that it’s the worst for the West and Southwest in at least the last 1,200 years. Since 2005, I’ve biked or walked the three miles from my house to the university where I teach. In all those years, there have probably been fewer than 10 days when rain forced me to drive or take the bus. Periodic droughts are not unknown in this part of the country. But climate scientists are convinced that this extended, deadly drought has been caused by climate change.

It wasn’t always that way. Twenty years ago, those of us who even knew about global warming, from laypeople to experts, were wary of attributing any particular weather event to it. Climate-change deniers and believers alike made a point of distinguishing between severe weather events and the long-term effects of changes in the climate. For the deniers, however, as the years went on, it seemed that no accumulation of symptoms — floods, droughts, heat waves, fires, or tornadoes — could legitimately be added together to yield a diagnosis of climate change. Or if climate change was the reason, then human activity didn’t cause it and it was probably a good thing anyway.

Not that long ago, it wasn’t even unusual to encounter “climate-change-is-good-for-you” articles in reasonably mainstream outlets. For example, the conservative British magazine The Spectator ran a Matt Ridley piece in 2013 that began: “Climate change has done more good than harm so far and is likely to continue doing so for most of this century. This is not some barmy, right-wing fantasy; it is the consensus of expert opinion.” It turned out that Ridley’s “consensus of expert opinion” derived from a single economist’s (and not a climate scientist’s) paper summarizing 14 other economists on the subject.

“The chief benefits of global warming,” Ridley wrote then, “include: fewer winter deaths; lower energy costs; better agricultural yields; probably fewer droughts; maybe richer biodiversity.” He added that, were the world’s economy to continue to grow by 3% annually, “the average person will be about nine times as rich in 2080 as she is today. So low-lying Bangladesh will be able to afford the same kind of flood defenses that the Dutch have today.”

There was so much wrong with those last two sentences (beginning with what “average” means), but I’ll content myself with pointing out that, in October 2022, historic floods covered one-third of Pakistan (next door to Bangladesh), including prime farmland the size of the state of Virginia. Thirty-three million people were affected by those floods that, according to the New York Times, “were caused by heavier-than-usual monsoon rains and glacial melt.” And what led to such unusual rain and melt? As the Times reported:

“Scientists say that global warming caused by greenhouse-gas emissions is sharply increasing the likelihood of extreme rain in South Asia, home to a quarter of humanity. And they say there is little doubt that it made this year’s monsoon season more destructive.”

It seems unlikely those floods will lead to “better agricultural yields.” (If only Pakistan had thought to build dikes, like the Dutch!)

Maybe it’s easy to take potshots at what someone like Ridley wrote almost a decade ago, knowing what we do now. Back then, views like his were not uncommon on the right and, all too sadly, they’re not rare even today. (Ridley is still at it, having recently written a piece twitting the British Conservative Party for supporting something as outré as wind power.) And of course, those climate change denials were supported (then and now) by the companies that stood to lose the most from confronting the dangers of greenhouse gases, not only the fossil-fuel industry (whose scientists knew with stunning accuracy exactly what was already happening on this planet as early as the 1970s), but electric companies as well.

Back in 2000, an ExxonMobil “advertorial” in the New York Times hit the trifecta: climate change isn’t real; or if it is, humans (and especially fossil-fuel companies!) aren’t responsible; and anyway it might be a good thing. Titled “Unsettled Science,” the piece falsely argued that scientists could not agree on whether climate change was happening. (By that time, 90% of climate scientists, including ExxonMobil’s, had reached a consensus that climate change is real.) After all, the ad insisted, there had been other extended periods of unusual weather like the “little ice age” of the medieval era and, in any case, greenhouse gas concentrations vary naturally “for reasons having nothing to do with human activity.”

We shouldn’t be surprised that ExxonMobil tried to keep climate change controversial in the public mind. The company had a lot to lose in a transition away from fossil fuels. It’s less common knowledge, however, that it has long bankrolled climate-denial “grassroots” organizations. In fact, its scientists knew about climate change as early as the 1950s and, in a 1977 internal memo, they summarized their research on the subject by predicting a one- to three-degree Celsius average temperature rise by 2050, pretty much the future we’re now staring at.

Water, Water, Anywhere?

California has been “lucky” this fall and winter. We’ve seen a (probably temporary) break in the endless drought. A series of atmospheric rivers have brought desperately needed rain to our valleys and an abundance of snow to the mountains. But not everyone has been celebrating, as floods have swept away homes, cars, and people up and down the state. They’ve shut down highways and rail lines, while forcing thousands to evacuate. After years of thirst, for a few weeks the state has been drowning; and, as is so often the case with natural disasters, the poorest people have been among those hardest hit.

I’ve always enjoyed the delicious smugness of lying in a warm bed listening to wind and water banging at my windows. These days it’s a guilty pleasure, though, because I know how many thousands of unhoused people have suffered in and even died during the recent storms. In Sacramento, rain marooned one tent encampment, as the spit of land it occupied became an island. In the city of Ontario, near Los Angeles, flash floods washed away people’s tents and may have drowned as many as 10 of their inhabitants.

My own city responded to the rains with police sweeps of unhoused people hours before a “bomb cyclone” hit on January 4th. In such a “sweep,” police and sometimes other officials descend suddenly to enforce city ordinances that make it illegal to sit or lie on the sidewalk. They make people “move along,” confiscating any belongings they can’t carry off. Worse yet, shelters in the city were already full. There was nowhere inside for the unhoused to go and many lost the tents that had been their only covering.

The same climate change that’s prolonged the drought has exacerbated the deadly effects of those rainstorms. Over the last few years, record wildfires have consumed entire communities. Twenty years of endless dry days have turned our forests and meadows into tinderboxes, just waiting for a spark. Now, when rain bangs down in such amounts on already burnt, drought-hardened land, houses slide down hills, trees are pulled from the earth, and sinkholes open in roads and highways.

There is one genuine piece of luck here, though. Along with the rain, more than twice as much snow as would accumulate in an average year has covered the Sierra mountains of northern California. This is significant because many cities in the region get their water from the Sierra runoff. San Francisco is typical. Its municipal water supply comes from the Hetch Hetchy Reservoir, near Yosemite National Park, fed from that runoff. For now, it looks as if a number of cities could, for the first time in a while, have extra water available this year. But there’s always the chance that warm weather early in the spring will turn snow to rain, melting away the snowpack and our hopes.

Much of northern California’s water comes from the Sierra mountains, but it’s a different story in the south. The 9.8 million residents of Los Angeles County, along with most of southern California, get their water from the Colorado River. A century-old arrangement governs water use by the seven states through which the Colorado runs, along with 30 tribal nations and parts of northern Mexico — about 40 million people in all. Historically, the “upper basin” states, Wyoming, Utah, Colorado, and New Mexico, have been allocated 7.5 million acre-feet of water a year. Nevada, California, and Arizona have received 8.5 million and Mexico has treaty rights to 1.5 million. Dams on the two lakes — Mead in Nevada and Powell in Utah — provide hydroelectric power to many people in those same states.

The megadrought has drastically reduced the levels of these two artificial lakes that serve as reservoirs for those seven states. The original agreement assumed that 17.5 million acre-feet of water would be available annually (each acre-foot being about what two households might use in a year). For the last three years, however, the flow has fallen below 10 million acre-feet. This year, the states have been unable to agree on how to parcel out those allocations, so the Biden administration may have to step in and impose a settlement.
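To put those acre-feet on a human scale, here’s a quick back-of-the-envelope conversion, using the rough two-households-per-acre-foot figure mentioned above; the numbers are approximations drawn from this paragraph, not official data:

```python
# Back-of-the-envelope scale for the Colorado River shortfall described above.
# Uses the rough rule of thumb from the text: one acre-foot is about a year's
# water for two households. Approximate, illustrative numbers only.

ASSUMED_FLOW_MAF = 17.5        # million acre-feet assumed by the original compact
RECENT_FLOW_MAF = 10.0         # approximate recent annual flow, per the text
HOUSEHOLDS_PER_ACRE_FOOT = 2   # rough approximation used above

shortfall_maf = ASSUMED_FLOW_MAF - RECENT_FLOW_MAF
household_years = shortfall_maf * 1_000_000 * HOUSEHOLDS_PER_ACRE_FOOT

print(f"Annual shortfall: {shortfall_maf:.1f} million acre-feet")
print(f"Roughly the yearly water use of {household_years:,.0f} households")
# => about 15,000,000 households' worth of water that the original
#    allocations assumed would be there and, lately, isn't.
```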

Both lakes are at their lowest historic levels since they were first filled. Several times, while working on a midterm election campaign in Reno, Nevada last year, I noticed stories in the local press about human remains being uncovered as Lake Mead’s shoreline recedes, some of them apparently victims of mob hits in decades past.

Less water in those giant lakes means less water for agriculture and residential consumption. But the falling water levels threaten a further problem: the potential failure of their dams to provide electric power crucial to millions. Last summer, Lake Mead dropped to within 90 feet of the depth at which its dam can no longer generate power. Some estimates suggest that Lake Powell’s Glen Canyon dam may stop producing electricity as soon as July.

Earthquakes, Drought, and Disaster

The woman I moved to San Francisco for (whom I’ve known since I was a young teen in the 1960s) spent her college years at the University of California, Berkeley. I remember her telling me, in the summer of 1969, that she and a number of friends had spent the previous spring semester celebrating the coming end of the world as they knew it. Apparently, some scientists had then predicted that a giant earthquake would cause the San Francisco Bay Area to collapse into the Pacific Ocean. Facing such a possible catastrophe, a lot of young folks decided that they might as well have a good party. There was smoking and drinking and dancing to welcome the approaching apocalypse. (When a Big One did hit 20 years later, the city didn’t exactly fall into the ocean, but a big chunk of the San Francisco Bay Bridge did go down.)

Over the last months, we Californians have experienced both historic drought and historic rainfall. The world as we knew it really is ending faster than some of us ever expected. Now that we’re facing an imminent catastrophe, one already killing people around the globe and even in my state, it’s hard to know how to respond. Somehow, I don’t feel like partying though. I think it’s time to fight.

Via Tomdispatch.com

Why American Exceptionalism can be a Very Bad Thing (January 10, 2023)
https://www.juancole.com/2023/01/american-exceptionalism-thing.html

( Tomdispatch.com ) – Let me start with a confession: I no longer read all the way through newspaper stories about the war in Ukraine. After years of writing about war and torture, I’ve reached my limit. These days, I just can’t pore through the details of the ongoing nightmare there. It’s shameful, but I don’t want to know the names of the dead or examine images caught by brave photographers of half-exploded buildings, exposing details — a shoe, a chair, a doll, some half-destroyed possessions — of lives lost, while I remain safe and warm in San Francisco. Increasingly, I find that I just can’t bear it.

And so I scan the headlines and the opening paragraphs, picking up just enough to grasp the shape of Vladimir Putin’s horrific military strategy: the bombing of civilian targets like markets and apartment buildings, the attacks on the civilian power grid, and the outright murder of the residents of cities and towns occupied by Russian troops. And these aren’t aberrations in an otherwise lawfully conducted war. No, they represent an intentional strategy of terror, designed to demoralize civilians rather than to defeat an enemy military. This means, of course, that they’re also war crimes: violations of the laws and customs of war as summarized in 2005 by the International Committee of the Red Cross (ICRC).

The first rule of war, as laid out by the ICRC, requires combatant countries to distinguish between (permitted) military and (prohibited) civilian targets. The second states that “acts or threats of violence the primary purpose of which is to spread terror among the civilian population” — an all-too-on-target summary of Russia’s war-making these last 10 months — “are prohibited.” Violating that prohibition is a crime.

The Great Exceptions

How should war criminals be held accountable for their actions? At the end of World War II, the victorious Allies answered this question with trials of major German and Japanese officials. The most famous of these were held in the German city of Nuremberg, where the first 22 defendants included former high government officials, military commanders, and propagandists of the Nazi regime, as well as the banker who built its war machine. All but three were convicted and 12 were hanged.

The architects of those Nuremberg trials — representatives of the United States, the Soviet Union, the United Kingdom, and France — intended them as a model of accountability for future wars. The best of those men (and most of them were men) recognized their debt to the future and knew they were establishing a precedent that might someday be held against their own nations. The chief prosecutor for the United States, Robert H. Jackson, put it this way: “We must not forget that the record on which we judge the defendants today is the record on which we will be judged tomorrow.”

Indeed, the Nuremberg jurists fully expected that the new United Nations would establish a permanent court where war criminals who couldn’t be tried in their home countries might be brought to justice. In the end, it took more than half a century to establish the International Criminal Court (ICC). Its founding document, the Rome Statute, was adopted only in 1998 and entered into force in 2002, once 60 nations had ratified it. Today, 123 countries are parties to it.

Russia is a major exception, which means that its nationals can’t be tried at the ICC for war crimes in Ukraine. And that includes the crime the Nuremberg tribunal identified as the source of all the rest of the war crimes the Nazis committed: launching an aggressive, unprovoked war.

Guess what other superpower has never signed the ICC? Here are a few hints:

  • Its 2021 military budget dwarfed that of the next nine countries combined and was 1.5 times the size of what the world’s other 144 countries with such budgets spent on defense that year.
  • Its president has just signed a $1.7 trillion spending bill for 2023, more than half of which is devoted to “defense” (and that, in turn, is only part of that country’s full national security budget).
  • It operates roughly 750 publicly acknowledged military bases in at least 80 countries.
  • In 2003, it began an aggressive, unprovoked (and disastrous) war by invading a country 6,900 miles away.

War Crimes? No, Thank You

Yes, the United States is that other Great Exception to the rules of war. While, in 2000, during the waning days of his presidency, Bill Clinton did sign the Rome Statute, the Senate never ratified it. Then, in 2002, as the Bush administration was ramping up its “global war on terror,” including its disastrous occupation of Afghanistan and an illegal CIA global torture program, the United States simply withdrew its signature entirely. Secretary of Defense Donald Rumsfeld then explained the decision this way:

“…[T]he ICC provisions claim the authority to detain and try American citizens — U.S. soldiers, sailors, airmen and Marines, as well as current and future officials — even though the United States has not given its consent to be bound by the treaty. When the ICC treaty enters into force this summer, U.S. citizens will be exposed to the risk of prosecution by a court that is unaccountable to the American people, and that has no obligation to respect the Constitutional rights of our citizens.”

That August, in case the U.S. stance remained unclear to anyone, Congress passed, and President George W. Bush signed, the American Servicemembers Protection Act of 2002. As Human Rights Watch reported at the time, “The new law authorizes the use of military force to liberate any American or citizen of a U.S.-allied country being held by the [International Criminal] Court, which is located in The Hague.” Hence, its nickname: the “Hague Invasion Act.” A lesser-known provision also permitted the United States to withdraw military support from any nation that participates in the ICC.

The assumption built into Rumsfeld’s explanation was that there was something special — even exceptional — about U.S. citizens. Unlike the rest of the world, we have “Constitutional rights,” which apparently include the right to commit war crimes with impunity. Even if a citizen is convicted of such a crime in a U.S. court, he or she has a good chance of receiving a presidential pardon. And were such a person to turn out to be one of the “current and future officials” Rumsfeld mentioned, his or her chance of being hauled into court would be about the same as mine of someday being appointed secretary of defense.

The United States is not a member of the ICC, but, as it happens, Afghanistan is. In 2018, the court’s chief prosecutor, Fatou Bensouda, formally requested that a case be opened for war crimes committed in that country. The New York Times reported that Bensouda’s “inquiry would mostly focus on large-scale crimes against civilians attributed to the Taliban and Afghan government forces.” However, it would also examine “alleged C.I.A. and American military abuse in detention centers in Afghanistan in 2003 and 2004, and at sites in Poland, Lithuania, and Romania, putting the court directly at odds with the United States.”

Bensouda planned an evidence-gathering trip to the United States, but in April 2019, the Trump administration revoked her visa, preventing her from interviewing any witnesses here. It then followed up with financial sanctions on Bensouda and another ICC prosecutor, Phakiso Mochochoko.

Republicans like Bush and Trump are not, however, the only presidents to resist cooperating with the ICC. Objection to its jurisdiction has become remarkably bipartisan. It’s true that, in April 2021, President Joe Biden rescinded the strictures on Bensouda and Mochochoko, but not without emphasizing this exceptional nation’s opposition to the ICC as an appropriate venue for trying Americans. The preamble to his executive order notes that

“the United States continues to object to the International Criminal Court’s assertions of jurisdiction over personnel of such non-States Parties as the United States and its allies absent their consent or referral by the United Nations Security Council and will vigorously protect current and former United States personnel from any attempts to exercise such jurisdiction.”

Neither Donald Rumsfeld nor Donald Trump could have said it more clearly.

So where do those potential Afghan cases stand today? A new prosecutor, Karim Khan, took over in mid-2021. He announced that the investigation would indeed go forward, but that acts of the U.S. and allies like the United Kingdom would not be examined. He would instead focus on actions of the Taliban and the Afghan offshoot of the Islamic State. When it comes to potential war crimes, the United States remains the Great Exception.

In other words, although this country isn’t a member of the court, it wields more influence than many countries that are. All of which means that, in 2023, the United States is not in the best position when it comes to accusing Russia of horrifying war crimes in Ukraine.

What the Dickens?

I blame my seven decades of life for the way my mind can now meander. For me, “great exceptions” brings to mind Charles Dickens’s classic story Great Expectations. His novels exposed the cruel reality of life among the poor in an industrializing Great Britain, with special attention to the pain felt by children. Even folks whose only brush with Dickens was reading Oliver Twist or watching The Muppet Christmas Carol know what’s meant by the expression “Dickensian poverty.” It’s poverty with that extra twist of cruelty — the kind the American version of capitalism has so effectively perpetuated.

When it comes to poverty among children, the United States is indeed exceptional, even among the 38 largely high-income nations of the Organization for Economic Cooperation and Development (OECD). As of 2018, the average rate of child poverty in OECD countries was 12.8%. (In Finland and Denmark, it was only 4%!) For the United States, with the world’s highest gross domestic product, however, it was 21%.

Then, something remarkable happened. In year two of the Covid pandemic, Congress passed the American Rescue Plan, which (among other measures) expanded the child tax credit from $2,000 up to as much as $3,600 per child. The payments came in monthly installments and, unlike the Earned Income Credit, a family didn’t need to have any income to qualify. The result? An almost immediate 40% drop in child poverty. Imagine that!

Given such success, you might think that keeping an expanded child tax credit in place would be an obvious move. Saving little children from poverty! But if so, you’ve failed to take into account the Republican Party’s remarkable commitment to maintaining its version of American exceptionalism. One of the items that the party’s congressional representatives managed to get expunged from the $1.7 trillion 2023 appropriation bill was that very expanded child tax credit. It seems that cruelty to children was the Republican party’s price for funding government operations.

Charles Dickens would have recognized that exceptional — and gratuitous — piece of meanness.

The same bill, by the way, also thanks to Republican negotiators, ended universal federal public-school-lunch funding, put in place during the pandemic’s worst years. And lest you think the Republican concern with (extending) poverty ended with starving children, the bill also will allow states to resume kicking people off Medicaid (federally subsidized health care for low-income people) starting in April 2023. The Kaiser Family Foundation estimates that one in five Americans will lose access to medical care as a result.

Great expectations for 2023, indeed.

We’re the Exception!

There are, in fact, quite a number of other ways in which this country is also exceptional. Here are just a few of them:

  • Children killed by guns each year. In the U.S. it’s 5.6 per 100,000. That’s seven times as high as the next highest country, Canada, at 0.8 per 100,000.
  • Number of required paid days off per year. This country is exceptional here as well, with zero mandatory days off and 10 federal holidays annually. Even Mexico mandates six paid vacation days and seven holidays, for a total of 13. At the other end of the scale, Chile, France, Germany, South Korea, Spain, and the United Kingdom all require a combined total of more than 30 paid days off per year.
  • Life expectancy. According to 2019 data, the latest available from the World Health Organization for 183 countries, U.S. average life expectancy at birth for both sexes is 78.5 years. Not too shabby, right? Until you realize that there are 40 countries with higher life expectancy than ours, including Japan at number one with 84.26 years, not to mention Chile, Greece, Peru, and Turkey, among many others.
  • Economic inequality. The World Bank calculates a Gini coefficient of 41.5 for the United States in 2019. The Gini is a 0-to-100-point measure of inequality, with 0 being perfect equality. The World Bank lists the U.S. economy as more unequal than those of 142 other countries, including places as poor as Haiti and Niger. Incomes are certainly lower in those countries, but unlike the United States, the misery is spread around far more evenly.
  • Women’s rights. The United States signed the United Nations Convention on the Elimination of All Forms of Discrimination against Women in 1980, but the Senate has never ratified it (thank you again, Republicans!), so it doesn’t carry the force of law here. Last year, the right-wing Supreme Court gave the Senate a helping hand with its decision in Dobbs v. Jackson Women’s Health Organization to overturn Roe v. Wade. Since then, several state legislatures have rushed to join the handful of nations that outlaw all abortions. The good news is that voters in states from Kansas to Kentucky have ratified women’s bodily autonomy by rejecting anti-abortion ballot propositions.
  • Greenhouse gas emissions. Well, hooray! We’re no longer number one in this category. China surpassed us in 2006. Still, give us full credit; we’re a strong second and remain historically the greatest greenhouse gas emitter of all time.

Make 2023 a (Less) Exceptional Year

Wouldn’t it be wonderful if we were just a little less exceptional? If, for instance, in this new year, we were to transfer some of those hundreds of billions of dollars Congress and the Biden administration have just committed to enriching corporate weapons makers, while propping up an ultimately unsustainable military apparatus, to the actual needs of Americans? Wouldn’t it be wonderful if just a little of that money were put into a new child tax credit?

Sadly, it doesn’t look very likely this year, given a Congress in which, however minimally and madly, the Republicans control the House of Representatives. Still, whatever the disappointments, I don’t hate this country of mine. I love it — or at least I love what it could be. I’ve just spent four months on the front lines of American politics in Nevada, watching some of us at our very best risk guns, dogs, and constant racial invective to get out the vote for a Democratic senator.

I’m reminded of poet Lloyd Stone’s words that I sang as a teenager to the tune of Sibelius’s Finlandia hymn:

“My country’s skies are bluer than the ocean
And sunlight beams on cloverleaf and pine
But other lands have sunlight, too, and clover,
And skies are somewhere blue as mine.
Oh, hear my prayer, O gods of all the nations
A song of peace for their lands and for mine”

So, no great expectations in 2023, but we can still hope for a few exceptions, can’t we?

Via Tomdispatch.com

Living Politics, Embedding with Workers, Standing up to the Lies of the Rich (December 7, 2022)
https://www.juancole.com/2022/12/politics-embedding-standing.html

( Tomdispatch.com ) – “Welcome back!” read my friend Allan’s email. “So happy to have you back and seeing that hard work paid off. Thank you for all that you do. Please don’t cook this evening. I am bringing you a Honduran dinner — tacos hondureños and baleadas, plus a bottle of wine.” The tacos were tasty indeed, but even more pleasing was my friend’s evident admiration for my recent political activities.

My partner and I had just returned from four months in Reno, working with UNITE-HERE, the hospitality industry union, on their 2022 midterm electoral campaign. It’s no exaggeration to say that, with the votes in Nevada’s mostly right-wing rural counties cancelling out those of Democratic-leaning Las Vegas, that union campaign in Reno saved the Senate from falling to the Republicans. Catherine Cortez Masto, the nation’s first Latina senator, won reelection by a mere 7,928 votes, out of a total of more than a million cast. It was her winning margin of 8,615 in Washoe County, home to Reno, that put her over the top.

Our friend was full of admiration for the two of us, but the people who truly deserved the credit were the hotel housekeepers, cooks, caterers, and casino workers who, for months, walked the Washoe County streets six days a week, knocking on doors in 105-degree heat and even stumping through an Election Day snowstorm. They endured having guns pulled on them, dogs sicced on them, and racist insults thrown at them, and still went out the next day to convince working-class voters in communities of color to mark their ballots for a candidate many had never heard of. My partner and I only played back-up roles in all of this; she, managing the logistics of housing, feeding, and supplying the canvassers, and I, working with maps and spreadsheets to figure out where to send the teams each day. It was, admittedly, necessary, if not exactly heroic, work.

“I’m not like the two of you,” Allan said when he stopped by with the promised dinner. “You do important work. I’m just living my life.”

“Not everybody,” I responded, “has a calling to politics.” And I think that’s true. I also wonder whether having politics as a vocation is entirely admirable.

Learning to Surf

That exchange with Allan got me thinking about the place of politics in my own life. I’ve been fortunate enough to be involved in activism of one sort or another for most of my 70 years, but it’s been just good fortune or luck that I happened to stumble into a life with a calling, even one as peculiar as politics.

There are historical moments when large numbers of people “just living” perfectly good lives find themselves swept up in the breaking wave of a political movement. I’ve seen quite a few of those moments, starting with the struggle of Black people for civil rights when I was a teenager, and the movement to stop the Vietnam War in that same era. Much more recently, I’ve watched thousands of volunteers in Kansas angrily reject the Supreme Court’s decision in Dobbs v. Jackson Women’s Health Organization, which overturned a 50-year precedent protecting a woman’s right to end a pregnancy. Going door to door in a classic political field campaign, they defeated a proposed anti-abortion amendment to the Kansas constitution, while almost doubling the expected turnout for a midterm primary.

To some observers, in a red and landlocked state like Kansas, that wave of resistance seemed to come out of nowhere. It certainly surprised a lot of professionals, but the capacity to ride it didn’t, in fact, come out of nowhere. It turns out that, when given a choice, a substantial majority of people in the middle of this country will vote in favor of women’s bodily autonomy. But many of them won’t do it without a push. Building such a successful electoral campaign required people who’d spent years honing the necessary skills in times when the political seas appeared almost unendurably flat.

Some of those skills, learned through repeated practice, were technical: crafting effective messages; targeting the right voters; navigating coalitions of organizations with sometimes overlapping, sometimes competing priorities. And some might be called “moral skills,” the cultivation of internal characteristics — patience, say, or hope — until they become second nature. The Greek philosopher Aristotle called those moral skills “virtues” and believed we acquire them just like any other skill — by practicing them until they become habits.

You could compare some of us with a political vocation to a surfer sitting on her board constantly scanning the sea ahead, hoping to discern the best waves as they form, hoping she’d practiced well enough to ride them. Like so many surfers of this sort, I’ve probably wiped out more often than I’ve successfully ridden the waves.

Character Flaws for Justice

“This is the year,” I told a different friend long ago, “that I want to develop character flaws.” She was understandably startled, not least because my character has never been what you might call spotless.

“Why would you want to do that?” she asked.

“Because I’m getting ready to work on a political campaign.” I was only half-joking. In fact, doing politics effectively requires habits that don’t come naturally to me — like keeping information close to my vest rather than sharing everything I know with all comers.

There’s a fine line, too, between sitting on information and telling lies. In fact, to do politics effectively, you must be willing to lie. This truth is often taken for granted by those involved. A recent New York Times article about a man who can’t stop lying referred to a study of people’s self-reported truthfulness. Writing about those who admit to lying frequently, reporter Ellen Barry says,

“This ‘small group of prolific liars,’ as the researchers termed it, constituted around 5.3 percent of the population but told half the reported lies, an average of 15 per day. Some were in professions, like retail or politics, that compelled them to lie. But others lied in a way that had no clear rationale.” [My emphasis added, of course.]

As Barry sees it, politics is self-evidently a profession that compels its practitioners to lie. And I tend to agree with her, though I’m less interested in the lies candidates tell voters to get elected than the ones organizers like me tell people to get them to join, stick with, or fund a campaign.

Often, we lie about whether we can win. As I’ve written previously, I worked on campaigns I was sure we were going to lose, but that I thought were worth fighting anyway. In 1995 and 1996, for instance, I helped build a field campaign to defeat California Proposition 209, which succeeded in outlawing affirmative action at every level of government. We didn’t have much of a chance, but we still built an army of volunteers statewide, in part by telling them that, though our opponents had the money, we had the people capable of engaging voters no one expected to participate.

So, we said we could win because we were thinking ahead. Proposition 209 represented a cynical effort (indeed, its authors called it the California Civil Rights Initiative) to harness white anxiety about what would soon be a nonwhite majority in California. We hoped that building a multi-racial coalition to fight this initiative, even if we lost, would prepare people for the struggles to come.

But did I really know we couldn’t win? At some point, I suppose I traded in one virtue — truthfulness — for another — hope. And then, to project confidence and encourage others to hope as well, I had to start believing my own lies (at least a bit).

The funny thing about hope, though, is that sometimes the lies you force yourself to believe turn out to be true. That’s what happened this year with the campaign in Nevada. You never have enough canvassers to talk to every voter, so you have to choose your main target groups. UNITE-HERE chose to target people of color in working-class neighborhoods who rarely or never participate in elections.

Voters in Nevada are unusual in that more than a third of them (37%) are registered to vote with a small party or have no party affiliation at all. This is the largest single group of voters in the state, and it included many of our targets. Registered Democrats have a 6% edge over Republicans in Nevada, but the question always is: Which way will the people in the mysterious middle vote — for us or them? During two weeks of early voting, I downloaded the statistics on the party affiliations of the voters in Washoe County, where I was working. Democrats were winning the mail-in ballots, but when it came to in-person voting, the Republicans were creaming us. It didn’t look good at all — except that the numbers of small-party or no-party voters dwarfed the consistent edge the Republicans held. Which way would they jump?

I typically kept those statistics to myself, since it wasn’t part of my job to look at them in the first place. In the upbeat daily briefing for our canvassing team leaders, I concentrated instead on reporting the crucial everyday numbers for us: How many doors did we knock on yesterday? How many conversations did we have with voters? How many supporters did we identify? Those numbers I could present with honest enthusiasm, pointing to improvements made, for instance, by working with individual canvassers on how to keep doors open and voters talking.

But the funny thing was this: the hope I was projecting turned out to be warranted. The strategy that failed in California in 1996 — bringing out unlikely voters in communities of workers and people of color — succeeded in Nevada in 2022. When we opened the mystery box, it turned out to contain voters for us.

One More Conversation

I once had a friend, Lauren, who, for years, had been a member of one of the political organizations that grew out of the 1960s radical group Students for a Democratic Society. She’d gone to meetings and demonstrations, collated newsletters, handed out flyers, and participated in a well-functioning system of collective childcare. One day, I asked her how the work was going.

“Oh,” she said. “I dropped out. I still spend every Wednesday night with Emma [the child whose care she had shared in that group], but I’m not doing political work anymore.”

“But why not?”

“I realized that everything about politics involves making people do things they don’t want to do and that’s not how I want to spend my life.”

Even now, years later, I can see her point. Whether it’s asking my fellow part-time university teachers to come to a union meeting, trying to get a stranger to accept a leaflet on the street, or convincing a potential voter to listen to me about why this election matters and should matter to them, my strange vocation often does involve attempting to get people to do things they don’t particularly want to do.

Of course, it’s because I do believe in whatever I’m trying to move them toward that I’m involved in such politics in the first place. Usually, it’s because I believe that my goal should be their goal, too, whether it’s racial or economic justice, women’s liberation, or just keeping the planet from burning up.

But that leads me to another character flaw politics requires. You could call it pride, or even arrogance; it’s the confidence that I know better than you what’s good for you. Oddly enough, it may turn out that it’s when I’m pushing the most selfish goals — when I’m working for something I myself need like a living wage or the right to control my own body — that my motives stand up best to my own scrutiny.

It’s then that I’m asking someone to participate in collective action for my own benefit, and what could be more honest than that?

Politics as a Vocation

"Politics as a Vocation" was the title of a well-known lecture by German sociologist Max Weber. In it, he famously defined the state as "a human community that (successfully) claims the monopoly of the legitimate use of physical force within a given territory." Even when the use of force is delegated to some other institution — say, the police — Weber argued that citizens accept the "right" of the police to use violence because it comes from the state. That source of legitimacy is the only thing that separates a police force (so to speak) from any other violent gang.

For Weber, politics meant either leading a state or influencing its leaders. So if a state controls the legitimate use of force, then politics involves deciding how that force is to be deployed — under what conditions and for what purposes. It’s a heavy responsibility that, he claimed, people take on for one of only two reasons: either as a means to an end (which could be anything from personal wealth to ending poverty) or for its own sake — for the pleasure and feeling of prestige that power bestows.

“The decisive means for politics,” Weber wrote, “is violence.” If he was right, then my friend’s intuition that politics is about making people do things they don’t want to do may not have been so off the mark. Even the form of politics that appears to challenge Weber’s premise — the tradition of nonviolent action — involves a form of coercion. Those who willingly expose themselves to political violence are also trying to make people do something they don’t want to do by invoking empathy (and possibly feelings of guilt).

If, in some fashion, all politics really does involve coercion, can a political life possibly be a morally good one? I still think so, but it requires tempering a commitment to a cause with what Weber called the “ethic of responsibility” — a willingness not only to honestly examine our motives but to genuinely consider the likely results when we choose to act on them. It’s not enough to have good intentions. It’s crucial to strive as well for good — if imperfect — outcomes.

“Politics,” Weber said, “is a strong and slow boring of hard boards. It takes both passion and perspective.” But there’s another kind of life he also recommended, even if with a bit of a sneer, to those who don’t measure up to the demands of politics as a vocation. Such people “would have done better,” he observed, “in simply cultivating plain brotherliness in personal relations.”

And therein lies the greatest moral danger for those of us who feel that our vocation is indeed politics: a contempt for that plain “brotherliness” (or sisterliness) that makes ordinary human life bearable. There’s a saying attributed to Carlos Fonseca, one of the founders of Nicaragua’s revolutionary party, the Sandinistas: “A man [of course, it’s a man!] who is tired has a right to rest. But a man who rests does not have the right to be in the vanguard.”

And there it is, a fundamental disrespect for ordinary human life, including the need for rest, that tempts the activist to feel her calling makes her better than the people she’s called to serve.

In the end, if we do politics at all, it should be precisely so that people can have ordinary lives, ones not constrained and distorted by the kinds of injustice political activists try to end.

“I’m just living my life,” my friend Allan told me. In truth, his life is far more admirable than he believes. I’d say that he has a vocation for kindness every bit as heroic as any political calling. We’re not the only folks he feeds. The day before he visited us, he’d delivered dinner to another friend after her shoulder surgery. He spends little on himself, so he can send most of the money he earns to his family in Central America. During the worst of the pandemic shutdown, he regularly checked in on all the old folks he knows, startling my partner and me into realizing that we’ve lived long enough to fall into the category of elders to be looked after.

At the end of this long political season, back home from Nevada, I find that I’m full of admiration for the life my friend Allan is “just living.” As I wait for the next Trumpist wave to rise, may I remember that “just living” is the whole point of doing politics.

Via Tomdispatch.com

Returning to Reno: In the Shadow of Roe’s Undoing https://www.juancole.com/2022/07/returning-shadow-undoing.html Mon, 11 Jul 2022 04:02:19 +0000 https://www.juancole.com/?p=205714 ( Tomdispatch.com ) – Recently, I told my friend Mimi that, only weeks from now, I was returning to Reno to help UNITE-HERE, the hospitality industry union, in the potentially nightmarish 2022 election. “Even though,” I added, “I hate electoral politics.”

She just laughed.

“What’s so funny?” I asked.

“You’ve been saying that as long as I’ve known you,” she replied with a grin.

How right she was. And “as long as I’ve known you” has been a pretty long time. We met more than a quarter of a century ago when my partner and I hired her as the first organizer in a field campaign to defeat Proposition 209. That ballot initiative was one of a series pandering to the racial anxieties of white Californians that swept through the state in the 1990s. The first of them was Prop 187, outlawing the provision of government services, including health care and education, to undocumented immigrants. In 1994, Californians approved that initiative by a 59% to 41% vote. A federal court, however, found most of its provisions unconstitutional and it never went into effect.

We weren’t so lucky with Proposition 209, which, in 1996, outlawed affirmative-action programs statewide at any level of government or public service. Its effects reverberate to this day, not least at the prestigious University of California’s many campuses.

A study commissioned 25 years later by its Office of the President revealed that “Prop 209 caused a decline in systemwide URG enrollment by at least twelve percent.” URGs are the report’s shorthand for “underrepresented groups” — in other words, Latinos, Blacks, and Native Americans. Unfortunately, Proposition 209’s impact on the racial makeup of the university system’s students has persisted for decades and, as that report observed, “led URG applicants to cascade out of UC into measurably less-advantageous universities.” Because of UC’s importance in California’s labor market, “this caused a decline in the total number of high-earning ($100,000) early-30s African American and Hispanic/Latinx Californians by at least three percent.”

Yes, we lost the Prop 209 election, but the organization we helped start back in 1995, Californians for Justice, still flourishes. Led by people of color, it’s become a powerful statewide advocate for racial justice in public education with a number of electoral and legislative victories to its name.

Shortcomings and the Short Run

How do I hate thee, electoral organizing? Let me count the ways. First, such work requires that political activists like me go wide, but almost never deep. It forces us to treat voters like so many items to be checked off a list, not as political actors in their own right. Under intense time pressure, your job is to try to reach as many people as possible, immediately discarding those who clearly aren’t on your side and, in some cases, even actively discouraging them from voting. In the long run, treating elections this way can weaken the connection between citizens and their government by reducing all the forms of democratic participation to a single action, a vote. Such political work rarely builds organized power that lasts beyond Election Day.

In addition, electoral campaigns sometimes involve lying not just to voters, but even to your own canvassers (not to speak of yourself) about whether you can win or not. In bad campaigns — and I’ve seen a couple of them — everyone lies about the numbers: canvassers about how many doors they’ve knocked on; local field directors about what their canvassers have actually done; and so on up the chain of command to the campaign director. In good campaigns, this doesn’t happen, but those may not, I suspect, be in the majority. And lying, of course, can become a terrible habit for anyone hoping to construct a strong organization, not to mention a better world.

Lying, as the philosopher Immanuel Kant argued, is a way of treating people as if they were merely things to be used. Electoral campaigns can often tempt organizers to take just such an instrumental approach to others, assuming voters and campaign workers have value only to the extent that they can help you win. Such an approach, however efficient in the short run, doesn’t build solidarity or democratic power for the long haul. Sometimes, of course, the threat is so great — as was true when it came to the possible reelection of Donald Trump in 2020 — that the short run simply matters more.

Another problem with elections? Campaigns so often involve convincing people to do something they’ve come to think of as a waste of time, namely, going to the polls. A 2018 senatorial race I worked on, for example, focused on our candidate’s belief in the importance of raising the minimum wage. And yes, we won that election, but four years later, the federal minimum wage is still stubbornly stuck at $7.25 an hour, though not, of course, through any fault of our candidate. Still, the voters who didn’t think electing Nevada Senator Jacky Rosen would improve their pay weren’t wrong.

On the other hand, the governor we helped elect that same year (and for whose reelection I’ll be working again soon) did come through for working Nevadans by, for example, signing legislation that guarantees a worker’s right to be recalled before anyone new is hired when a workplace reopens after a Covid shutdown.

You’ll hear some left-wing intellectuals and many working people who are, in the words of the old saying, “too broke to pay attention,” claim that elections don’t change anything. But such a view grows ever harder to countenance in a world where a Supreme Court disastrously reshaped by Donald Trump and Mitch McConnell is hell-bent on reshaping nearly the last century of American political life. It’s true that overturning Roe v. Wade doesn’t affect my body directly. I’m too old to need another abortion. Still, I’m just as angry as I was in 2016 at people who couldn’t bring themselves to vote for Hillary Clinton because she wasn’t Bernie Sanders. As I told such acquaintances at the time, “Yes, we’ll hate her and we’ll have to spend the next four years fighting her, but on the other hand, SUPREME COURT, SUPREME COURT, SUPREME COURT!”

Okay, maybe that wasn’t exactly the most elegant of arguments, but it was accurate, as anyone will tell you who’d like to avoid getting shot by a random heat-packing pedestrian, buried under the collapsing wall between church and state, or burned out in yet another climate-change-induced conflagration.

If Voting Changed Anything…

Back in 1996, as Election Day approached, Californians for Justice had expanded from two offices — in Oakland and Long Beach — to 11 around the state. We were paying a staff of 45 and expanding (while my partner and I lay awake many nights wondering how we’d make payroll at the end of the week). We were ready for our get-out-the-vote push.

Just before the election, one of the three organizations that had given us seed money published its monthly newsletter. The cover featured a photo of a brick wall spray-painted with the slogan: “If voting changed anything, they’d make it illegal.” Great, just what we needed!


It’s not as if I didn’t agree, at least in part, with the sentiment. Certainly, when it comes to foreign policy and the projection of military force globally, there has been little difference between the two mainstream political parties. Since the end of World War II, Democrats and Republicans have cooperated in a remarkably congenial way when it comes to this country’s disastrous empire-building project, while financially rewarding the military-industrial complex, year after year, in a grandiose fashion.

Even in the Proposition 209 campaign, my interest lay more in building long-term political power for California communities of color than in a vote I already knew we would lose. Still, I felt then and feel today that there’s something deeply wrong with the flippant response of some progressives that elections aren’t worth bothering about. I’d grown up in a time when, in the Jim Crow South, voting was still largely illegal for Blacks and people had actually died fighting for their right to vote. Decades earlier, some of my feminist forebears had been tortured while campaigning for votes for women.

Making Voting Illegal Again

In 1965, President Lyndon Johnson signed the Voting Rights Act, explicitly outlawing any law or regulation that “results in the denial or abridgement of the right of any citizen to vote on account of race or color.” Its specific provisions required states or counties with a history of voter suppression to receive “pre-clearance” from the attorney general or the District Court for the District of Columbia for any further changes in election laws or practices. Many experts considered this provision the heart of that Act.

Then, in 2013, in Shelby County v. Holder, a Supreme Court largely shaped by Republican presidents tore that heart right out. Essentially, the court ruled that, because those once excluded from voting could now do so, such jurisdictions no longer needed preclearance to change their voting laws and regulations. In other words, because it was working, it should be set aside.

Not surprisingly, some states moved immediately to restrict access to voting rights. According to the Brennan Center for Justice, “within 24 hours of the ruling, Texas announced that it would implement a strict photo ID law. Two other states, Mississippi and Alabama, also began to enforce photo ID laws that had previously been barred because of federal preclearance.” Within two months, North Carolina passed what that center called “a far-reaching and pernicious voting bill” which:

“instituted a strict photo ID requirement; curtailed early voting; eliminated same day registration; restricted preregistration; ended annual voter registration drives; and eliminated the authority of county boards of elections to keep polls open for an additional hour.”

Fortunately, the Fourth Circuit Court of Appeals struck down the North Carolina law in 2016, and surprisingly the Supreme Court let that ruling stand.

But as it turned out, the Supremes weren’t done with the Voting Rights Act. In 2021, the present Trumpian version of the court issued a ruling in Brnovich v. Democratic National Committee upholding Arizona’s right to pass laws requiring people to vote only in precincts where they live, while prohibiting anyone who wasn’t a relative of the voter from hand-delivering mail-in ballots to the polls. The court held that, even though in practice such measures would have a disproportionate effect on non-white voters, as long as a law was technically the same for all voters, it didn’t matter that, in practice, it would become harder for some groups to vote.

Writing for the majority, Justice Samuel Alito declared that states have a different and more important interest in such voting restrictions: preventing voter fraud. In other words — at least in the minds of two-thirds of the present Supreme Court — some version of Donald Trump’s big lie about rigged elections and voter fraud has successfully replaced racist voter suppression as the primary future danger to free and fair elections.

Maybe elections do change something. Otherwise, why, in the wake of the 2020 elections, would “they” (including Republican-controlled state legislatures across significant parts of the country) be so intent on making it ever harder for certain people to vote? And if you think that’s bad, wait until the Supremes rule next year on the fringe legal theory of an “independent state legislature.” We may well see the court decide that a state’s legislature can legally overrule the popular vote in a federal election — just in time for the 2024 presidential race.

The Future Awaits Us

A couple of times a week I talk by phone with another friend. We began doing this at the height of George W. Bush’s and Dick Cheney’s vicious “war on terror.” We’d console each other when it came to the horrors of that conflict, including the illegal invasion of Iraq, the deaths and torture of Iraqi and Afghan civilians, and the seemingly endless expansion of American imperial meddling. We’re still doing it. Somehow, every time we talk, it seems as if the world has travelled one more mile on its way to hell in a handbasket.

Both of us have spent our lives trying, in our own modest fashion, to gum up the works of capitalism, militarism, and authoritarian government. To say that we’ve been less than successful would certainly be understating things. Still, we do keep at it, while discussing what in the world we can still do.

At this point in my life and my country’s slide into authoritarian misery, I often find it hard even to imagine what would be useful. Faced with such political disorientation, I fall back on a core conviction that, when the way forward is unclear, the best thing we can do is give people the experience of achieving in concert what they could never achieve by themselves. Sometimes, the product of an organizing drive is indeed victory. Even when it isn’t though, helping create a group capable of reading a political situation and getting things done, while having one another’s backs, is also a kind of victory.

That’s why, this election season, my partner and I are returning to Reno to join hotel housekeepers, cooks, and casino workers trying to ensure the reelection of two Democrats, Senator Catherine Cortez Masto and Governor Steve Sisolak, in a state where the margin of Democratic Party victories hasn’t grown since 2012.

From our previous experience, we know one thing: we’ll be working in a well-run campaign that won’t waste anyone’s time and has its eye on the future. As I wrote about the union’s 2020 presidential campaign for Joe Biden, more than winning a difficult election is at stake. What’s also important is building organized power for working people. In other words, providing the kind of training and leadership development that will send “back to every hotel, restaurant, casino, and airport catering service leaders who can continue to organize and advocate for their working-class sisters and brothers.”

I still hate electoral politics, but you don’t always get to choose the terrain you’re fighting on. Through its machinations at the federal, state, and county level, the Republican Party has been all but screaming its plans to steal the next presidential election. It’s no exaggeration to say that preserving some form of democratic government two years from now depends in part on keeping Republicans from taking over Congress, especially the Senate, this year.

So, it’s back to Reno, where the future awaits us. Let’s hope it’s one we can live with.

Copyright 2022 Rebecca Gordon

What Does It Mean that Women Now Dominate Higher Education? https://www.juancole.com/2022/06/dominate-higher-education.html Fri, 10 Jun 2022 04:02:48 +0000 https://www.juancole.com/?p=205121 ( Tomdispatch.com ) – In the last week of her life, my mother extracted a promise from me. “Make sure,” she said, “that Orion goes to college.”

I swore that I would, although I wasn’t at all sure how I’d make it happen. Even in the year 2000, average tuitions were almost 10 times what my own undergraduate school had charged 30 years earlier. I knew that sending my nephew to college would cost more money than I’d have when the time came. If he was going to college, like his aunt before him, he’d need financial help. The difference was that his “help” was likely to come not as a grant, but as life-defining loans.

“Orion,” by the way, is a pseudonym for my brother’s son, my parents’ only grandchild. To the extent that any of us placed family hopes in a next generation, he’s borne them all. Orion was only five years old when I made that promise and he lived 3,000 miles away in a depressed and depressing de-industrialized town in New York’s Hudson River Valley. We’d only met in person once at that point. Over the years, however, we kept in touch by phone, later by text message, and twice he even visited my partner and me in San Francisco.

A little more than a decade after I made that promise, Orion graduated from high school. I thought that with a scholarship, loans, and financial help from his father and us, we might indeed figure out how to pay the staggering costs of a college education, which now averages $35,000 a year, having doubled in this century alone.

It turned out, however, that money wasn’t the only obstacle to making good on my promise. There was another catch as well. Orion didn’t want to go to college. Certainly, the one guidance counselor at his 1,000-student public high school had made no attempt to encourage either him or, as far as I could tell, many of his classmates to plan for a post-high-school education. But would better academic counseling have made a difference? I doubt it.

A bright boy who had once been an avid reader, Orion was done with schooling by the time he’d turned 18. He made that clear when I visited him for a talk about his future. He had a few ideas about what he might do: join the military or the New York state police. In reality, though, it turned out that he had no serious interest in either of those careers.

He might have been a disaffected student, but he was — and is — a hard worker. Over the next few years, despite sky-high unemployment in the Hudson River Valley, he always had a job. He made and delivered pizzas. He cleaned rooms at a high-end hotel for wealthy equestrians. He did pick-up carpentry. And then he met an older tradesman who gave him an informal apprenticeship in laying floors and setting tile. Orion learned how to piece together hardwood and install carpeting. He proudly showed me photos of the floors he’d laid and the kitchens he’d tiled.

Eventually, he had to part ways with his mentor, who also happened to be a dangerous drunk. We had another talk and I reminded him of my promise to my mother. I’d recently gotten an unexpected windfall — an advance on a book I was writing, American Nuremberg — which put me in a position to help set him up in business. He bought a van, completed his tool set, and paid for a year’s insurance. Now, 10 years after graduating from high school, he’s making decent money as a respected tradesman and is thinking about marrying his girlfriend. He’s made himself a life without ever going to college.

I worry about him, though. Laying floors is a young person’s trade. A few years on your knees, swinging a hammer all day, will tear your joints apart. He can’t do this forever.

The Rising of the Women

Still, it turns out that my nephew isn’t the only young man to opt out of more schooling. I’ve seen this in my own classrooms and the data confirms it as a national and international trend.

I started teaching ethics at the University of San Francisco in 2005. It soon struck me that there were invariably more women in my classes than men. Nor was the subject matter responsible, since everyone had to pass a semester of ethics to graduate from that Jesuit university. No, as it turned out, my always-full classes represented the school’s overall gender balance. For a few years, I wondered whether such an overrepresentation of women could be attributed to parents who felt safer sending their daughters to a Catholic school, especially in a city with San Francisco’s reputation for sex, drugs, and rock ‘n’ roll.


Recently, though, I came to realize that my classes were simply part of a much larger phenomenon already beginning to worry some observers. Until about 1990, men invariably outnumbered women at every level of post-secondary education, and more of them graduated, too. Whether at community colleges (once those became more prevalent), four-year colleges, or post-graduate programs, more men than women earned two-year, four-year, master's, and doctorate-level degrees.

It was during the 1970s that the ratio began to shift. In 1970, among recent high-school graduates, 32% of the men and just 20% of the women enrolled in post-secondary institutions. By 1990, equal percentages (around 32%) were going to college. In the years that followed, college attendance continued to increase for both sexes, but significantly faster for women, who surpassed men in 1994. Since the end of the 1990s, men's college attendance has stayed relatively stable at about 37% of high-school graduates.

Women's campus presence, however, has only continued to climb, with 44% of recent female high-school graduates enrolled in post-secondary schools by 2019.

So, the problem, if there is one, isn’t that men have stopped going to college. A larger proportion of them, in fact, attend today than at any time in our history. It’s just that an even larger proportion of women are doing so.

As a result, if you visit a college campus, you should see roughly three women — now about 60% of all college students — for every two men. And that gap has been growing ever wider, even during the disruption of the Covid pandemic.

Not only do more women now attend college than men, but they’re more likely to graduate and receive degrees. According to the National Center for Educational Statistics, in 1970, men received 57% of both two- and four-year degrees, 61% of master’s degrees, and 90% of doctorates. By 2019, women were earning the majority of degrees at all levels.

One unexpected effect of this growing college gender gap is that it’s becoming harder for individual women to get into selective schools. The Hechinger Report, a non-profit institution focused on education, lists a number of well-known ones where male applicants have a better chance of being accepted, including:

“Boston, Bowdoin and Swarthmore colleges; Brown, Denison, Pepperdine, Pomona, Vanderbilt and Wesleyan universities; and the University of Miami. At each school, men were at least 2 percentage points more likely than women to be accepted in both 2019 and 2020. Pitzer College admitted 20% of men last year compared to 15% of women, and Vassar College accepted 28% of men compared to 23% of women. Both had more than twice as many female applicants as male applicants.”

Even for Vassar, once a women’s college, having too many women is now apparently a problem.

In addition, in recent years, despite those lower acceptance rates for women at elite schools, colleges have generally had to deal with declining enrollments, a trend only accelerated by the Covid pandemic. As Americans postpone having children and have fewer when they do, the number of people reaching college age is actually shrinking. Two-year colleges have been especially hard hit.

And there’s the debt factor. Like my nephew Orion, more potential students, especially men, are now weighing the problem of deferring their earnings, while acquiring a staggering debt load from their years at college. Some of them are opting instead to try to make a living without a degree. Certain observers think this shift has been partially caused by a pandemic-fueled rise in wages in the lower tiers of the American work force.

A Mystery

Why are there fewer men than women in college today? On this, theories abound, but genuine answers are few. Conservatives offer a number of explanations that echo their culture-war slogans, including that “the atmosphere on the nation’s campuses has become increasingly hostile to masculinity.”

A Wall Street Journal op-ed ascribed it in part to “labor-saving innovations in household management and child care — automatic washing machines, disposable diapers, inexpensive takeout restaurants — as well as new forms of birth control [that] helped women pursue college degrees and achieve new vocational ambitions.” But the biggest problem, write the op-ed’s authors, may be that girls simply do better in elementary and secondary school, which discourages boys from going on to college. This problem, they argue, is attributable not only to the advent of washing machines, but ultimately to the implementation of the Great Society’s liberal social policies. Citing Charles Murray, the reactionary co-author of the 1994 book The Bell Curve, they blame women’s takeover of higher education on the progressive social policies of the 1960s, the rise of the “custodial” (or welfare) state, and the existence of a vast pool of jailed men. They write:

“[T]here are about 1.24 million more men who are incarcerated than women, largely preventing them from attending traditional college. Scholars such as Charles Murray have long demonstrated that expanded government entitlements following the Great Society era have reduced traditional family formation, reduced incentives to excel both in school and on the job, and increased crime.”

Critics to the left have also cited male incarceration as a factor in the college gender divide, although they’re more likely to blame racist police and policies. Sadly, the devastation caused by jailing so many Black, Latino, and Native American men has only begun to be understood, but given the existing racial divide in college attendance, I seriously doubt that many of those men would be in college even if they weren’t in prison.

Some observers have also suggested that, given the staggering rise in college tuitions, young men, especially from the working and middle classes, often make a sound if instinctive decision that a college education will not repay their time, effort, and the debt load it entails. Like my nephew, they may indeed be better off entering a well-paying trade and getting an early start on building their savings.

Do Women Need College More Than Men?

If some young men now believe that college won’t reward them sufficiently to warrant the investment, many young women have rightly judged that they will need a college education to have any hope of earning a decent living. It’s no accident that their college enrollment skyrocketed in the 1970s. After a long post-World-War II economic expansion, that was the moment when wages in this country first began stagnating, a trend that continued in the 1980s when President Ronald Reagan launched his attacks on unions, while the federal minimum wage barely rose. In fact, it has remained stuck at $7.25 per hour since 2009.

First established in 1938, the minimum wage was intended to allow a single adult (then assumed to be a man) to support a non-earning adult (assumed to be his wife) and several children. It was called a "breadwinner" wage. The feminism that made work outside the home possible for women, saving the lives and sanity of so many of us, provided a useful distraction from those stagnant real wages, rising inequality, and the increased immiseration of millions (not to speak of the multiplication of billionaires).

In the last few decades of the twentieth century, many women came to believe that working for money was their personal choice. In truth, I suspect that they were also responding to new economic realities and the end of that “breadwinner” wage. I think the college gender gap, which grew ever wider as wages fell, is at least in part a consequence of those changes. Few of my women students believe that they have a choice when it comes to supporting themselves, even if they haven’t necessarily accepted how limited the kind of work they’re likely to find will be. Whether they form partnered households or not, they take it for granted that they’ll have to support themselves financially.

This makes a college degree even more important, since having it has a bigger impact on women's earnings than on men's. A study by the Federal Reserve Bank of St. Louis confirmed this. Reviewing 2015 census data, it showed that the average wage for a man with only a high-school diploma was around $12 per hour. Women earned 24.4% less than that, or about $9 hourly. On the other hand, women got a somewhat greater boost (28%) from earning a two-year degree than men (22%). For a four-year degree, it was 68% for women and 62% for men.

In other words, although a college education improves income prospects for both genders, it does more for women — even if not enough to raise their income to the level of men with the same education. The income gender gap remains stubbornly fixed in men's favor. It seems that, like Alice in Through the Looking-Glass, women still have to run faster just to avoid losing ground. This means that, for us, earning a decent living requires at least some college, which is less true for men.

What Does the Future Hold?

Sadly, as college becomes ever more the preserve of women, I suspect it will also lose at least some of its social and economic value. They let us in and we turned out to be too good at it. My prediction? Someday, college will be dismissed as something women do and therefore not an important marker of social or economic worth.

As with other realms that became devalued when women entered them (secretarial work, for example, or family medicine), I expect that companies will soon begin dropping the college-degree requirement for applicants.

In fact, it already seems to be happening. Corporations like IBM, Accenture, and Bank of America have begun opting for “skills-based” rather than college-based hiring. According to a CNBC report, a recent Harvard Business School study examined job postings for software quality-assurance engineers and “found that only 26% of Accenture’s postings for the job contained a degree requirement. At IBM, just 29% did.” Even the government is dropping some college-degree requirements. According to the same report, in January 2021, the White House issued an executive order on “Limiting [the] Use of Educational Requirements in Federal Service Contracts.” When hiring for IT positions, the order says, considering only those with college degrees “excludes capable candidates and undermines labor market efficiencies.” And recently, Maryland announced that it’s dropping the college graduation requirement for thousands of state positions.

Of course, this entire economic argument assumes that the value of a college education is purely extrinsic and can be fully measured in dollars. As a long-time college teacher, I still believe that education has an intrinsic value, beyond preparing “job-ready” workers or increasing their earning potential. At its best, college offers a unique opportunity to encounter new ideas in expansive ways, learn how to weigh evidence and arguments, and contemplate what it means to be a human being and a citizen of the world. It can make democracy possible in a time of creeping authoritarianism.

What kind of future do we face in a world where such an experience could be reduced, like knitting (which was once an honorable way for both sexes to earn a living), to a mere hobby for women?

Copyright 2022 Rebecca Gordon

Via Tomdispatch.com

I may have failed as a Tax Resister, but Corporate America is Hugely Successful at it https://www.juancole.com/2022/04/resister-corporate-successful.html Wed, 13 Apr 2022 04:02:41 +0000 https://www.juancole.com/?p=204018 ( Tomdispatch.com) – Every April, as income-tax returns come due, I think about the day 30 years ago when I opened my rented mailbox and saw a business card resting inside. Its first line read, innocently enough, “United States Treasury.” It was the second line — “Internal Revenue Service” — that took my breath away. That card belonged to an IRS revenue agent and scrawled across it in blue ink was the message: “Call me.”

I’d used that mailbox as my address on the last tax return I’d filed, eight years earlier. Presumably, the agent thought she’d be visiting my home when she appeared at the place where I rented a mailbox, which, as I would discover, was the agency’s usual first step in running down errant taxpayers. Hands shaking, I put a quarter in a pay phone and called my partner. “What’s going to happen to us?” I asked her.

Resisting War Taxes

I knew that the IRS wasn’t visiting me as part of an audit of my returns, since I hadn’t filed any for eight years. My partner and I were both informal tax resisters — she, ever since joining the pacifist Catholic Worker organization; and I, ever since I’d returned from Nicaragua in 1984. I’d spent six months traveling that country’s war zones as a volunteer with Witness for Peace. My work involved recording the testimony of people who had survived attacks by the “Contras,” the counterrevolutionary forces opposing the leftist Sandinista government then in power (after a popular uprising deposed the U.S.-backed dictator, Anastasio Somoza). At the time, the Contras were being illegally supported by the administration of President Ronald Reagan.

With training and guidance from the CIA, they were using a military strategy based on terrorizing civilians in the Nicaraguan countryside. Their targets included newly built schools, clinics, roads, and phone lines — anything the revolutionary government had, in fact, achieved — along with the campesinos (the families of subsistence farmers) who used such things. Contra attacks very often involved torture: flaying people alive, severing body parts, cutting open the wombs of pregnant women. Nor were such acts mere aberrations. They were strategic choices made by a force backed and directed by the United States.

When I got back to the United States, I simply couldn’t imagine paying taxes to subsidize the murder of people in another country, some of whom I knew personally. I continued working, first as a bookkeeper, then at a feminist bookstore, and eventually at a foundation. But with each new employer, on my W-4 form I would claim that I expected to owe no taxes that year, so the IRS wouldn’t take money out of my paycheck. And I stopped filing tax returns.

Not paying taxes for unjust wars has a long history in this country. It goes back at least to Henry David Thoreau's refusal to pay them to support the Mexican-American War (1846-1848). His act of resistance landed him in jail for a night and led him to write On the Duty of Civil Disobedience, dooming generations of high-school students to reading the ruminations of a somewhat self-satisfied tax resister. Almost a century later, labor leader and pacifist A.J. Muste revived Thoreau's tradition, once even filing a copy of the Duty of Civil Disobedience in place of his Form 1040. After supporting textile factory workers in their famous 1919 strike in Lawrence, Massachusetts, and some 20 years later helping form and run the Amalgamated Textile Workers of America (where my mother once worked as a labor organizer), Muste eventually came to serve on the board of the War Resisters League (WRL).

For almost a century now, the WRL and the even older Fellowship of Reconciliation, along with other peace groups, have promoted antiwar tax resistance as a nonviolent means of confronting this country's militarism. In recent years, both organizations have expanded their work beyond opposing imperial adventures overseas to stand against racist, militarized policing at home as well.

Your Tax Dollars at Work

Each year, the WRL publishes a “pie chart” poster that explains “where your income tax money really goes.” In most years, more than half of it is allocated to what’s euphemistically called “defense.” This year’s poster, distinctly an outlier, indicates that pandemic-related spending boosted the non-military portion of the budget above the 50% mark for the first time in decades. Still, at $768 billion, we now have the largest Pentagon budget in history (and it’s soon to grow larger yet). That’s a nice reward for a military whose main achievements in this century are losing major wars in Iraq and Afghanistan.

But doesn’t the war in Ukraine justify all those billions? Not if you consider that none of the billions spent in previous years stopped Russia from invading. As Lindsay Koshgarian argues at Newsweek, “Colossal military spending didn’t prevent the Russian invasion, and more money won’t stop it. The U.S. alone already spends 12 times more on its military than Russia. When combined with Europe’s biggest military spenders, the U.S. and its allies on the continent outspend Russia by at least 15 to 1. If more military spending were the answer, we wouldn’t be in this situation.”

“Defense” spending could, however, just as accurately be described as welfare for military contractors, because that’s where so much of the money eventually ends up. The top five weapons-making companies in 2021 were Lockheed Martin, Raytheon Technology, Boeing, Northrup Grumman, and General Dynamics. Together, they reaped $198 billion in taxpayer funds last year alone. In 2020, the top 100 contractors took in $551 billion. Of course, we undoubtedly got some lovely toys for our money, but I’ve always found it difficult to live in — or eat — a drone. They’re certainly useful, however, for murdering a U.S. citizen in Yemen or so many civilians elsewhere in the Greater Middle East and Africa.

The Pentagon threatens the world with more than the direct violence of war. It’s also a significant factor driving climate change. The U.S. military is the world’s largest institutional consumer of oil. If it were a country, the Pentagon would rank 55th among the world’s carbon emitters.


While the military budget increases yearly, federal spending that actually promotes human welfare has fallen over the last decade. In fact, such spending for the program most Americans think of when they hear the word "welfare" — Temporary Assistance for Needy Families, or TANF — hasn't changed much since 1996, the year the Personal Responsibility and Work Opportunity Reconciliation Act (so-called welfare reform) took effect. In 1997, federal expenditures for TANF totaled about $16.6 billion. That figure has remained largely unchanged. However, according to the Congressional Research Service, since the program began, such expenditures have actually dropped 40% in value, thanks to inflation.

Unlike military outlays, spending for the actual welfare of Americans doesn't increase over time. In fact, as a result of the austerity imposed by the 2011 Budget Control Act, the Center on Budget and Policy Priorities reports that "by 2021 non-defense funding (excluding veterans' health care) was about 9% lower than it had been 11 years earlier after adjusting for inflation and population growth." Note that Congress passed that austerity measure a mere three years after the subprime lending crisis exploded, initiating the Great Recession, whose reverberations still ring in our ears.

This isn’t necessarily how taxpayers want their money spent. In one recent poll, a majority of them, given the choice, said they would prioritize education, social insurance, and health care. A third would rather that their money not be spent on war at all. And almost 40% believed that the federal government simply doesn’t spend enough on social-welfare programs.

Death May Be Coming for Us All, But Taxes Are for the Little People

Pollsters don’t include corporations like Amazon, FedEx, and Nike in their surveys of taxpayers. Perhaps the reason is that those corporate behemoths often don’t pay a dollar in income tax. In 2020, in fact, 55 top U.S. companies paid no corporate income taxes whatsoever. Nor would the survey takers have polled billionaires like Jeff Bezos, Elon Musk, or Carl Icahn, all of whom also manage the neat trick of not paying any income tax at all some years.

In 2021, using “a vast trove of Internal Revenue Service data on the tax returns of thousands of the nation’s wealthiest people, covering more than 15 years,” ProPublica published a report on how much the rich really pay in taxes. The data show that, between 2014 and 2018, the richest Americans paid a measly “true tax” rate of 3.4% on the growth of their wealth over that period. The average American — you — typically pays 14% of his or her income each year in federal income tax. As ProPublica explains:

“America’s billionaires avail themselves of tax-avoidance strategies beyond the reach of ordinary people. Their wealth derives from the skyrocketing value of their assets, like stock and property. Those gains are not defined by U.S. laws as taxable income unless and until the billionaires sell.”

So, if the rich avoid paying taxes by holding onto their assets instead of selling them, where do they get the money to live like the billionaires they are? The answer isn't complicated: they borrow it. Using their wealth as collateral, they typically borrow millions of dollars to live on, using the interest on those loans to offset any income they might actually receive in a given year and so reducing their taxes even more.

While they do avoid paying taxes, I’m pretty sure those plutocrats aren’t tax resisters. They’re not using such dodges to avoid paying for U.S. military interventions around the world, which was why I stopped paying taxes for almost a decade. Through the Reagan administration and the first Bush presidency, with the Savings and Loan debacle and the first Gulf War, there was little the U.S. government was doing that I wanted to support.

These days, however, having lived through the "greed is good" decade, having watched a particularly bizarre version of American individualism reach its pinnacle in the presidency of billionaire Donald Trump, I think about taxes a bit differently. I still don't want to pay for the organized global version of murder that is war, American-style, but I've also come to see that taxes are an important form of communal solidarity. Our taxes allow us, through the government, to do things together we can't do as individuals — like generating electricity or making sure our food is clean and safe. In a more truly democratic society, people like me might feel better about paying taxes, since we'd be deciding more collectively how to spend our common wealth for the common good. We might even buy fewer drones.

Until that day comes, there are still many ways, as the War Resisters League makes clear, to resist paying war taxes, should you choose to do so. I eventually started filing my returns again and paid off eight years of taxes, penalties, and interest. It wasn’t the life decision I’m proudest of, but here’s what happened.

“Too Distraught

The method I chose was, as I’ve said, not to file my tax returns, which, if your employer doesn’t withhold any taxes and send them to the feds, denies the federal government tax revenue from you. Mind you, for most of those years I wasn’t making much money. We’re talking about hundreds of dollars, not hundreds of thousands of dollars in lost tax revenue. Over those years, I got just the occasional plaintive query from the IRS about whether I’d forgotten my taxes. But during the mid-1980s, the IRS upgraded its computers, improving its ability to capture income reported by employers and so enabling it to recreate the returns a taxpayer should have filed, but didn’t. And so, in 1992 an IRS agent visited my mailbox.

Only a month earlier, a friend, my partner, and I had bought a house together. So, when I saw that “Call me,” on the agent’s business card, I was terrified that my act of conscience was going to lose us our life savings. Trembling, I called the revenue agent and set up an appointment at the San Francisco federal building, a place I only knew as the site of many antiwar demonstrations I’d attended.

I remember the agent meeting us at the entrance to a room filled with work cubicles. I took a look at her and my gaydar went off. “Oh, my goodness,” I thought, “she’s a lesbian!” Maybe that would help somehow — not that I imagined for a second that my partner and I were going to get the “family discount” we sometimes received from LGBT cashiers.

The three of us settled into her cubicle. She told me that I would have to file returns from 1986 to 1991 (the IRS computers, it turned out, couldn’t reach back further than that) and also pay the missing taxes, penalties, and interest on all of it. With an only partially feigned quaver in my voice, I asked, “Are you going to take our house away?”

She raised herself from her chair just enough to scan the roomful of cubicles around us, then sat down again. Silently, she shook her head. Well, it may not have been the family discount, but it was good enough for me.

Then she asked why I hadn’t filed my taxes and, having already decided I was going to pay up, I didn’t explain anything about those Nicaraguan families our government had maimed or murdered. I didn’t say why I’d been unwilling or what I thought it meant to pay for this country’s wars in Central America or preparations for more wars to come. “I just kept putting it off,” I said, which was true enough, if not the whole truth.

Somehow, she bought that and asked me one final question, “By the way, what do you do for a living?”

“I’m an accountant,” I replied.

Her eyebrows flew up and she shook her head, but that was that.

Why did I give up so easily? There were a few reasons. The Contra war in Nicaragua had ended after the Sandinistas lost the national elections in 1990. Nicaraguans weren’t stupid. They grasped that, as long as the Sandinistas were in power, the U.S. would continue to embargo their exports and arm and train the Contras. And I’d made some changes in my own life. After decades of using part-time paid work to support my full-time activism, I’d taken a “grown-up” job to help pay my ailing and impoverished mother’s rent, once I convinced her to move from subsidized housing in Cambridge, Massachusetts, to San Francisco. And, of course, I’d just made a fundamental investment of my own in the status quo. I’d bought a house. Even had I been willing to lose it, I couldn’t ask my co-owners to suffer for my conscience.

But in the end, I also found I just didn’t have the courage to defy the government of the world’s most powerful country.

As it happened, I wasn’t the only person in the Bay Area to get a visit from a revenue agent that year. The IRS, it turned out, was running a pilot program to see whether they could capture more unpaid taxes by diverting funds from auditing to directly pursuing non-filers like me. Several resisters I knew were caught in their net, including my friend T.J.

An agent came to T.J.’s house and sat at his kitchen table. Unlike “my” agent, T.J.’s not only asked him why he hadn’t filed his returns, but read from a list of possible reasons: “Did you have a drug or alcohol problem? Were you ill? Did you have trouble filling out the forms?”

“Don’t you have any political reasons on your list?” T.J. asked.

The agent looked doubtful. “Political? Well, there’s ‘too distraught.’”

“That’s it,” said T.J. “Put down ‘too distraught.’”

T.J. died years ago, but I remember him every tax season when I again have to reckon with just how deeply implicated all of us are in this country’s military death machine, whether we pay income taxes or not. Still, so many of us keep on keeping on, knowing we must never become too distraught to find new ways to oppose military aggression anywhere in the world, including, of course, Ukraine, while affirming life as best we can.

Copyright 2022 Rebecca Gordon

Via Tomdispatch.com

The Rise of the (Real) Murderbots and why we Need to Abolish Them before it is too Late https://www.juancole.com/2022/01/murderbots-abolish-before.html Mon, 10 Jan 2022 05:02:56 +0000 https://www.juancole.com/?p=202300 ( Tomdispatch.com ) – Here's a scenario to consider: a military force has purchased a million cheap, disposable flying drones, each the size of a deck of cards, each capable of carrying three grams of explosives — enough to kill a single person or, in a "shaped charge," pierce a steel wall. They've been programmed to seek out and "engage" (kill) certain human beings, based on specific "signature" characteristics like carrying a weapon, say, or having a particular skin color. They fit in a single shipping container and can be deployed remotely. Once launched, they will fly and kill autonomously without any further human action.

Science fiction? Not really. It could happen tomorrow. The technology already exists.

In fact, lethal autonomous weapons systems (LAWS) have a long history. During the spring of 1972, I spent a few days occupying the physics building at Columbia University in New York City. With a hundred other students, I slept on the floor, ate donated takeout food, and listened to Allen Ginsberg when he showed up to honor us with some of his extemporaneous poetry. I wrote leaflets then, commandeering a Xerox machine to print them out.

And why, of all campus buildings, did we choose the one housing the Physics department? The answer: to convince five Columbia faculty physicists to sever their connections with the Pentagon’s Jason Defense Advisory Group, a program offering money and lab space to support basic scientific research that might prove useful for U.S. war-making efforts. Our specific objection: to the involvement of Jason’s scientists in designing parts of what was then known as the “automated battlefield” for deployment in Vietnam. That system would indeed prove a forerunner of the lethal autonomous weapons systems that are poised to become a potentially significant part of this country’s — and the world’s — armory.

Early (Semi-)Autonomous Weapons

Washington faced quite a few strategic problems in prosecuting its war in Indochina, including the general corruption and unpopularity of the South Vietnamese regime it was propping up. Its biggest military challenge, however, was probably North Vietnam’s continual infiltration of personnel and supplies on what was called the Ho Chi Minh Trail, which ran from north to south along the Cambodian and Laotian borders. The Trail was, in fact, a network of easily repaired dirt roads and footpaths, streams and rivers, lying under a thick jungle canopy that made it almost impossible to detect movement from the air.

The U.S. response, developed by Jason in 1966 and deployed the following year, was an attempt to interdict that infiltration by creating an automated battlefield composed of four parts, analogous to a human body’s eyes, nerves, brain, and limbs. The eyes were a broad variety of sensors — acoustic, seismic, even chemical (for sensing human urine) — most dropped by air into the jungle. The nerve equivalents transmitted signals to the “brain.” However, since the sensors had a maximum transmission range of only about 20 miles, the U.S. military had to constantly fly aircraft above the foliage to catch any signal that might be tripped by passing North Vietnamese troops or transports. The planes would then relay the news to the brain. (Originally intended to be remote controlled, those aircraft performed so poorly that human pilots were usually necessary.)


And that brain, a magnificent military installation secretly built in Thailand’s Nakhon Phanom, housed two state-of-the-art IBM mainframe computers. A small army of programmers wrote and rewrote the code to keep them ticking, as they attempted to make sense of the stream of data transmitted by those planes. The target coordinates they came up with were then transmitted to attack aircraft, which were the limb equivalents. The group running that automated battlefield was designated Task Force Alpha and the whole project went under the code name Igloo White.

As it turned out, Igloo White was largely an expensive failure, costing about a billion dollars a year for five years (almost $40 billion total in today’s dollars). The time lag between a sensor tripping and munitions dropping made the system ineffective. As a result, at times Task Force Alpha simply carpet-bombed areas where a single sensor might have gone off. The North Vietnamese quickly realized how those sensors worked and developed methods of fooling them, from playing truck-ignition recordings to planting buckets of urine.

Given the history of semi-automated weapons systems like drones and “smart bombs” in the intervening years, you probably won’t be surprised to learn that this first automated battlefield couldn’t discriminate between soldiers and civilians. In this, they merely continued a trend that’s existed since at least the eighteenth century in which wars routinely kill more civilians than combatants.

None of these shortcomings kept Defense Department officials from regarding the automated battlefield with awe. Andrew Cockburn described this worshipful posture in his book Kill Chain: The Rise of the High-Tech Assassins, quoting Leonard Sullivan, a high-ranking Pentagon official who visited Vietnam in 1968: “Just as it is almost impossible to be an agnostic in the Cathedral of Notre Dame, so it is difficult to keep from being swept up in the beauty and majesty of the Task Force Alpha temple.”

Who or what, you well might wonder, was to be worshipped in such a temple?

Most aspects of that Vietnam-era “automated” battlefield actually required human intervention. Human beings were planting the sensors, programming the computers, piloting the airplanes, and releasing the bombs. In what sense, then, was that battlefield “automated”? As a harbinger of what was to come, the system had eliminated human intervention at a single crucial point in the process: the decision to kill. On that automated battlefield, the computers decided where and when to drop the bombs.

In 1969, Army Chief of Staff William Westmoreland expressed his enthusiasm for this removal of the messy human element from war-making. Addressing a luncheon for the Association of the U.S. Army, a lobbying group, he declared:

“On the battlefield of the future enemy forces will be located, tracked, and targeted almost instantaneously through the use of data links, computer-assisted intelligence evaluation, and automated fire control. With first round kill probabilities approaching certainty, and with surveillance devices that can continually track the enemy, the need for large forces to fix the opposition will be less important.”

What Westmoreland meant by "fix the opposition" was kill the enemy. The preferred military euphemism in the twenty-first century is "engage." In either case, the meaning is the same: the role of lethal autonomous weapons systems is to find and kill human beings automatically, without human intervention.

New LAWS for a New Age — Lethal Autonomous Weapons Systems

Every autumn, the British Broadcasting Corporation sponsors a series of four lectures given by an expert in some important field of study. In 2021, the BBC invited Stuart Russell, professor of computer science and founder of the Center for Human-Compatible Artificial Intelligence at the University of California, Berkeley, to deliver those “Reith Lectures.” His general subject was the future of artificial intelligence (AI), and the second lecture was entitled “The Future Role of AI in Warfare.” In it, he addressed the issue of lethal autonomous weapons systems, or LAWS, which the United Nations defines as “weapons that locate, select, and engage human targets without human supervision.”

Russell’s main point, eloquently made, was that, although many people believe lethal autonomous weapons are a potential future nightmare, residing in the realm of science fiction, “They are not. You can buy them today. They are advertised on the web.”

I've never seen any of the movies in the Terminator franchise, but apparently military planners and their PR flacks assume most people derive their understanding of such LAWS from this fictional dystopian world. Pentagon officials are frequently at pains to explain why the weapons they are developing are not, in fact, real-life equivalents of SkyNet — the worldwide communications network that, in those films, becomes self-aware and decides to eliminate humankind. Not to worry, as a deputy secretary of defense told Russell, "We have listened carefully to these arguments and my experts have assured me that there is no risk of accidentally creating SkyNet."

Russell’s point, however, was that a weapons system doesn’t need self-awareness to act autonomously or to present a threat to innocent human beings. What it does need is:

  • A mobile platform (anything that can move, from a tiny quadcopter to a fixed-wing aircraft)
  • Sensory capacity (the ability to detect visual or sound information)
  • The ability to make tactical decisions (the same kind of capacity already found in computer programs that play chess)
  • The ability to “engage,” i.e. kill (which can be as complicated as firing a missile or dropping a bomb, or as rudimentary as committing robot suicide by slamming into a target and exploding)

The reality is that such systems already exist. Indeed, a government-owned weapons company in Turkey recently advertised its Kargu drone — a quadcopter "the size of a dinner plate," as Russell described it, which can carry a kilogram of explosives and is capable of making "anti-personnel autonomous hits" with "targets selected on images and face recognition." The company's site has since been altered to emphasize its adherence to a supposed "man-in-the-loop" principle. However, the U.N. has reported that a fully autonomous Kargu-2 was, in fact, deployed in Libya in 2020.

You can buy your own quadcopter right now on Amazon, although you’ll still have to apply some DIY computer skills if you want to get it to operate autonomously.

The truth is that lethal autonomous weapons systems are less likely to look like something from the Terminator movies than like swarms of tiny killer bots. Computer miniaturization means that the technology already exists to create effective LAWS. If your smart phone could fly, it could be an autonomous weapon. Newer phones use facial recognition software to “decide” whether to allow access. It’s not a leap to create flying weapons the size of phones, programmed to “decide” to attack specific individuals, or individuals with specific features. Indeed, it’s likely such weapons already exist.

Can We Outlaw LAWS?

So, what’s wrong with LAWS, and is there any point in trying to outlaw them? Some opponents argue that the problem is they eliminate human responsibility for making lethal decisions. Such critics suggest that, unlike a human being aiming and pulling the trigger of a rifle, a LAWS can choose and fire at its own targets. Therein, they argue, lies the special danger of these systems, which will inevitably make mistakes, as anyone whose iPhone has refused to recognize his or her face will acknowledge.

In my view, the issue isn’t that autonomous systems remove human beings from lethal decisions. To the extent that weapons of this sort make mistakes, human beings will still bear moral responsibility for deploying such imperfect lethal systems. LAWS are designed and deployed by human beings, who therefore remain responsible for their effects. Like the semi-autonomous drones of the present moment (often piloted from half a world away), lethal autonomous weapons systems don’t remove human moral responsibility. They just increase the distance between killer and target.

Furthermore, like already outlawed arms, including chemical and biological weapons, these systems have the capacity to kill indiscriminately. While they may not obviate human responsibility, once activated, they will certainly elude human control, just like poison gas or a weaponized virus.

And as with chemical, biological, and nuclear weapons, their use could effectively be prevented by international law and treaties. True, rogue actors, like the Assad regime in Syria or the U.S. military in the Iraqi city of Fallujah, may occasionally violate such strictures, but for the most part, prohibitions on the use of certain kinds of potentially devastating weaponry have held, in some cases for over a century.

Some American defense experts argue that, since adversaries will inevitably develop LAWS, common sense requires this country to do the same, implying that the best defense against a given weapons system is an identical one. That makes as much sense as fighting fire with fire when, in most cases, using water is much the better option.

The Convention on Certain Conventional Weapons

The area of international law that governs the treatment of human beings in war is, for historical reasons, called international humanitarian law (IHL). In 1995, the United States ratified an addition to IHL: the 1980 U.N. Convention on Certain Conventional Weapons. (Its full title is much longer, but its name is generally abbreviated as CCW.) It governs the use of, for example, incendiary weapons like napalm, as well as landmines, booby traps, and blinding lasers.

The signatories to CCW meet periodically to discuss what other weaponry might fall under its jurisdiction and prohibitions, including LAWS. The most recent conference took place in December 2021. Although transcripts of the proceedings exist, only a draft final document — produced before the conference opened — has been issued. This may be because no consensus was even reached on how to define such systems, let alone on whether they should be prohibited. The European Union, the U.N., at least 50 signatory nations, and (according to polls) most of the world's population believe that autonomous weapons systems should be outlawed. The U.S., Israel, the United Kingdom, and Russia disagree, along with a few other outliers.

Prior to such CCW meetings, a Group of Governmental Experts (GGE) convenes, ostensibly to provide technical guidance for the decisions to be made by the Convention's "high contracting parties." In 2021, the GGE was unable to reach a consensus about whether such weaponry should be outlawed. The United States held that even defining a lethal autonomous weapon was unnecessary (perhaps because if such weapons could be defined, they could be outlawed). The U.S. delegation put it this way:

“The United States has explained our perspective that a working definition should not be drafted with a view toward describing weapons that should be banned. This would be — as some colleagues have already noted — very difficult to reach consensus on, and counterproductive. Because there is nothing intrinsic in autonomous capabilities that would make a weapon prohibited under IHL, we are not convinced that prohibiting weapons based on degrees of autonomy, as our French colleagues have suggested, is a useful approach.”

The U.S. delegation was similarly keen to eliminate any language that might require “human control” of such weapons systems:

“[In] our view IHL does not establish a requirement for ‘human control’ as such… Introducing new and vague requirements like that of human control could, we believe, confuse, rather than clarify, especially if these proposals are inconsistent with long-standing, accepted practice in using many common weapons systems with autonomous functions.”

In the same meeting, that delegation repeatedly insisted that lethal autonomous weapons would actually be good for us, because they would surely prove better than human beings at distinguishing between civilians and combatants.

Oh, and if you believe that protecting civilians is the reason the arms industry is investing billions of dollars in developing autonomous weapons, I’ve got a patch of land to sell you on Mars that’s going cheap.

The Campaign to Stop Killer Robots

The Group of Governmental Experts also has about 35 non-state members, including non-governmental organizations and universities. The Campaign to Stop Killer Robots, a coalition of 180 organizations, among them Amnesty International, Human Rights Watch, and the World Council of Churches, is one of these. Launched in 2013, this vibrant group provides important commentary on the technical, legal, and ethical issues presented by LAWS and offers other organizations and individuals a way to become involved in the fight to outlaw such potentially devastating weapons systems.

The continued construction and deployment of killer robots is not inevitable. Indeed, a majority of the world would like to see them prohibited, including U.N. Secretary General Antonio Guterres. Let’s give him the last word: “Machines with the power and discretion to take human lives without human involvement are politically unacceptable, morally repugnant, and should be prohibited by international law.”

I couldn’t agree more.

Copyright 2022 Rebecca Gordon

Via Tomdispatch.com

The "Graveyard Shift" in a Pandemic World: The Real Meaning of Supply-Chain Woes https://www.juancole.com/2021/12/graveyard-pandemic-meaning.html Wed, 01 Dec 2021 05:02:05 +0000 https://www.juancole.com/?p=201543

( Tomdispatch.com) – In mid-October, President Biden announced that the Port of Los Angeles would begin operating 24 hours a day, seven days a week, joining the nearby Port of Long Beach, which had been doing so since September. The move followed weeks of White House negotiations with the International Longshore and Warehouse Union, as well as shippers like UPS and FedEx, and major retailers like Walmart and Target.

The purpose of expanding port hours, according to the New York Times, was "to relieve growing backlogs in the global supply chains that deliver critical goods to the United States." Reading this, you might be forgiven for imagining that an array of crucial items like medicines or their ingredients or face masks and other personal protective equipment had been languishing in shipping containers aboard ships anchored off the West Coast. You might also be forgiven for imagining that workers, too lazy for the moment at hand, had chosen a good night's sleep over the vital business of unloading such goods from boats lined up in their dozens offshore onto trucks, and getting them into the hands of the Americans desperately in need of them. Reading further, however, you'd learn that those "critical goods" are actually things like "exercise bikes, laptops, toys, [and] patio furniture."

Fair enough. After all, as my city, San Francisco, enters what’s likely to be yet another almost rainless winter on a planet in ever more trouble, I can imagine my desire for patio furniture rising to a critical level. So, I’m relieved to know that dock workers will now be laboring through the night at the command of the president of the United States to guarantee that my needs are met. To be sure, shortages of at least somewhat more important items are indeed rising, including disposable diapers and the aluminum necessary for packaging some pharmaceuticals. Still, a major focus in the media has been on the specter of “slim pickings this Christmas and Hanukkah.”

Providing "critical" yard furnishings is not the only reason the administration needs to unkink the supply chain. It's also considered an anti-inflation measure (if an ineffective one). In October, the Consumer Price Index was up 6.2% over the same month in 2020, the highest inflation rate in three decades. Such a rise is often described as the result of too much money chasing too few goods. One explanation for the current rise in prices is that, during the worst months of the pandemic, many Americans actually saved money, which they're now eager to spend. When the things people want to buy are in short supply — perhaps even stuck on container ships off Long Beach and Los Angeles — the price of those that are available naturally rises.

Republicans have christened the current jump in the Consumer Price Index "Bidenflation," although the administration actually bears little responsibility for the situation. But Joe Biden and the rest of the Democrats know one thing: if it looks like they're doing nothing to bring prices down, there will be hell to pay at the polls in 2022. And so it's the night shift for dock workers and others in Los Angeles, Long Beach, and possibly other American ports.

However, running West Coast ports 24/7 won’t solve the supply-chain problem, not when there aren’t enough truckers to carry that critical patio furniture to Home Depot. The shortage of such drivers arises because there’s more demand than ever before, and because many truckers have simply quit the industry. As the New York Times reports, “Long hours and uncomfortable working conditions are leading to a shortage of truck drivers, which has compounded shipping delays in the United States.”

Rethinking (Shift) Work

Truckers aren’t the only workers who have been rethinking their occupations since the coronavirus pandemic pressed the global pause button. The number of employees quitting their jobs hit 4.4 million this September, about 3% of the U.S. workforce. Resignations were highest in industries like hospitality and medicine, where employees are most at risk of Covid-19 exposure.

For the first time in many decades, workers are in the driver’s seat. They can command higher wages and demand better working conditions. And that’s exactly what they’re doing at workplaces ranging from agricultural equipment manufacturer John Deere to breakfast-cereal makers Kellogg and Nabisco. I’ve even been witnessing it in my personal labor niche, part-time university faculty members (of which I’m one). So allow me to pause here for a shout-out to the 6,500 part-time professors in the University of California system: Thank you! Your threat of a two-day strike won a new contract with a 30% pay raise over the next five years!

This brings me to Biden’s October announcement about those ports going 24/7. In addition to demanding higher pay, better conditions, and an end to two-tier compensation systems (in which laborers hired later don’t get the pay and benefits available to those already on the job), workers are now in a position to reexamine and, in many cases, reject the shift-work system itself. And they have good reason to do so.

So, what is shift work? It’s a system that allows a business to run continuously, ceaselessly turning out and/or transporting widgets year after year. Workers typically labor in eight-hour shifts: 8:00 a.m. to 4:00 p.m., 4:00 p.m. to midnight, and midnight to 8:00 a.m., or the like. In times of labor shortages, they can even be forced to work double shifts, 16 hours in total. Businesses love shift work because it reduces time (and money) lost to powering machinery up and down. And if time is money, then more time worked means more profit for corporations. In many industries, shift work is good for business. But for workers, it’s often another story.

The Graveyard Shift

Each shift in a 24-hour schedule has its own name. The day shift is the obvious one. The swing shift takes you from the day shift to the all-night, or graveyard, shift. According to folk etymology, that shift got its name because, once upon a time, cemetery workers were supposed to stay up all night listening for bells rung by unfortunates who awakened to discover they’d been buried alive. While it’s true that some coffins in England were once fitted with such bells, the term was more likely a reference to the eerie quiet of the world outside the workplace during the hours when most people are asleep.

I can personally attest to the strangeness of life on the graveyard shift. I once worked in an ice cream cone factory. Day and night, noisy, smoky machines resembling small Ferris wheels carried metal molds around and around, while jets of flame cooked the cones inside them. After a rotation, each mold would tip, releasing four cones onto a conveyor belt, rows of which would then approach my station relentlessly. I’d scoop up a stack of 25, twirl them around in a quick check for holes, and place them in a tall box.


Almost simultaneously, I’d make cardboard dividers, scoop up three more of those stacks and seal them, well-divided, in that box, which I then inserted in an even larger cardboard carton and rushed to a giant mechanical stapler. There, I pressed it against a switch, and — boom-ba-da-boom — six large staples would seal it shut, leaving me just enough time to put that carton atop a pallet of them before racing back to my machine, as new columns of just-baked cones piled up, threatening to overwhelm my worktable.

The only time you stopped scooping and boxing was when a relief worker arrived, so you could take a brief break or gobble down your lunch. You rarely talked to your fellow workers, because there was only one "relief" packer, so only one person at a time could be on break. Health regulations made it illegal to drink water on the line, and management was too cheap to buy screens for the windows, which remained shut even when it was more than 100 degrees outside.

They didn’t like me very much at the Maryland Pacific Cone Company, maybe because I wanted to know why the high school boys who swept the floors made more than the women who, since the end of World War II, had been climbing three rickety flights of stairs to stand by those machines. In any case, management there started messing with my shifts, assigning me to all three in the same week. As you might imagine, I wasn’t sleeping a whole lot and would occasionally resort to those “little white pills” immortalized in the truckers’ song “Six Days on the Road.”

But I’ll never forget one graveyard shift when an angel named Rosie saved my job and my sanity. It was probably three in the morning. I’d been standing under fluorescent lights, scooping, twirling, and boxing for hours when the universe suddenly stood still. I realized at that moment that I’d never done anything else since the beginning of time but put ice cream cones in boxes and would never stop doing so until the end of time.

If time lost its meaning then, dimensions still turned out to matter a lot, because the cones I was working on that night were bigger than I was used to. Soon I was falling behind, while a huge mound of 40-ounce Eat-It-Alls covered my table and began to spill onto the floor. I stared at them, frozen, until I suddenly became aware that someone was standing at my elbow, gently pushing me out of the way.

Rosie, who had been in that plant since the end of World War II, said quietly, “Let me do this. You take my line.” In less than a minute, she had it all under control, while I spent the rest of the night at her machine, with cones of a size I could handle.

I have never been so glad to see the dawn.

The Deadly Reality of the Graveyard Shift

So, when the president of the United States negotiated to get dock workers in Los Angeles to work all night, I felt a twinge of horror. There's another all-too-literal reason to call it the "graveyard" shift. It turns out that working when you should be in bed is dangerous. Not only do more accidents occur when the human body expects to be asleep, but the long-term effects of night work can be devastating. As the Centers for Disease Control and Prevention's National Institute for Occupational Safety and Health (NIOSH) reports, the many adverse effects of night work include:

“type 2 diabetes, heart disease, stroke, metabolic disorders, and sleep disorders. Night shift workers might also have an increased risk for reproductive issues, such as irregular menstrual cycles, miscarriage, and preterm birth. Digestive problems and some psychological issues, such as stress and depression, are more common among night shift workers. The fatigue associated with nightshift can lead to injuries, vehicle crashes, and industrial disasters.”

Some studies have shown that such shift work can also lead to decreased bone-mineral density and so to osteoporosis. There is, in fact, a catchall term for all these problems: shift-work disorder.

In addition, studies directly link the graveyard shift to an increased incidence of several kinds of cancer, including breast and prostate cancer. Why would disrupted sleep rhythms cause cancer? Because such disruptions affect the release of the hormone melatonin. Most of the body’s cells contain little “molecular clocks” that respond to daily alternations of light and darkness. When the light dims at night, the pineal gland releases melatonin, which promotes sleep. In fact, many people take it in pill form as a “natural” sleep aid. Under normal circumstances, such a melatonin release continues until the body encounters light again in the morning.

When this daily (circadian) rhythm is disrupted, however, so is the regular production of melatonin, which turns out to have another important biological function. According to NIOSH, it “can also stop tumor growth and protect against the spread of cancer cells.” Unfortunately, if your job requires you to stay up all night, it won’t do this as effectively.

There’s a section on the NIOSH website that asks, “What can night shift workers do to stay healthy?” The answers are not particularly satisfying. They include regular checkups and seeing your doctor if you have any of a variety of symptoms, including “severe fatigue or sleepiness when you need to be awake, trouble with sleep, stomach or intestinal disturbances, irritability or bad mood, poor performance (frequent mistakes, injuries, vehicle crashes, near misses, etc.), unexplained weight gain or loss.”

Unfortunately, even if you have access to healthcare, your doctor can’t write you a prescription to cure shift-work disorder. The cure is to stop working when your body should be asleep.

An End to Shift Work?

Your doctor can’t solve your shift work issue because, ultimately, it’s not an individual problem. It’s an economic and an ethical one.

There will always be some work that must be performed while most people are sleeping, in healthcare, security, and emergency services, among other fields. But most shift work gets done not because life depends upon it, but because we've been taught to expect our patio furniture on demand. As long as advertising and the grow-or-die logic of capitalism keep stoking the desire for objects we don't really need, may not even really want, and will sooner or later toss on a garbage pile in this or some other country, truckers and warehouse workers will keep damaging their health.

Perhaps the pandemic, with its kinky supply chain, has given us an opportunity to rethink which goods are so “critical” that we’re willing to let other people risk their lives to provide them for us. Unfortunately, such a global rethink hasn’t yet touched Joe Biden and his administration as they confront an ongoing pandemic, supply-chain problems, a rise in inflation, and — oh yes! — an existential climate crisis that gets worse with every plastic widget produced, packed, and shipped.

It’s time for Biden — and the rest of us — to take a breath and think this through. There are good reasons that so many people are walking away from underpaid, life-threatening work. Many of them are reconsidering the nature of work itself and its place in their lives, no matter what the president or anyone else might wish.

And that’s a paradigm shift we all could learn to live with.

Copyright 2021 Rebecca Gordon

Featured image: Port of Los Angeles sunrise by pete is licensed under CC BY 2.0 / Flickr

Via Tomdispatch.com
