Rebecca Gordon – Informed Comment (https://www.juancole.com)

Celebrating Rep. Barbara Lee, our Cassandra who warned against the Failed 20-Year Wars
https://www.juancole.com/2021/10/celebrating-barbara-cassandra.html (October 18, 2021)

( Tomdispatch.com ) – For decades, I kept a poster on my wall that I’d saved from the year I turned 16. In its upper left-hand corner was a black-and-white photo of a white man in a grey suit. Before him spread a cobblestone plaza. All you could see were the man and the stones. Its caption read, “He stood up alone and something happened.”

It was 1968. “He” was Minnesota Senator Eugene McCarthy. As that campaign slogan suggested, his strong second-place showing in the New Hampshire primary was proof that opposition to the Vietnam War had finally become a viable platform for a Democratic candidate for president. I volunteered in McCarthy’s campaign office that year. My memory of my duties is now vague, but they mainly involved alphabetizing and filing index cards containing information about the senator’s supporters. (Remember, this was the age before there was a computer in every pocket, let alone social media and micro-targeting.)

Running against the Vietnam War, McCarthy was challenging then-President Lyndon Johnson in the Democratic primaries. After McCarthy’s strong second-place showing in New Hampshire, New York Senator Robert F. Kennedy entered the race, too, running against the very war his brother, President John F. Kennedy, had bequeathed to Johnson when he was assassinated in 1963. Soon, Johnson would withdraw from the campaign, announcing in a televised national address that he wouldn’t run for another term.

With his good looks and family name, Bobby Kennedy appeared to have a real chance for the nomination when, on June 5, 1968, during a campaign event in Los Angeles, he, like his brother, was assassinated. That left the war’s opponents without a viable candidate for the nomination. Outside the Democratic Party convention in Chicago that August, tens of thousands of angry, mostly young Americans demonstrated their frustration with the war and the party’s refusal to take a stand against it. In what was generally recognized as a police riot, the Chicago PD beat protesters and journalists bloody on national TV, as participants chanted, “The whole world is watching.” And indeed, it was.

In the end, the nomination went to Johnson’s vice president and war supporter Hubert Humphrey, who would face Republican hawk Richard Nixon that November. The war’s opponents watched in frustration as the two major parties closed ranks, cementing their post-World-War-II bipartisan agreement to use military power to enforce U.S. global dominance.

Cassandra Foresees the Future

Of course, the McCarthy campaign’s slogan was wrong on two counts. He didn’t stand up alone. Millions of us around the world were then working to end the war in Vietnam. Sadly, nothing conclusive happened as a result of his campaign. Nixon went on to win the 1968 general election and the Vietnam War dragged on to an ignominious U.S. defeat seven years later.

Nineteen sixty-eight was also the year my high school put on Tiger at the Gates, French playwright Jean Giraudoux’s antiwar drama about the run-up to the Trojan War. Giraudoux chronicled that ancient conflict’s painful inevitability, despite the fervent desire of Troy’s rulers and its people to prevent it. The play opens as Andromache, wife of the doomed Trojan warrior Hector, tells her sister-in-law Cassandra, “There’s not going to be a Trojan war.”

Cassandra, you may remember, bore a double curse from the gods: yes, she could see into the future, but no one would believe her predictions. She informs Andromache that she’s wrong; that, like a tiger pacing outside the city’s walls, war with all its bloody pain is preparing to spring. And, of course, she’s right. Part of the play’s message is that Cassandra doesn’t need her supernatural gift to predict the future. She can guess what will happen simply because she understands the relentless forces driving her city to war: the poets who need tragedies to chronicle; the would-be heroes who desire glory; the rulers caught in the inertia of tradition.

Although Tiger was written in the 1930s, between the two world wars, it could just as easily have appeared in 1968. Substitute the mass media for the poets; the military-industrial complex for the Greek and Trojan warriors; and administration after administration for the city’s rulers, and you have a striking representation of the quicksand war that dragged 58,000 U.S. soldiers and millions of Vietnamese, Laotians, and Cambodians to their deaths. And in some sense, we — the antiwar forces in this country — foresaw it all (in broad outline, if not specific detail): the assassinations, carpet bombings, tiger cages, and the CIA’s first mass assassination and torture scheme, the Phoenix Program. Of course we couldn’t predict the specifics. Indeed, some turned out worse than we’d feared. In any case, our foresight did us no more good than Cassandra’s did her.

Rehabilitations and Revisions

It’s just over a month since the 20th anniversary of the 9/11 attacks and the start of the “Global War on Terror.” The press has been full of recollections and rehabilitations. George W. Bush used the occasion to warn the nation (as if we needed it at that point) about the dangers of what CNN referred to as “domestic violent extremists.” He called them “children of the same foul spirit” as the one that engenders international terrorism. He also inveighed against the January 6th Capitol invasion:

“‘This is how election results are disputed in a banana republic — not our democratic republic,’ he said in a statement at the time, adding that he was ‘appalled by the reckless behavior of some political leaders since the election.’”

You might almost think he’d forgotten that neither should elections in a democracy be “disputed” by three-piece-suited thugs shutting down a ballot count — as happened in Florida during his own first election in 2000. Future Trump operative Roger Stone has claimed credit for orchestrating that so-called Brooks Brothers Rebellion, which stopped the Florida vote count and threw the election to the Supreme Court and, in the end, to George W. Bush.



You might also think that, with plenty of shoving from his vice president Dick Cheney and a cabal of leftover neocons from the Project for a New American Century, Bush had never led this country into two devastating, murderous, profoundly wasteful wars. You might think we’d never seen the resumption of institutionalized CIA- and military-run state torture on a massive scale under his rule, or his administration’s refusal to join the International Criminal Court.

And finally, you might think that nobody saw all this coming, that there were no Cassandras in this country in 2001. But there you would be wrong. All too many of us sensed just what was coming as soon as the bombing and invasion of Afghanistan began. I knew, for example, as early as November 2001, when the first mainstream article extolling the utility of torture appeared, that whatever else the U.S. response to the 9/11 attacks would entail, organized torture would be part of it. As early as December 2002, we all could have known that. That’s when the first articles began appearing in the Washington Post about the “stress and duress” techniques the CIA was already beginning to use at Bagram Air Base in Afghanistan. Some of the hapless victims would later turn out to have been sold to U.S. forces for bounties by local strongmen.

It takes very little courage for a superannuated graduate student (as I was in 2001) to write academic papers about U.S. torture practices (as I did) and the stupidity and illegality of our invasion of Afghanistan. It’s another thing, however, when a real Cassandra stands up — all alone — and tries to stop something from happening.

I’m talking, of course, about Representative Barbara Lee, the only member of Congress to vote against granting the president the power to “use all necessary and appropriate force against those nations, organizations, or persons he determines planned, authorized, committed, or aided the terrorist attacks that occurred on September 11, 2001, or harbored such organizations or persons.” It was this Authorization for Use of Military Force, or AUMF, passed in September 2001, that provided the legal grounds for the U.S. invasion of Afghanistan the following month. Lee was right when, after agonizing about her vote, she decided to follow the counsel of the dean of the National Cathedral, the Reverend Nathan Baxter. That very morning, she had heard him pray that, in response to the terrible crimes of 9/11, we not “become the evil we deplore.”

How right she was when she said on the House floor:

“However difficult this vote may be, some of us must urge the use of restraint. Our country is in a state of mourning. Some of us must say, ‘Let’s step back for a moment, let’s just pause, just for a minute, and think through the implications of our actions today, so that this does not spiral out of control.’”

The legislation she opposed that day would indeed allow “this” to spiral out of control. That same AUMF has since been used to justify an ever-metastasizing series of wars and conflicts that spread from Afghanistan in central Asia through the Middle East, south to Yemen, and leapt to Libya, Somalia, and other lands in Africa. Despite multiple attempts to repeal it, that same minimalist AUMF remains in effect today, ready for use by the next president with aspirations to military adventures. In June 2021, the House of Representatives did finally pass a bill, sponsored by Barbara Lee herself, rescinding the companion 2002 authorization for the war in Iraq. At present, however, that bill languishes in the Senate’s Committee on Foreign Relations.

In the days after 9/11, Lee was roundly excoriated for her vote. The Wall Street Journal called her a “clueless liberal,” while the Washington Times wrote that she was “a long-practicing supporter of America’s enemies.” Curiously, both those editorials were headlined with the question, “Who Is Barbara Lee?” (Those of us in the San Francisco Bay Area could have answered that. Lee was — and remains — an African American congressional representative from Oakland, California, the inheritor of the seat and mantle of another great black congressional representative, Ron Dellums.) She received mountains of hate mail then and enough death threats to force her to seek police protection.

Like George W. Bush, Lee received some media rehabilitation in various 20th anniversary retrospectives of 9/11. In her case, however, it was well-deserved. The Washington Post, for instance, praised her for her courage, noting that no one — not Bernie Sanders, not Joe Biden — shared her vision, or, I would add, shared Cassandra’s curse with her. Like the character in Tiger at the Gates, Lee didn’t need a divine gift to foresee that the U.S. “war on terror” would spin disastrously out of control. A little historical memory might have served the rest of the country well, reminding us of what happened the last time the United States fought an ever-escalating war.

Cassandras and Their Mirror Images

It was clear from the start that Vice President Dick Cheney and Secretary of Defense Donald Rumsfeld were never that interested in Afghanistan (although that was no solace to the many thousands of Afghans who were bombed, beaten, and tortured). Those officials had another target in mind — Iraq — almost literally from the moment al-Qaeda’s hijacked planes struck New York and Washington.

In 2002, after months of lies about Iraqi leader Saddam Hussein’s possession of (nonexistent) weapons of mass destruction (WMD) and his supposed pursuit of a nuclear bomb, the Bush administration got its second AUMF, authorizing “the President to use the U.S. armed forces to: …defend U.S. national security against the continuing threat posed by Iraq,” functionally condoning the U.S. invasion of his country. This time, Barbara Lee was not alone in her opposition. In the House, she was joined by 125 other Democrats, six Republicans, and one independent (Bernie Sanders). Only 23 senators, however, voted “nay,” including Rhode Island Republican Lincoln Chafee and Vermont independent Jim Jeffords.

In the run-up to the March 2003 invasion, figures who might be thought of as “anti-Cassandras” took center stage. Unlike the Trojan seer, these unfortunates were apparently doomed to tell falsehoods — and be believed. Among them was Condoleezza Rice, President Bush’s national security advisor, who, when pressed for evidence that Saddam Hussein actually possessed WMD, told CNN’s Wolf Blitzer that “we don’t want the smoking gun to be a mushroom cloud,” implying Iraq represented a nuclear threat to this country.

Then there was Secretary of State Colin Powell, who put the case for war to the United Nations Security Council in February 2003, emphasizing the supposedly factual basis of everything he presented:

“My colleagues, every statement I make today is backed up by sources, solid sources. These are not assertions. What we’re giving you are facts and conclusions based on solid intelligence.”

It wasn’t true, of course, but around the world, many believed him.

And let’s not leave the mainstream press out of this. There’s plenty of blame to go around, but perhaps the anti-Cassandra crown should go to the New York Times for its promotion of Bush administration war propaganda, especially by its reporter Judith Miller. In 2004, the Times published an extraordinary mea culpa, an apologetic note “from the editors” that said,

“[W]e have found a number of instances of coverage that was not as rigorous as it should have been. In some cases, information that was controversial then, and seems questionable now, was insufficiently qualified or allowed to stand unchallenged. Looking back, we wish we had been more aggressive in re-examining the claims as new evidence emerged — or failed to emerge.”

I suspect the people of Iraq might share the Times’s wish.

There was, of course, one other group of prophets who accurately foresaw the horrors that a U.S. invasion would bring with it: the millions who filled the streets of their cities here and around the world, demanding that the United States stay its hand. So powerful was their witness that they were briefly dubbed “the other superpower.” Writing in the Nation, Jonathan Schell extolled their strength, saying that this country’s “shock and awe” assault on Iraq “has found its riposte in courage and wonder.” Alas, that mass witness in those streets was not enough to forestall one more murderous assault by what would, in the long run, prove to be a dying empire.

Cassandra at the Gates (of Glasgow)

And now, the world is finally waking up to an even greater disaster: the climate emergency that’s burning up my part of the world, the American West, and drowning others. This crisis has had its Cassandras, too. One of these was 89-year-old John Rogalsky, who worked for 35 years as a meteorologist in the federal government. As early as 1963, he became aware of the problem of climate change and began trying to warn us. In 2017, he told the Canadian Broadcasting Corporation:

“[B]y the time the end of the 60s had arrived, I was absolutely convinced that it was real, it was just a question of how rapidly it would happen and how difficult it would become for the world at large, and how soon before people, or governments would even listen to the science. People I talked to about this, I was letting them know, this is happening, get ready.”

This November, the 197 nations that have signed up to the United Nations Framework Convention on Climate Change will meet in Glasgow, Scotland, at the 2021 United Nations Climate Change Conference, with Italy and the United Kingdom serving as co-hosts. We must hope that this follow-up to the 2015 Paris agreement will produce concrete steps to reverse the overheating of this planet and mitigate its effects, especially in those nations that have contributed the least to the problem and are already suffering disproportionately.

I hope it’s a good sign that at a pre-Glasgow summit in Milan, Italy’s Prime Minister Mario Draghi met with three young “Cassandras” — climate activists Greta Thunberg (Sweden), Vanessa Nakate (Uganda), and Martina Comparelli (Italy) — after Thunberg’s now famous “blah, blah, blah” speech, accusing world leaders of empty talk. “Your pressure, frankly, is very welcome,” Draghi told them. “We need to be whipped into action. Your mobilization has been powerful, and rest assured, we are listening.”

For the sake of the world, let us hope that this time Cassandra will be believed.

Copyright 2021 Rebecca Gordon

Via Tomdispatch.com

Debt and Disillusionment: Neoliberalism’s Unfair Burden on This Generation of College Students
https://www.juancole.com/2021/08/disillusionment-neoliberalisms-generation.html (August 18, 2021)

( Tomdispatch.com ) – For the last decade and a half, I’ve been teaching ethics to undergraduates. Now — admittedly, a little late to the party — I’ve started seriously questioning my own ethics. I’ve begun to wonder just what it means to be a participant, however minor, in the pyramid scheme that higher education has become in the years since I went to college.

Airplane Games

Sometime in the late 1980s, the Airplane Game roared through the San Francisco Bay Area lesbian community. It was a classic pyramid scheme, even if cleverly dressed up in language about women’s natural ability to generate abundance, just as we gestate children in our miraculous wombs. If the connection between feminism and airplanes was a little murky — well, we could always think of ourselves as modern-day Amelia Earharts. (As long as we didn’t think too hard about how she ended up.)

A few women made a lot of money from it — enough, in the case of one friend of mine, for a down payment on a house. Inevitably, a lot more of us lost money, even as some like me stood on the sidelines sadly shaking our heads.

There were four tiers on that “airplane”: a captain, two co-pilots, four crew, and eight passengers — 15 in all to start. You paid $3,000 to get on at the back of the plane as a passenger, so the first captain (the original scammer) got out with $24,000 — $3,000 from each passenger. The co-pilots and crew, who were in on the fix, paid nothing to join. When the first captain “parachuted out,” the game split in two, and each co-pilot became the captain of a new plane. They then pressured their four remaining passengers to recruit enough new women to fill each plane, so they could get their payday, and the two new co-pilots could each captain their own planes.

Unless new people continued to get on at the back of each plane, there would be no payday for the earlier passengers, so the pressure to recruit ever more women into the game only grew. The original scammers ran through the game a couple of times, but inevitably the supply of gullible women willing to invest their savings ran out. By the time the game collapsed, hundreds of women had lost significant amounts of money.
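
The arithmetic behind that collapse is easy to sketch. Here is a minimal illustration in Python; the buy-in and plane structure come from the description above, while the assumption that every plane fills completely, round after round, is mine:

```python
# Minimal sketch of the Airplane Game's arithmetic (assumes every plane fills):
# each captain collects $3,000 from eight passengers, then the plane splits in
# two, so the number of fresh recruits needed doubles every round.
BUY_IN = 3_000
PASSENGERS_PER_PLANE = 8

planes, total_recruited = 1, 0
for round_number in range(1, 9):
    new_passengers = planes * PASSENGERS_PER_PLANE
    total_recruited += new_passengers
    print(f"Round {round_number}: {planes:>3} plane(s) need {new_passengers:>4} "
          f"new recruits (cumulative: {total_recruited})")
    planes *= 2  # each plane splits when its captain "parachutes out" with $24,000

# By round 8 the game needs more than 1,000 fresh recruits in that round alone
# (over 2,000 in all), which is why the supply of willing players inevitably runs out.
```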

No one seemed to know the women who’d brought the game and all those “planes” to the Bay Area, but they had spun a winning story about endless abundance and the glories of women’s energy. After the game collapsed, they took off for another women’s community with their “earnings,” leaving behind a lot of sadder, poorer, and perhaps wiser San Francisco lesbians.

Feasting at the Tenure Trough or Starving in the Ivory Tower?

So, you may be wondering, what could that long-ago scam have to do with my ethical qualms about working as a college instructor? More than you might think.

Let’s start with PhD programs. In 2019, the most recent year for which statistics are available, U.S. colleges and universities churned out about 55,700 doctorates — and such numbers continue to increase by about 1% a year. The average number of doctorates earned over the last decade is almost 53,000 annually. In other words, we’re talking about nearly 530,000 PhDs produced by American higher education in those 10 years alone. Many of them have ended up competing for a far smaller number of jobs in the academic world.

It’s true that most PhDs in science or engineering end up with post-doctoral positions (earning roughly $40,000 a year) or with tenure-track or tenured jobs in colleges and universities (averaging $60,000 annually to start). Better yet, most of them leave their graduate programs with little or no debt.

The situation is far different if your degree wasn’t in STEM (science, technology, engineering, or mathematics) but, for example, in education or the humanities. As a start, far more of those degree-holders graduate owing money, often significant sums, and ever fewer end up teaching in tenure-track positions — in jobs, that is, with security, decent pay, and benefits.

Many of the non-STEM PhDs who stay in academia end up joining an exploited, contingent workforce of part-time, or “adjunct,” professors. That reserve army of the underemployed is higher education’s dirty little secret. After all, we — and yes, I’m one of them — actually teach the majority of the classes in many schools, while earning as little as $1,500 a semester for each of them.

I hate to bring up transportation again, but there’s a reason teachers like us are called “freeway flyers.” A 2014 Congressional report revealed that 89% of us work at more than one institution and 27% at three different schools, just to cobble together the most meager of livings.

Many of us, in fact, rely on public antipoverty programs to keep going. Inside Higher Ed, reflecting on a 2020 report from the American Federation of Teachers, describes our situation this way:

“Nearly 25% of adjunct faculty members rely on public assistance, and 40% struggle to cover basic household expenses, according to a new report from the American Federation of Teachers. Nearly a third of the 3,000 adjuncts surveyed for the report earn less than $25,000 a year. That puts them below the federal poverty guideline for a family of four.”

I’m luckier than most adjuncts. I have a union, and over the years we’ve fought for better pay, healthcare, a pension plan, and a pathway (however limited) to advancement. Now, however, my school’s administration is using the pandemic as an excuse to try to claw back the tiny cost-of-living adjustments we won in 2019.

The Oxford Dictionary of English defines an adjunct as “a thing added to something else as a supplementary rather than an essential part.” Once upon a time, in the middle of the previous century, that’s just what adjunct faculty were — occasional additions to the full-time faculty. Often, they were retired professionals who supplemented a department’s offerings by teaching a single course in their area of expertise, while their salaries were more honoraria than true payments for work performed. Later, as more women entered academia, it became common for a male professor’s wife to teach a course or two, often as part of his employment arrangement with the university. Since her salary was a mere adjunct to his, she was paid accordingly.

Now, the situation has changed radically. In many colleges and universities, adjunct faculty are no longer supplements, but the most “essential part” of the teaching staff. Classes simply couldn’t go on without us; nor, if you believe college administrations, could their budgets be balanced without us. After all, why pay a full-time professor $10,000 to teach a class (since he or she will be earning, on average, $60,000 a year and covering three classes a semester) when you can give a part-timer like me $1,500 for the very same work?

And adjuncts have little choice. The competition for full-time positions is fierce, since every year another 53,000 or more new PhDs climb into the back row of the academic airplane, hoping to make it to the pilot’s seat and secure a tenure-track position.



And here’s another problem with that. These days the people in the pilots’ seats often aren’t parachuting out. They’re staying right where they are. That, in turn, means new PhDs find themselves competing for an ever-shrinking prize, as Laura McKenna has written in the Atlantic, “not only with their own cohort but also with the unemployed PhDs who graduated in previous years.” Many of those now clinging to pilots’ seats are members of my own boomer generation, who still benefit from a 1986 law (signed by then-75-year-old President Ronald Reagan) that outlawed mandatory retirements.

Grade Inflation v. Degree Inflation?

People in the world of education often bemoan the problem of “grade inflation” — the tendency of average grades to creep up over time. Ironically, this problem is exacerbated by the adjunctification of teaching, since adjuncts tend to award higher grades than professors with secure positions. The reason is simple enough: colleges use student evaluations as a major metric for rehiring adjuncts and higher grades translate directly into better evaluations. Grade inflation at the college level is, in my view, a non-issue, at least for students. Employers don’t look at your transcript when they’re hiring you and even graduate schools care more about recommendations and GRE scores.

The real problem faced by today’s young people isn’t grade inflation. It’s degree inflation.

Once upon a time in another America, a high-school diploma was enough to snag you a good job, with a chance to move up as time went on (especially if you were white and male, as the majority of workers were in those days). And you paid no tuition whatsoever for that diploma. In fact, public education through 12th grade is still free, though its quality varies profoundly depending on who you are and where you live.

But all that changed as increasing numbers of employers began requiring a college degree for jobs that don’t by any stretch of the imagination require a college education to perform. The Washington Post reports:

“Among the positions never requiring a college degree in the past that are quickly adding that to the list of desired requirements: dental hygienists, photographers, claims adjusters, freight agents, and chemical equipment operators.”

In 2017, Manjari Raman of the Harvard Business School wrote that

“the degree gap — the discrepancy between the demand for a college degree in job postings and the employees who are currently in that job who have a college degree — is significant. For example, in 2015, 67% of production supervisor job postings asked for a college degree, while only 16% of employed production supervisors had one.”

In other words, even though most people already doing such jobs don’t have a bachelor’s degree, companies are only hiring new people who do. Part of the reason: that requirement automatically eliminates a lot of applicants, reducing the time and effort involved in making hiring decisions. Rather than sifting through résumés for specific skills (like the ability to use certain computer programs or write fluently), employers let a college degree serve as a proxy. The result is not only that they’ll hire people who don’t have the skills they actually need, but that they’re eliminating people who do have the skills but not the degree. You won’t be surprised to learn that those rejected applicants are more likely to be people of color, who are underrepresented among the holders of college degrees.

Similarly, some fields that used to accept a BA now require a graduate degree to perform the same work. For example, the Bureau of Labor Statistics reports that “in 2015–16, about 39% of all occupational therapists ages 25 and older had a bachelor’s degree as their highest level of educational attainment.” Now, however, employers are commonly insisting that new applicants hold at least a master’s degree — and so up the pyramid we continually go (at ever greater cost to those students).

The Biggest Pyramid of All

In a sense, you could say that the whole capitalist economy is the biggest pyramid of them all. For every one of the fascinating, fulfilling, autonomous, and well-paying jobs out there, there are thousands of boring, mind- and body-crushing ones like pulling items for shipment in an Amazon warehouse or folding clothes at Forever 21.

We know, in other words, that there are only a relatively small number of spaces in the cockpit of today’s economic plane. Nonetheless, we tell our young people that the guaranteed way to get one of those rare gigs at the top of the pyramid is a college education.

Now, just stop for a second and consider what it costs to join the 2021 all-American Airplane Game of education. In 1970, when I went to Reed, a small, private, liberal arts college, tuition was $3,000 a year. I was lucky. I had a scholarship (known in modern university jargon as a “tuition discount”) that covered most of my costs. This year, annual tuition at that same school is a mind-boggling $62,420, more than 20 times as high. If college costs had simply risen with inflation, the price would be about $21,000 a year, or roughly a third of what the school actually charges.
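
For what it’s worth, the comparison can be checked with a few lines of Python. The tuition figures are the ones cited above; the roughly sevenfold consumer-price multiplier for 1970 to 2021 is my approximation, not an official statistic:

```python
# Rough check of the tuition comparison above. The ~7x factor is an approximate
# CPI multiplier for 1970-2021 (an assumption); tuition figures are from the text.
CPI_MULTIPLIER_1970_TO_2021 = 7.0
tuition_1970 = 3_000    # Reed College tuition, 1970
tuition_2021 = 62_420   # Reed College tuition, 2021

inflation_adjusted = tuition_1970 * CPI_MULTIPLIER_1970_TO_2021
print(f"Inflation-adjusted 1970 tuition: ${inflation_adjusted:,.0f}")                   # ~$21,000
print(f"Nominal increase since 1970: {tuition_2021 / tuition_1970:.0f}x")               # ~21x
print(f"Increase beyond inflation: {tuition_2021 / inflation_adjusted:.1f}x")           # ~3x
```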

If I’d attended Federal City College (now the University of D.C.), my equivalent of a state school then, tuition would have been free. Now, even state schools cost too much for many students. Annually, tuition at the University of California at Berkeley, the flagship school of that state’s system, is $14,253 for in-state students, and $44,007 for out-of-staters.

I left school owing $800, or about $4,400 in today’s dollars. These days, most financial “aid” resembles foreign “aid” to developing countries — that is, it generally takes the form of loans whose interest piles up so fast that it’s hard to keep up with it, let alone begin to pay off the principal in your post-college life. Some numbers to contemplate: 62% of those graduating with a BA in 2019 did so owing money — owing, in fact, an average of almost $29,000. The average debt of those earning a graduate degree was an even more staggering $71,000. That, of course, is on top of whatever the former students had already shelled out while in school. And that, in turn, is before the “miracle” of compound interest takes hold and that debt starts to grow like a rogue zucchini.

It’s enough to make me wonder whether a seat in the Great American College and University Airplane Game is worth the price, and whether it’s ethical for me to continue serving as an adjunct flight attendant along the way. Whatever we tell students about education being the path to a good job, the truth is that there are remarkably few seats at the front of the plane.

Of course, on the positive side, I do still believe that time spent at college offers students something beyond any price — the opportunity to learn to think deeply and critically, while encountering people very different from themselves. The luckiest students graduate with a lifelong curiosity about the world and some tools to help them satisfy it. That is truly a ticket to a good life — and no one should have to buy a seat in an Airplane Game to get one.

Copyright 2021 Rebecca Gordon


Via Tomdispatch.com

The Fires This Time: A Climate View from California
https://www.juancole.com/2021/07/fires-climate-california.html (July 12, 2021)

( Tomdispatch.com ) – In San Francisco, we’re finally starting to put away our masks. With 74% of the city’s residents over 12 fully vaccinated, for the first time in more than a year we’re enjoying walking, shopping, and eating out, our faces naked. So I was startled when my partner reminded me that we need to buy masks again very soon: N95 masks, that is. The California wildfire season has already begun, earlier than ever, and we’ll need to protect our lungs during the months to come from the fine particulates carried in the wildfire smoke that’s been engulfing this city in recent years.

I was in Reno last September, so I missed the morning when San Franciscans awoke to apocalyptic orange skies, the air freighted with smoke from burning forests elsewhere in the state. The air then was bad enough even in the high mountain valley of Reno. At that point, we’d already experienced “very unhealthy” purple-zone air quality for days. Still, it was nothing like the photos then emerging from the Bay Area, which could have been taken on Mars. I have a bad feeling that I may get my chance to experience the same phenomenon in 2021 — and, as the fires across California have started so much earlier, probably sooner than September.

The situation is pretty dire: this state — along with our neighbors to the north and southeast — is now living through an epic drought. After a dry winter and spring, the fuel-moisture content in our forests (the amount of water in vegetation, living and dead) is way below average. This April, the month when it is usually at its highest, San Jose State University scientists recorded levels a staggering 40% below average in the Santa Cruz Mountains, well below the lowest level ever before observed. In other words, we have never been this dry.

Under the Heat Dome

When it’s hot in most of California, it’s often cold and foggy in San Francisco. Today is no exception. Despite the raging news about heat records, it’s not likely to reach 65 degrees here. So it’s a little surreal to consider what friends and family are going through in the Pacific Northwest under the once-in-thousands-of-years heat dome that’s settled over the region. A heat dome is an area of high pressure surrounded by upper-atmosphere winds that essentially pin it in place. If you remember your high-school physics, you’ll recall that when a gas (for example, the air over the Pacific Northwest) is contained, the ratio between pressure and temperature remains constant. If the temperature goes up, the pressure goes up.

The converse is also true; as the pressure rises, so does the temperature. And that’s what’s been happening over Oregon, Washington, and British Columbia in normally chilly Canada. Mix in the fact that climate change has driven average temperatures in those areas up by three to four degrees since the industrial revolution, and you have a recipe for the disaster that struck the region recently.
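
Put as an equation, this is a minimal restatement of the relation just described, assuming a fixed parcel of air at roughly constant volume, with temperatures measured in kelvins:

$$\frac{P_1}{T_1} = \frac{P_2}{T_2} \qquad\Longrightarrow\qquad T_2 = T_1\,\frac{P_2}{P_1}$$

So as the pressure under the dome climbs from P1 to P2, the temperature of the trapped air climbs along with it.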

And it has indeed been a disaster. The temperature in the tiny town of Lytton, British Columbia, for instance, hit 121 degrees on June 29th, breaking the Canadian heat record for the third time in as many days. (The previous record had stood since 1937.) That was Tuesday. On Wednesday night, the whole town was engulfed in the flames of multiple fires. The fires, in turn, generated huge pyrocumulus clouds that penetrated as high as the stratosphere (a rare event in itself), producing lightning strikes that ignited new fires in a vicious cycle that, in the end, simply destroyed the kilometer-long town.



Heat records have been broken all over the Pacific Northwest. Portland topped records for three days running, culminating with a 116-degree day on June 28th; Seattle hit a high of 108, which the Washington Post reported “was 34 degrees above the normal high of 74 and higher than the all-time heat record in Washington, D.C., among many other cities much farther to its south.”

With the heat comes a rise in “sudden and unexpected” deaths. Hundreds have died in Oregon and Washington and, according to the British Columbia coroner, at least 300 in her province — almost double the average number for that time period.

Class, Race, and Hot Air

It’s hardly a new observation that the people who have benefited least from the causes of climate change — the residents of less industrialized countries and poor people of all nations — are already suffering most from its results. Island nations like the Republic of Palau in the western Pacific are a prime example. Palau faces a number of climate-change challenges, according to the United Nations Development Program, including rising sea levels that threaten to inundate some of its lowest-lying islands, which are just 10 meters above sea level. In addition, encroaching seawater is salinating some of its agricultural land, creating seaside strips that can now grow only salt-tolerant root crops. Meanwhile, despite substantial annual rainfall, saltwater inundation threatens the drinking water supply. And worse yet, Palau is vulnerable to ocean storms that, on our heating planet, are growing ever more frequent and severe.

There are also subtle ways the rising temperatures that go with climate change have differential effects, even on people living in the same city. Take air conditioning. One of the reasons people in the Pacific Northwest suffered so horrendously under the heat dome is that few homes in that region are air conditioned. Until recently, people there had been able to weather the minimal number of very hot days each year without installing expensive cooling machinery.

Obviously, people with more discretionary income will have an easier time investing in air conditioning now that temperatures are rising. What’s less obvious, perhaps, is that its widespread use makes a city hotter — a burden that falls disproportionately on people who can’t afford to install it in the first place. Air conditioning works on a simple principle: it shifts heat from air inside an enclosed space to the outside world, which, in turn, makes that outside air hotter.

A 2014 study of this effect in Phoenix, Arizona, showed that air conditioning raised ambient temperatures by one to two degrees at night — an important finding, because one of the most dangerous aspects of the present heat waves is their lack of night-time cooling. As a result, each day’s heat builds on a higher base, while presenting a greater direct-health threat, since the bodies of those not in air conditioning can’t recover from the exhaustion of the day’s heat at night. In effect, air conditioning not only heats the atmosphere further but shifts the burden of unhealthy heat from those who can afford it to those who can’t.

Just as the coronavirus has disproportionately ravaged black and brown communities (as well as poor nations around the world), climate-change-driven heat waves, according to a recent University of North Carolina study reported by the BBC, mean that “black people living in most U.S. cities are subject to double the level of heat stress as their white counterparts.” This is the result not just of poverty, but of residential segregation, which leaves urban BIPOC (black, indigenous, and other people of color) communities in a city’s worst “heat islands” — the areas containing the most concrete, the most asphalt, and the least vegetation — and which therefore attract and retain the most heat.

“Using satellite temperature data combined with demographic information from the U.S. Census,” the researchers “found that the average person of color lives in an area with far higher summer daytime temperatures than non-Hispanic white people.” They also discovered that, in all but six of the 175 urban areas they studied in the continental U.S., “people of color endure much greater heat impacts in summer.” Furthermore, “for black people this was particularly stark. The researchers say they are exposed to an extra 3.12C [5.6F] of heating, on average, in urban neighborhoods, compared to an extra 1.47C [2.6F] for white people.”

That’s a big difference.

Food, Drink, and Fires — the View from California

Now, let me return to my own home state, California, where conditions remain all too dry and, apart from the coast right now, all too hot. Northern California gets most of its drinking water from the snowpack that builds each year in the Sierra Nevada mountains. In spring, those snows gradually melt, filling the rivers that fill our reservoirs. In May 2021, however, the Sierra snowpack was a devastating six percent of normal!

Stop a moment and take that in, while you try to imagine the future of much of the state — and the crucial crops it grows.

For my own hometown, San Francisco, things aren’t quite that dire. Water levels in Hetch Hetchy, our main reservoir, located in Yosemite National Park, are down from previous years, but not disastrously so. With voluntary water-use reduction, we’re likely to have enough to drink this year at least. Things are a lot less promising, however, in rural California where towns tend to rely on groundwater for domestic use.

Shrinking water supplies don’t just affect individual consumers here in this state, they affect everyone in the United States who eats, because 13.5% of all our agricultural products, including meat and dairy, as well as fruits and vegetables, come from California. Growing food requires prodigious amounts of water. In fact, farmland irrigation accounts for roughly 80% of all water used by businesses and homes in the state.

So how are California’s agricultural water supplies doing this year? The answer, sadly, is not very well. State regulators have already cut distribution to about a quarter of California’s irrigated acreage (about two million acres) by a drastic 95%. That’s right. A full quarter of the state’s farmlands have access to just 5% of what they would ordinarily receive from rivers and aqueducts. As a result, some farmers are turning to groundwater, a more easily exhausted source, which also replenishes itself far more slowly than rivers and streams. Some are even choosing to sell their water to other farmers, rather than use it to grow crops at all, because that makes more economic sense for them. As smaller farms are likely to be the first to fold, the water crisis will only enhance the dominance of major corporations in food production.

Meanwhile, we’ll probably be breaking out our N95 masks soon. Wildfire season has already begun — earlier than ever. On July 1st, the then-still-uncontained Salt fire briefly closed a section of Interstate 5 near Redding in northern California. (I-5 is the main north-south interstate along the West coast.) And that’s only one of the more than 4,500 fire incidents already recorded in the state this year.

Last year, almost 10,000 fires burned more than four million acres here, and everything points to a similar or worse season in 2021. Unlike Donald Trump, who famously blamed California’s fires on a failure to properly rake our forests, President Biden is taking the threat seriously. On June 30th, he convened western state leaders to discuss the problem, acknowledging that “we have to act and act fast. We’re late in the game here.” The president promised a number of measures: guaranteeing sufficient, and sufficiently trained, firefighters; raising their minimum pay to $15 per hour; and making grants to California counties under the Federal Emergency Management Agency’s BRIC (Building Resilient Infrastructure and Communities) program.

Such measures will help a little in the short term, but none of them will make a damn bit of difference in the longer run if the Biden administration and a politically divided Congress don’t begin to truly treat climate change as the immediate and desperately long-term emergency it is.

Justice and Generations

In his famous A Theory of Justice, the great liberal philosopher of the twentieth century John Rawls proposed a procedural method for designing reasonable and fair principles and policies in a given society. His idea: that the people determining such basic policies should act as if they had stepped behind a “veil of ignorance” and had lost specific knowledge of their own place in society. They’d be ignorant of their own class status, ethnicity, or even how lucky they’d been when nature was handing out gifts like intelligence, health, and physical strength.

Once behind such a veil of personal ignorance, Rawls argued, people might make rules that would be as fair as possible, because they wouldn’t know whether they themselves were rich or poor, black or white, old or young — or even which generation they belonged to. This last category was almost an afterthought, included, he wrote, “in part because questions of social justice arise between generations as well as within them.”

His point about justice between generations not only still seems valid to me, but in light of present-day circumstances radically understated. I don’t think Rawls ever envisioned a trans-generational injustice as great as the climate-change one we’re allowing to happen, not to say actively inducing, at this very moment.

Human beings have a hard time recognizing looming but invisible dangers. In 1990, I spent a few months in South Africa providing some technical assistance to an anti-apartheid newspaper. When local health workers found out that I had worked (as a bookkeeper) for an agency in the U.S. trying to prevent the transmission of AIDS, they desperately wanted to talk to me. How, they hoped to learn, could they get people living in their townships to act now to prevent a highly transmissible illness that would only produce symptoms years after infection? How, in the face of the all-too-present emergencies of everyday apartheid life, could they get people to focus on a vague but potentially horrendous danger barreling down from the future? I had few good answers and, almost 30 years later, South Africa has the largest HIV-positive population in the world.

Of course, there are human beings who’ve known about the climate crisis for decades — and not just the scientists who wrote about it as early as the 1950s or the ones who gave an American president an all-too-accurate report on it in 1965. The fossil-fuel companies have, of course, known all along — and have focused their scientific efforts not on finding alternative energy sources, but on creating doubt about the reality of human-caused climate change (just as, once upon a time, tobacco companies sowed doubt about the relationship between smoking and cancer). As early as 1979, the Guardian reports, an internal Exxon study concluded that the use of fossil fuels would certainly “cause dramatic environmental effects” in the decades ahead. “The potential problem is great and urgent,” the study concluded.

A problem that was “great and urgent” in 1979 is now a full-blown existential crisis for human survival.

Some friends and I were recently talking about how ominous the future must look to the younger people we know. “They are really the first generation to confront an end to humanity in their own, or perhaps their children’s lifetimes,” I said.

“But we had The Bomb,” a friend reminded me. “We grew up in the shadow of nuclear war.” And she was right of course. We children of the 1950s and 1960s grew up knowing that someone could “press the button” at any time, but there was a difference. Horrifying as is the present retooling of our nuclear arsenal (going on right now, under President Biden), nuclear war nonetheless remains a question of “if.” Climate change is a matter of “when” and that when, as anyone living in the Northwest of the United States and Canada should know after these last weeks, is all too obviously now.

It’s impossible to overstate the urgency of the moment. And yet, as a species, we’re acting like the children of indulgent parents who provide multiple “last chances” to behave. Now, nature has run out of patience and we’re running out of chances. So much must be done globally, especially to control the giant fossil-fuel companies. We can only hope that real action will emerge from November’s international climate conference. And here in the U.S., unless congressional Democrats succeed in ramming through major action to stop climate change before the 2022 midterms, we’ll have lost one more last, best chance for survival.

Copyright 2021 Rebecca Gordon


Via Tomdispatch.com

Social Security Versus National Security: Whose Entitlement Really Makes Us Safer?
https://www.juancole.com/2021/06/security-national-entitlement.html (June 16, 2021)

( Tomdispatch.com ) – “Biden’s doing better than I thought he would.”

“Yeah. Vaccinations, infrastructure, acknowledging racism in policing. A lot of pieces of the Green New Deal, without calling it that. The child subsidies. It’s kind of amazing.”

“But on the military–”

“Yeah, same old, same old.”

As my friends and I have noticed, President Joe Biden remains super-glued to the same old post-World War II agreement between the two major parties: they can differ vastly on domestic policies, but they remain united when it comes to projecting U.S. military power around the world and to the government spending that sustains it. In other words, the U.S. “national security” budget is still the third rail of politics in this country.

Assaulting the Old New Deal

It was Democratic House Speaker Tip O’Neill who first declared that Social Security is “the third rail” of American politics. In doing so, he metaphorically pointed to the high-voltage rail that runs between the tracks of subways and other light-rail systems. Touch that and you’ll electrocute yourself.

O’Neill made that observation back in 1981, early in Ronald Reagan’s first presidential term, at a moment when the new guy in Washington was already hell-bent on dismantling Franklin Delano Roosevelt’s New Deal legacy.

Reagan would fight his campaign to do so on two key fronts. First, he would attack labor unions, whose power had expanded in the years since the 1935 Wagner Act (officially the National Labor Relations Act) guaranteed workers the right to bargain collectively with their employers over wages and workplace rules. Such organizing rights had been hard-won indeed. Not a few workers died at the hands of the police or domestic mercenaries like Pinkerton agents, especially in the early 1930s. By the mid-1950s, union membership would peak at around 35% of workers, while wages would continue to grow into the late 1970s, when they stagnated and began their long decline.

Reagan’s campaign began with an attack on PATCO, a union of well-paid professionals — federally employed air-traffic controllers — which the Federal Labor Relations Authority eventually decertified. That initial move signaled the Republican Party’s willingness, even enthusiasm, for breaking with decades of bipartisan support for organized labor. By the time Donald Trump took office in the next century, it was a given that Republicans would openly support anti-union measures like federal “right-to-work” laws, which, if passed, would make it illegal for employers to agree to a union-only workplace and so effectively destroy the bargaining power of unions. (Fortunately, opponents were able to forestall that move during Trump’s presidency, but in February 2021, Republicans reintroduced their National Right To Work Act.)

The Second Front and the Third Rail

There was a second front in Reagan’s war on the New Deal. He targeted a group of programs from that era that came to be known collectively as “entitlements.” Three of the most important were Aid to Dependent Children, unemployment insurance, and Social Security. In addition, in 1965, a Democratic Congress had added a healthcare entitlement, Medicare, which helps cover medical expenses for those over 65 and younger people with specific chronic conditions, as well as Medicaid, which does the same for poor people who qualify. These, too, would soon be in the Republican gunsights.

The story of Reagan’s racially inflected attacks on welfare programs is well-known. His administration’s urge to go after unemployment insurance, which provided payments to laid-off workers, was less commonly acknowledged. In language eerily echoed by Republican congressional representatives today, the Reagan administration sought to reduce the length of unemployment benefits, so that workers would be forced to take any job at any wage. A 1981 New York Times report, for instance, quoted Reagan Assistant Secretary of Labor Albert Angrisani as saying:

“‘The bottom line… is that we have developed two standards of work, available work and desirable work.’ Because of the availability of unemployment insurance and extended benefits, he said, ‘there are jobs out there that people don’t want to take.’”

Reagan did indeed get his way with unemployment insurance, but when he turned his sights on Social Security, he touched Tip O’Neill’s third rail.

Unlike welfare, whose recipients are often framed as lazy moochers, and unemployment benefits, which critics claim keep people from working, Social Security was then and remains today a hugely popular program. Because workers contribute to the fund with every paycheck and usually collect benefits only after retirement, beneficiaries appear deserving in the public eye. Of all the entitlement programs, it’s the one most Americans believe that they and their compatriots are genuinely entitled to. They’ve earned it. They deserve it.

So, when the president moved to reduce Social Security benefits, ostensibly to offset a rising deficit in its fund, he was shocked by the near-unanimous bipartisan resistance he met. His White House put together a plan to cut $80 billion over five years by — among other things — immediately cutting benefits and raising the age at which people could begin fully collecting them. Under that plan, a worker who retired early at 62 and was entitled to $248 a month would suddenly see that payout reduced to $162.


Access to early retirement was, and remains, a justice issue for workers with shorter life expectancies — especially when those lives have been shortened by the hazards of the work they do. As South Carolina Republican Congressman Carroll Campbell complained to the White House at the time: “I’ve got thousands of sixty-year-old textile workers who think it’s the end of the world. What the hell am I supposed to tell them?”

After the Senate voted 96-0 to oppose any plan that would “precipitously and unfairly reduce early retirees’ benefits,” the Reagan administration regrouped and worked out a compromise with O’Neill and the Democrats. Economist (later Federal Reserve chair) Alan Greenspan would lead a commission that put together a plan, approved in 1983, to gradually raise the full retirement age, increase the premiums paid by self-employed workers, start taxing benefits received by people with high incomes, and delay cost-of-living adjustments. Those changes were rolled out gradually, the country adjusted, and no politicians were electrocuted in the process.

Panic! The System Is Going Broke!

With its monies maintained in a separately sequestered trust fund, Social Security, unlike most government programs, is designed to be self-sustaining. Periodically, as economist and New York Times columnist Paul Krugman might put it, serious politicians claim to be concerned about that fund running out of money. There’s a dirty little secret that those right-wing deficit slayers never tell you, though: when the Social Security trust fund runs a surplus, as it did from 1983 to 2009, it’s required to invest it in government bonds, indirectly helping to underwrite the federal government’s general fund.

They also aren’t going to mention that one group who contributes to that surplus will never see a penny in benefits: undocumented immigrant workers who pay into the system but won’t ever collect Social Security. Indeed, in 2016, such workers provided an estimated $13 billion out of about $957 billion in Social Security taxes, or roughly 1.4% of total revenues. That may not sound like much, but over the years it adds up. In that way, undocumented workers help subsidize the trust fund and, in surplus years, the entire government.

How, then, is Social Security funded? Each year, employees contribute 6.2% of their wages (up to a cap amount). Employers match that, for a total of 12.4% of wages paid, and each side kicks in another 1.45% for Medicare. Self-employed people pay both shares, for a total of 15.3% of their income, Medicare included. Those contributions add up to about nine-tenths of the fund’s annual income (89% in 2019). Most of the rest comes from interest on the government bonds the trust fund holds, along with income taxes collected on the benefits of higher-income recipients.

So, is the Social Security system finally in trouble? It could be. When the benefits due to a growing number of retirees exceed the fund’s income, its administrators will have to dip into its reserves to make up the difference. As people born in the post-World War II baby boom reach retirement, at a moment when the American population is beginning to age rapidly, dire predictions are resounding about the potential bankruptcy of the system. And there is, in fact, a consensus that the fund will begin drawing down its reserves, possibly starting this year, and could exhaust them as soon as 2034. At that point, relying only on the current year’s income to pay benefits could reduce Social Security payouts to perhaps 79% of what’s promised at present.

You can already hear the cries that the system is going broke!

But it doesn’t have to be that way. Employees and employers only pay Social Security tax on income up to a certain cap. This year it’s $142,800. In other words, employees who make a million dollars in 2021 will contribute no more to Social Security than those who make $142,800. To rescue Social Security, all it would take is raising that cap — or better yet, removing it altogether.
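Here’s a minimal sketch of that arithmetic, assuming nothing beyond the 6.2% employee rate and the 2021 figures quoted above (the function name and the wage examples are simply illustrative):

    # Social Security payroll tax, using the rate and the 2021 cap cited in the text.
    SS_RATE = 0.062          # employee share; the employer pays a matching 6.2%
    WAGE_CAP = 142_800       # 2021 cap on wages subject to the tax

    def employee_social_security_tax(wages):
        """Social Security tax withheld from an employee's annual wages."""
        return SS_RATE * min(wages, WAGE_CAP)

    # The cap means a million-dollar earner owes exactly what a $142,800 earner owes:
    print(employee_social_security_tax(142_800))    # about $8,853.60
    print(employee_social_security_tax(1_000_000))  # the same amount

Raising or removing the cap would change only the min() above: higher earners would simply keep paying the same 6.2% on all of their wages.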

In fact, the Congressional Budget Office has run the numbers and identified two different methods of raising it to eventually tax all wage income. Either would keep the trust fund solvent.

Naturally, plutocrats and their congressional minions don’t want to raise the Social Security cap. They’d rather starve the entitlement beast and blame possible shortfalls on greedy boomers who grew up addicted to government handouts. Under the circumstances, we, and succeeding generations, had better hope that Social Security remains, as it was in 1981, the third rail in American politics.

Welfare for Weapons Makers

Of course, there’s a second high-voltage, untouchable rail in American politics and that’s funding for the military and weapons manufacturers. It takes a brave politician indeed to suggest even the most minor of reductions in Pentagon spending, which has for years been the single largest item of discretionary spending in the federal budget.

It’s notoriously difficult to identify how much money the government actually spends annually on the military. President Trump’s last Pentagon budget, for the fiscal year ending on September 30th, offered about $740 billion to the armed services (not including outlays for veteran services and pensions). Or maybe it was only $705.4 billion. Or perhaps, including Department of Energy outlays involving nuclear weapons, $753.5 billion. (And none of those figures even faintly reflected full national-security spending, which is certainly well over a trillion dollars annually.)

Most estimates put President Biden’s 2022 military budget at $753 billion — about the same as Trump’s for the previous year. As former Senator Everett Dirksen supposedly once said, “A billion here, a billion there, and pretty soon you’re talking real money.”

Indeed, we’re talking real money and real entitlements here that can’t be touched in Washington without risking political electrocution. Unlike actual citizens, U.S. arms manufacturers seem entitled to ever-increasing government subsidies — welfare for weapons, if you like. Beyond the billions spent to directly fund the development and purchase of various weapons systems, every time the government permits arms sales to other countries, it’s filling the coffers of companies like Lockheed Martin, Northrop Grumman, Boeing, and Raytheon Technologies. The real beneficiaries of Donald Trump’s so-called Abraham Accords between Israel and the majority-Muslim states of Morocco, the United Arab Emirates, Bahrain, and Sudan were the U.S. companies that sell the weaponry that sweetened those deals for Israel’s new friends.

When Americans talk about undeserved entitlements, they’re usually thinking about welfare for families, not welfare for arms manufacturers. But military entitlements make the annual federal appropriation of $16.5 billion for Temporary Assistance for Needy Families (TANF) look puny by comparison. In fact, during Republican and Democratic administrations alike, the yearly federal outlay for TANF hasn’t changed since it was established through the 1996 Personal Responsibility and Work Opportunity Reconciliation Act, known in the Clinton era as “welfare reform.” Inflation has, however, eroded its value by about 40% in the intervening years.
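A rough back-of-the-envelope check of that erosion (the cumulative inflation factor of about 1.7 between 1996 and 2021 is my own approximation, not a figure from the article):

    # How a nominal appropriation frozen since 1996 loses real value.
    # The ~1.7 cumulative inflation factor for 1996-2021 is an assumed approximation.
    tanf_billions_nominal = 16.5       # annual TANF appropriation, unchanged since 1996
    cumulative_inflation = 1.7         # approximate price growth, 1996 -> 2021
    real_value = tanf_billions_nominal / cumulative_inflation
    erosion = 1 - real_value / tanf_billions_nominal
    print(f"${real_value:.1f} billion in 1996 dollars, a {erosion:.0%} loss")  # ~$9.7 billion, ~41% loss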

And what do Americans get for those billions no one dares to question? National security, right?

But how is it that the country that spends more on “defense” than the next seven, or possibly 10, countries combined is so insecure that every year’s Pentagon budget must exceed the last one? Why is it that, despite those billions for military entitlements, our critical infrastructure, including hospitals, gas pipelines, and subways (not to mention Cape Cod steamships), lies exposed to hackers?

And if, thanks to that “defense” budget, we’re so secure, why is it that, in my wealthy home city of San Francisco, residents now stand patiently in lines many blocks long to receive boxes of groceries? Why is “national security” more important than food security, or health security, or housing security? Or, to put it another way, which would you rather be entitled to: food, housing, education, and healthcare, or your personal share of a shiny new hypersonic missile?

But wait! Maybe defense spending contributes to our economic security by creating, as Donald Trump boasted in promoting his arms deals with Saudi Arabia, “jobs, jobs, jobs.” It’s true that spending on weaponry does, in fact, create jobs, just not nearly as many as investing taxpayer dollars in a variety of far less lethal endeavors would. As Brown University’s Costs of War project reports:

“Military spending creates fewer jobs than the same amount of money would have, if invested in other sectors. Clean energy and health care spending create 50% more jobs than the equivalent amount of spending on the military. Education spending creates more than twice as many jobs.”

It seems that President Joe Biden is ready to shake things up by attacking child poverty, the coronavirus pandemic, and climate change, even if he has to do it without any Republican support. But he’s still hewing to the old Cold War bipartisan alliance when it comes to the real third rail of American politics — military spending. Until the power can be cut to that metaphorical conduit, real national security remains an elusive dream.

Copyright 2021 Rebecca Gordon

Follow TomDispatch on Twitter and join us on Facebook. Check out the newest Dispatch Books, John Feffer’s new dystopian novel, Songlands (the final one in his Splinterlands series), Beverly Gologorsky’s novel Every Body Has a Story, and Tom Engelhardt’s A Nation Unmade by War, as well as Alfred McCoy’s In the Shadows of the American Century: The Rise and Decline of U.S. Global Power and John Dower’s The Violent American Century: War and Terror Since World War II.

Via Tomdispatch.com

Can Biden-Harris make a Counter-Revolution against Reaganism and Restore Dignity to Work? https://www.juancole.com/2021/03/counter-revolution-reaganism.html Wed, 10 Mar 2021 05:01:07 +0000 https://www.juancole.com/?p=196560 ( Tomdispatch.com ) – A year ago, just a few weeks before San Francisco locked itself down for the pandemic, I fell deeply in love with a 50-year-old. The object of my desire was a wooden floor loom in the window of my local thrift shop. Friends knowledgeable on such matters examined photos I took of it and assured me that all the parts were there, so my partner (who puts up with such occasional infatuations) helped me wrangle it into one of our basement rooms and I set about learning to weave.

These days, all I want to do is weave. The loom that’s gripped me, and the pandemic that’s gripped us all, have led me to rethink the role of work (and its subset, paid labor) in human lives. During an enforced enclosure, this 68-year-old has spent a lot of time at home musing on what the pandemic has revealed about how this country values work. Why, for example, do the most “essential” workers so often earn so little — or, in the case of those who cook, clean, and care for the people they live with, nothing at all? What does it mean when conservatives preach the immeasurable value of labor, while insisting that its most basic price in the marketplace shouldn’t rise above $7.25 per hour?

That, after all, is where the federal minimum wage has been stuck since 2009. And that’s where it would probably stay forever, if Republicans like Kansas Senator Roger Marshall had their way. He brags that he put himself through college making $6 an hour and doesn’t understand why people can’t do the same today for $7.25. One likely explanation: the cost of a year at Kansas State University has risen from $898 when he was at school to $10,000 today. Another? At six bucks an hour, he was already making almost twice the minimum wage of his college years, a princely $3.35 an hour.

It’s Definitely Not Art, But Is It Work?

It’s hard to explain the pleasure I’ve gotten from learning the craft of weaving, an activity whose roots extend at least 20,000 years into the past. In truth, I could devote the next (and most likely last) 20 years of my life just to playing with “plain weave,” its simplest form — over-under, over-under — and not even scratch the surface of its possibilities. Day after day, I tromp down to our chilly basement and work with remarkable satisfaction at things as simple as getting a straight horizontal edge across my cloth.

But is what I’m doing actually “work”? Certainly, at the end of a day of bending under the loom to tie things up, of working the treadles to raise and lower different sets of threads, my aging joints are sore. My body knows all too well that I’ve been doing something. But is it work? Heaven knows, I’m not making products crucial to our daily lives or those of others. (We now possess more slightly lopsided cloth napkins than any two-person household could use in a lifetime.) Nor, at my beginner’s level, am I producing anything that could pass for “art.”

I don’t have to weave. I could buy textiles for a lot less than it costs me to make them. But at my age, in pandemic America, I’m lucky. I have the time, money, and freedom from personal responsibilities to be able to immerse myself in making cloth. For me, playing with string is a first-world privilege. It won’t help save humanity from a climate disaster or reduce police violence in communities of color. It won’t even help a union elect an American president, something I was focused on last fall, while working with the hospitality-industry union. It’s not teaching college students to question the world and aspire to living examined lives, something I’ve done in my official work as a part-time professor for the last 15 years. It doesn’t benefit anyone but me.

Nevertheless, what I’m doing certainly does have value for me. It contributes, as philosophers might say, to my human flourishing. When I practice weaving, I’m engaged in something political philosopher Iris Marion Young believed essential to a good life. As she put it, I’m “learning and using satisfying and expansive skills.” Young thought that a good society would offer all its members the opportunity to acquire and deploy such complicated skills in “socially recognized settings.” In other words, a good society would make it possible for people to do work that was both challenging and respected.

Writing in the late 1980s, she took for granted that the “welfare capitalism” of Europe, and to a far lesser extent of the United States, would provide for people’s basic material needs. Unfortunately, decades later, it’s hard even to teach her critique of such welfare capitalism — a system that sustained lives but didn’t necessarily allow them to flourish — because my students here have never experienced an economic system that assumes any real responsibility for sustaining life. Self-expression and an opportunity to do meaningful work? Pipe dreams if you aren’t already well-off! They’ll settle for jobs that pay the rent, keep the refrigerator stocked, and maybe provide some health benefits as well. That would be heaven enough, they say. And who could blame them when so many jobs on offer will fall far short of even such modest goals?

What I’m not doing when I weave is making money. I’m not one of the roughly 18 million workers in this country who do earn their livings in the textile industry. Such “livings” pay a median wage of about $28,000 a year, which likely makes it hard to keep a roof over your head. Nor am I one of the many millions more who do the same around the world, people like Seak Hong who sews garments and bags for an American company in Cambodia. Describing her life, she told a New York Times reporter, “I feel tired, but I have no choice. I have to work.” Six days a week,

“Ms. Hong wakes up at 4:35 a.m. to catch the truck to work from her village. Her workday begins at 7 and usually lasts nine hours, with a lunch break. During the peak season, which lasts two to three months, she works until 8:30 p.m.”

“Ms. Hong has been in the garment business for 22 years. She earns the equivalent of about $230 a month and supports her father, her sister, her brother (who is on disability) and her 12-year-old son.”

Her sister does the unpaid — but no less crucial — work of tending to her father and brother, the oxen, and their subsistence rice plants.

Hong and her sister are definitely working, one with pay, the other without. They have, as she says, no choice.

Catherine Gamet, who makes handbags in France for Louis Vuitton, is also presumably working to support herself. But hers is an entirely different experience from Hong’s. She loves what she’s been doing for the last 23 years. Interviewed in the same article, she told the Times, “To be able to build bags and all, and to be able to sew behind the machine, to do hand-sewn products, it is my passion.” For Gamet, “The time flies by.”

Both these women have been paid to make bags for more than 20 years, but they’ve experienced their jobs very differently, undoubtedly thanks to the circumstances surrounding their work, rather than the work itself: how much they earn; the time they spend traveling to and from their jobs; the extent to which the “decision” to do a certain kind of work is coerced by fear of poverty. We don’t learn from Hong’s interview how she feels about the work itself. Perhaps she takes pride in what she does. Most people find a way to do that. But we know that making bags is Gamet’s passion. Her work is not merely exhausting, but in Young’s phrase “satisfying and expansive.” The hours she spends on it are lived, not just endured as the price of survival.

Pandemic Relief and Its Discontents

Joe Biden and Kamala Harris arrived at the White House with a commitment to getting a new pandemic relief package through Congress as soon as possible. It appears that they’ll succeed, thanks to the Senate’s budget reconciliation process — a maneuver that bypasses the possibility of a Republican filibuster. Sadly, because resetting the federal minimum wage to $15 per hour doesn’t directly involve taxation or spending, the Senate’s parliamentarian ruled that the reconciliation bill can’t include it.



Several measures contained in the package have aroused conservative mistrust, from the extension of unemployment benefits to new income supplements for families with children. Such measures provoke a Republican fear that somebody, somewhere, might not be working hard enough to “deserve” the benefits Congress is offering or that those benefits might make some workers think twice about sacrificing their time caring for children to earn $7.25 an hour at a soul-deadening job.

As New York Times columnist Ezra Klein recently observed, Republicans are concerned that such measures might erode respect for the “natural dignity” of work. In an incisive piece, he rebuked Republican senators like Mike Lee and Marco Rubio for responding negatively to proposals to give federal dollars to people raising children. Such a program, they insisted, smacked of — the horror! — “welfare,” while in their view, “an essential part of being pro-family is being pro-work.” Of course, for Lee and Rubio “work” doesn’t include changing diapers, planning and preparing meals, doing laundry, or helping children learn to count, tell time, and tie their shoelaces — unless, of course, the person doing those things is employed by someone else’s family and being paid for it. In that case it qualifies as “work.” Otherwise, it’s merely a form of government-subsidized laziness.

There is, however, one group of people that “pro-family” conservatives have long believed are naturally suited to such activities and who supposedly threaten the well-being of their families if they choose to work for pay instead. I mean, of course, women whose male partners earn enough to guarantee food, clothing, and shelter with a single income. I remember well a 1993 article by Pat Gowens, a founder of Milwaukee’s Welfare Warriors, in the magazine Lesbian Contradiction. She wondered why conservative anti-feminists of that time thought it good if a woman with children had a man to provide those things, but an outrage if she turned to “The Man” for the same aid. In the first case, the woman’s work is considered dignified, sacred, and in tune with the divine plan. Among conservatives, then or now, the second could hardly be dignified with the term “work.”

The distinction they make between private and public paymasters, when it comes to domestic labor, contains at least a tacit, though sometimes explicit, racial element. When the program that would come to be known as “welfare” was created as part of President Franklin Roosevelt’s New Deal in the 1930s, it was originally designed to assist respectable white mothers who, through no fault of their own, had lost their husbands to death or desertion. It wasn’t until the 1960s that African American women decided to secure their right to coverage under the same program and built the National Welfare Rights Organization to do so.

The word “welfare” refers, as in the preamble to the Constitution, to human wellbeing. But when Black women started claiming those rights, it suddenly came to signify undeserved handouts. You could say that Ronald Reagan rode into the White House in 1980 in a Cadillac driven by the mythical Black “welfare queen” he continually invoked in his campaign. It would be nice to think that the white resentment harnessed by Reagan culminated (as in “reached its zenith and will now decline”) with Trump’s 2016 election, but, given recent events, that would be unrealistically optimistic.

Reagan began the movement to undermine the access of poor Americans to welfare programs. Ever since, starving the entitlement beast has been the Republican lodestar. In the same period, of course, the wealthier compatriots of those welfare mothers have continued to receive ever more generous “welfare” from the government. Those would include subsidies to giant agriculture, oil-depletion allowances and other subsidies for fossil-fuel companies, the mortgage-interest tax deduction for people with enough money to buy rather than rent their homes, and the massive tax cuts for billionaires of the Trump era. However, it took a Democratic president, Bill Clinton, to achieve what Reagan couldn’t, and, as he put it, “end welfare as we know it.”

The Clinton administration used the same Senate reconciliation process in play today for the Biden administration’s Covid-19 relief bill to push through the 1996 Personal Responsibility and Work Opportunity Reconciliation Act. It was more commonly known as “welfare reform.” That act imposed a 32-hour-per-week work or training requirement on mothers who received what came to be known as Temporary Assistance for Needy Families. It also gave “temporary” its deeper meaning by setting a lifetime benefits cap of five years. Meanwhile, that same act proved a bonanza for non-profits and Private Industry Councils that got contracts to administer “job training” programs and were paid to teach women how to wear skirts and apply makeup to impress future employers. In the process, a significant number of unionized city and county workers nationwide were replaced with welfare recipients “earning” their welfare checks by sweeping streets or staffing county offices, often for less than the minimum wage.

In 1997, I was working with Californians for Justice (CFJ), then a new statewide organization dedicated to building political power in poor communities, especially those of color. Given the high unemployment rates in just such communities, our response to Clinton’s welfare reforms was to demand that those affected by them at least be offered state-funded jobs at a living wage. If the government was going to make people work for pay, we reasoned, then it should help provide real well-paying jobs, not bogus “job readiness” programs. We secured sponsors in the state legislature, but I’m sure you won’t be shocked to learn that our billion-dollar jobs bill never got out of committee in Sacramento.

CFJ’s project led me into an argument with one of my mentors, the founder of the Center for Third World Organizing, Gary Delgado. Why on earth, he asked me, would you campaign to get people jobs? “Jobs are horrible. They’re boring: they waste people’s lives and destroy their bodies.” In other words, Gary was no believer in the inherent dignity of paid work. So, I had to ask myself, why was I?

Among those who have inspired me, Gary wasn’t alone in holding such a low opinion of jobs. The Greek philosopher Aristotle, for instance, had been convinced that those whose economic condition forced them to work for a living would have neither the time nor space necessary to live a life of “excellence” (his requirement for human happiness). Economic coercion and a happy life were, in his view, mutually exclusive.

Reevaluating Jobs

One of the lies capitalism tells us is that we should be grateful for our jobs and should think of those who make a profit from our labor not as exploiters but as “job creators.” In truth, however, there’s no creativity involved in paying people less than the value of their work so that you can skim off the difference and claim that you earned it. Even if we accept that there could be creativity in “management” — the effort to organize and divide up work so it’s done efficiently and well — it’s not the “job creators” who do that, but their hirelings. All the employers bring to the game is money.

Take the example of the admirable liberal response to the climate emergency, the Green New Deal. In the moral calculus of capitalism, it’s not enough that shifting to a green economy could promote the general welfare by rebuilding and extending the infrastructure that makes modern life possible and rewarding. It’s not enough that it just might happen in time to save billions of people from fires, floods, hurricanes, or starvation. What matters — the selling point — is that such a conversion would create jobs (along with the factor no one mentions out loud: profits).

Now, I happen to support exactly the kind of work involved in building an economy that could help reverse climate devastation. I agree with Joe Biden’s campaign statement that such an undertaking could offer people jobs with “good wages, benefits, and worker protections.” More than that, such jobs would indeed contribute to a better life for those who do them. As the philosopher Iris Marion Young puts it, they would provide the chance to learn and use “satisfying and expansive skills in a socially recognized setting.” And that would be a very good thing even if no one made a penny of profit in the process.

Now, having finished my paid labor for the day, it’s back to the basement and loom for me.

Copyright 2021 Rebecca Gordon

Via Tomdispatch.com

Time to Roll up our Sleeves and Fix this: Trump left us with Heightened Climate, Nuclear Danger https://www.juancole.com/2021/02/heightened-climate-nuclear.html Wed, 10 Feb 2021 05:01:46 +0000 https://www.juancole.com/?p=196066 ( Tomdispatch.com ) – If you live in California, you’re likely to be consumed on occasion by thoughts of fire. That’s not surprising, given that, in the last year alone, actual fires consumed over four and a quarter million acres of the state, taking with them 10,488 structures, 33 human lives, and who knows how many animals. By the end of this January, a month never before even considered part of the “fire” season, 10 wildfires had already burned through 2,337 more acres, according to the California Department of Forestry and Fire Protection (CalFire).

With each passing year, the state’s fire season arrives earlier and does greater damage. In 2013, a mere eight years ago, fires consumed about 602,000 acres and started significantly later. That January, CalFire reported only a single fire, just two in February, and none in March. Fire season didn’t really begin until April and had tapered off before year’s end. This past December, however, 10 fires still burned at least 10,000 acres. In fact, it almost doesn’t make sense to talk about a fire “season” anymore. Whatever the month, wildfires are likely to be burning somewhere in the state.

Clearly, California’s fires (along with Oregon’s and Washington’s) are getting worse. Just as clearly, notwithstanding Donald Trump’s exhortations to do a better job of “raking” our forests, climate change is the main cause of this growing disaster.

Fortunately, President Joe Biden seems to take the climate emergency seriously. In just his first two weeks in office, he’s cancelled the Keystone XL pipeline project, forbidden new drilling for oil or gas on public lands, and announced a plan to convert the entire federal fleet of cars and trucks to electric vehicles. Perhaps most important of all, he’s bringing the U.S. back into the Paris climate accords, signaling an understanding that a planetary crisis demands planetwide measures and that the largest carbon-emitting economies should be leading the way. “This isn’t [the] time for small measures,” Biden has said. “We need to be bold.”

Let’s just hope that such boldness has arrived in time and that the Biden administration proves unwilling to sacrifice the planet on an altar of elusive congressional unity and illusionary bipartisanship.

Another Kind of Fire

If climate change threatens human life as we know it, so does another potential form of “fire” — the awesome power created when a nuclear reaction converts matter to energy. This is the magic of Einstein’s observation that E = mc², which says that the energy contained in a bit of matter equals its mass (roughly speaking, its weight) multiplied by the square of the speed of light, measured in meters per second. As we’ve all known since August 6, 1945, when an atomic bomb was dropped on the Japanese city of Hiroshima, that’s an awful lot of energy. When a nuclear reaction is successfully controlled, the energy can be regulated and used to produce electricity without emitting carbon dioxide in the process.
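For a sense of the scale involved, here is a rough, back-of-the-envelope illustration of that equation, using the rounded value c ≈ 3×10^8 meters per second (the numbers are my own, not the author’s):

    # E = m * c^2 for a single gram of matter, in round numbers.
    c = 3.0e8                                # speed of light in meters per second (rounded)
    mass_kg = 0.001                          # one gram, expressed in kilograms
    energy_joules = mass_kg * c**2           # roughly 9e13 joules
    kilotons_tnt = energy_joules / 4.184e12  # one kiloton of TNT is about 4.184e12 joules
    print(f"{energy_joules:.1e} joules, or roughly {kilotons_tnt:.0f} kilotons of TNT")

In other words, fully converting a single gram of matter releases on the order of 20 kilotons of TNT, more than the roughly 15-kiloton yield of the Hiroshima bomb.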

Unfortunately, while nuclear power plants don’t add greenhouse gasses to the atmosphere, they do create radioactive waste, some of which remains deadly for thousands of years. Industry advocates who argue for nuclear power as a “green” alternative generally ignore the problem (which has yet to be solved) of disposing of that waste.

In what hopefully is just a holdover from the Trump administration, the Energy Department website still “addresses” this issue by suggesting that all the nuclear waste produced to date “could fit on a football field at a depth of less than 10 yards!” The site neglects to add that, if you shoved all that nuclear waste together the wrong way, the resultant explosive chain reaction would probably wipe out most life on Earth.

Remember, too, that “controlled” nuclear reactions don’t always remain under human control. Ask anyone who lived near the Three Mile Island nuclear reactor in Pennsylvania, the Chernobyl nuclear power plant in Ukraine, or the Fukushima Daiichi nuclear power plant in Japan.

There is, however, another far more devastating form of “controlled” nuclear reaction, the kind created when a nuclear bomb explodes. Only one country has ever deployed atomic weapons in war, of course: the United States, in its attack on Hiroshima and, three days later, on Nagasaki. Those early fission bombs (the first fueled by uranium, the second by plutonium) were puny by the standards of today’s nuclear weapons. Still, the horror of those attacks was sufficient to convince many that such weapons should never be used again.

Treaties and Entreaties

In the decades since 1945, various configurations of nations have agreed to treaties prohibiting the use of, or limiting the proliferation of, nuclear weapons — even as the weaponry spread and nuclear arsenals grew. In the Cold War decades, the most significant of these were the bilateral pacts between the two superpowers of the era, the U.S. and the Soviet Union. When the latter collapsed in 1991, Washington signed treaties instead with the Russian Federation government, the most recent being the New START treaty, which came into effect in 2011 and was just extended by Joe Biden and Vladimir Putin.

In addition to such bilateral agreements, the majority of nations on the planet agreed on various multilateral pacts, including the Nuclear Non-Proliferation Treaty, or NPT, which has been signed by 191 countries and has provided a fairly effective mechanism for limiting the spread of such arms. Today, there are still “only” nine nuclear-armed states. Of these, five are signatories of the NPT: China, France, Russia, the United Kingdom, and the United States. Israel has never publicly acknowledged its growing nuclear arsenal. Three other nuclear-armed countries remain outside the treaty: India and Pakistan never signed it, and North Korea withdrew from it in 2003. Worse yet, in 2005, the George W. Bush administration inked a side-deal with India that gave Washington’s blessing to the acceleration of that country’s nuclear weapons development program outside the monitoring constraints of the NPT.


The treaty assigns to the International Atomic Energy Agency (IAEA) the authority to monitor compliance. It was this treaty, for example, that gave the IAEA the right to inspect Iraq’s nuclear program in the period before the U.S. invaded in 2003. Indeed, the IAEA repeatedly reported that Iraq was, in fact, in compliance with the treaty in the months that preceded the invasion, despite the claims of the Bush administration that Iraqi ruler Saddam Hussein had such weaponry. The United States must act, President Bush insisted then, before the “smoking gun” of proof the world demanded turned out to be a “mushroom cloud” over some American city. As became clear after the first few months of the disastrous U.S. military occupation, there simply were no weapons of mass destruction in Iraq. (At least partly in recognition of the IAEA’s attempts to forestall that U.S. invasion, the agency and its director general, Mohamed El Baradei, would receive the 2005 Nobel Peace Prize.)

Like Iraq, Iran also signed the NPT in 1968, laying the foundation for ongoing IAEA inspections there. In recent years, having devastated Iraq’s social, economic, and political infrastructure, the United States shifted its concern about nuclear proliferation to Iran. In 2015, along with China, Russia, France, the United Kingdom, Germany, and the European Union, the Obama administration signed the Joint Comprehensive Plan of Action (JCPOA), informally known as the Iran nuclear deal.

Under the JCPOA, in return for the lifting of onerous economic sanctions that were affecting the whole population, Iran agreed to limit the development of its nuclear capacity to the level needed to produce electricity. Again, IAEA scientists would be responsible for monitoring the country’s compliance, which by all accounts was more than satisfactory — at least until 2018. That’s when President Donald Trump unilaterally pulled the U.S. out of the agreement and reimposed heavy sanctions. Since then, as its economy has been crushed, Iran has, understandably enough, grown reluctant to uphold its end of the bargain.

In the years since 1945, the world has seen treaties signed to limit or ban the testing of nuclear weapons or to cap the size of nuclear arsenals, as well as bilateral treaties to decommission parts of existing ones, but never a treaty aimed at outlawing nuclear weapons altogether. Until now. On January 22, 2021, the United Nations Treaty on the Prohibition of Nuclear Weapons took effect. Signed so far by 86 countries, the treaty represents “a legally binding instrument to prohibit nuclear weapons, leading towards their total elimination,” according to the U.N. Sadly, but unsurprisingly, none of the nine nuclear powers are signatories.

“Fire and Fury”

I last wrote about nuclear danger in October 2017 when Donald Trump had been in the White House less than a year and, along with much of the world, I was worried that he might bungle his way into a war with North Korea. Back then, he and Kim Jong-un had yet to fall in love or to suffer their later public breakup. Kim was still “Little Rocket Man” to Trump, who had threatened to “rain fire and fury like the world has never seen” on North Korea.

The world did, in the end, survive four years of a Trump presidency without a nuclear war, but that doesn’t mean he left us any safer. On the contrary, he took a whole series of rash steps leading us closer to nuclear disaster:

  • He pulled the U.S. out of the JCPOA, thereby destabilizing the Iran nuclear agreement and reigniting Iran’s threats of (and apparent efforts toward) someday developing nuclear weapons.
  • He withdrew from the 1987 Intermediate Range Nuclear Forces Treaty between the U.S. and the Soviet Union (later the Russian Federation), which, according to the nonpartisan Arms Control Association,

“required the United States and the Soviet Union to eliminate and permanently forswear all of their nuclear and conventional ground-launched ballistic and cruise missiles with ranges of 500 to 5,500 kilometers. The treaty marked the first time the superpowers had agreed to reduce their nuclear arsenals, eliminate an entire category of nuclear weapons, and employ extensive on-site inspections for verification.”

  • He withdrew from the Open Skies Treaty, which gave signatories permission to fly over each other’s territories to identify military installations and activities. Allowing this kind of access was meant to contribute to greater trust among nuclear-armed nations.
  • He threatened to allow the New START Treaty to expire, should he be reelected.
  • He presided over a huge increase in spending on the “modernization” of the U.S. nuclear arsenal, including on new submarine- and land-based launching capabilities. A number of these programs are still in their initial stages and could be stopped by the Biden administration.

In January 2021, after four years of Trump, the Bulletin of the Atomic Scientists kept its “Doomsday Clock” set at a mere 100 seconds to midnight, the closest it has ever been to catastrophe. Since 1947, that Clock’s periodic resetting has reflected how close, in the view of the Bulletin’s esteemed scientists and Nobel laureates, humanity has come to ending it all. As the Bulletin’s editors note, “The Clock has become a universally recognized indicator of the world’s vulnerability to catastrophe from nuclear weapons, climate change, and disruptive technologies in other domains.”

Why so close to midnight? The magazine lists a number of reasons, including the increased danger of nuclear war, due in large part to steps taken by the United States in the Trump years, as well as to the development of “hypersonic” missiles, which are supposed to fly at five times the speed of sound and so evade existing detection systems. (Trump famously referred to these “super-duper” weapons as “hydrosonic,” a term that actually describes a kind of toothbrush.) There is disagreement among weapons experts about the extent to which such delivery vehicles will live up to the (hyper) hype about them, but the effort to build them is destabilizing in its own right.

The Bulletin points to a number of other factors that place humanity in ever greater danger. One is, of course, the existential threat of climate change. Another is the widespread dissemination of “false and misleading information.” The spread of lies about Covid-19, its editors say, exemplifies the life-threatening nature of a growing “wanton disregard for science and the large-scale embrace of conspiratorial nonsense.” This is, they note, “often driven by political figures and partisan media.” Such attacks on knowledge itself have “undermined the ability of responsible national and global leaders to protect the security of their citizens.”

Passing the (Nuclear) Ball

When Donald Trump announced that he wouldn’t attend the inauguration of Joe Biden and Kamala Harris, few people were surprised. After all, he was still insisting that he’d actually won the election, even after that big lie fueled an insurrectionary invasion of the Capitol. But there was another reason for concern: if Trump was going to be at Mar-a-Lago, how would he hand over the “nuclear football” to the new president? That “football” is, in fact, a briefcase containing the nuclear launch codes, which presidents always have with them. Since the dawn of the nuclear age, it’s been passed from the outgoing president to the new one on Inauguration Day.

Consternation! The problem was resolved through the use of two briefcases, which were simultaneously deactivated and activated at 11:59:59 a.m. on January 20th, just as Biden was about to be sworn in.

The football conundrum pointed to a far more serious problem, however — that the fate of humanity regularly hangs on the actions of a single individual (whether as unbalanced as Donald Trump or as apparently sensible as Joe Biden) who has the power to begin a war that could end our species.

There’s good reason to think that Joe Biden will be more reasonable about the dangers of nuclear warfare than the narcissistic idiot he succeeds. In addition to agreeing to extend the New START treaty, he’s also indicated a willingness to rejoin the Iran nuclear deal and criticized Trump’s nuclear buildup. Nevertheless, the power to end the world shouldn’t lie with one individual. Congress could address this problem by enacting, as I suggested in 2017, “a law that would require a unanimous decision by a specified group of people (for example, officials like the secretaries of state and defense together with the congressional leadership) for a nuclear first strike.”

The Fire Next Time?

“God gave Noah the rainbow sign
No more water but the fire next time”

These words come from the African-American spiritual “I Got a Home in that Rock.” The verse refers to God’s promise to Noah in Genesis, after the great flood, never again to destroy all life on earth, a promise signified by the rainbow.

Those who composed the hymn may have been a bit less trusting of God — or of human destiny — than the authors of Genesis, since the Bible account says nothing about fire or a next time. Sadly, recent human history suggests that there could indeed be a next time. If we do succeed in destroying ourselves, it seems increasingly likely that it will be by fire, whether the accelerating heating of the globe over decades, or a nuclear conflagration any time we choose. The good news, the flame of hope, is that we still have time — at least 100 seconds — to prevent it.

Copyright 2021 Rebecca Gordon

Follow TomDispatch on Twitter and join us on Facebook. Check out the newest Dispatch Books, John Feffer’s new dystopian novel Frostlands (the second in the Splinterlands series), Beverly Gologorsky’s novel Every Body Has a Story, and Tom Engelhardt’s A Nation Unmade by War, as well as Alfred McCoy’s In the Shadows of the American Century: The Rise and Decline of U.S. Global Power and John Dower’s The Violent American Century: War and Terror Since World War II.

Via Tomdispatch.com

The Rubble of Empire: Doctrines of Disaster and Dreams of Security as the Biden Years Begin https://www.juancole.com/2021/01/doctrines-disaster-security.html Wed, 20 Jan 2021 05:01:22 +0000 https://www.juancole.com/?p=195654 ( Tomdispatch.com ) – How can you tell when your empire is crumbling? Some signs are actually visible from my own front window here in San Francisco.

Directly across the street, I can see a collection of tarps and poles (along with one of my own garbage cans) that were used to construct a makeshift home on the sidewalk. Beside that edifice stands a wooden cross decorated with a string of white Christmas lights and a red ribbon — a memorial to the woman who built that structure and died inside it earlier this week. We don’t know — and probably never will — what killed her: the pandemic raging across California? A heart attack? An overdose of heroin or fentanyl?

Behind her home and similar ones is a chain-link fence surrounding the empty playground of the Horace Mann/Buena Vista elementary and middle school. Like that home, the school, too, is now empty, closed because of the pandemic. I don’t know where the families of the 20 children who attended that school and lived in one of its gyms as an alternative to the streets have gone. They used to eat breakfast and dinner there every day, served on the same sidewalk by a pair of older Latina women who apparently had a contract from the school district to cook for the families using that school-cum-shelter. I don’t know, either, what any of them are now doing for money or food.

Just down the block, I can see the line of people that has formed every weekday since early December. Masked and socially distanced, they wait patiently to cross the street, one at a time, for a Covid test at a center run by the San Francisco Department of Health. My little street seems an odd choice for such a service, since — especially now that the school has closed — it gets little foot traffic. Indeed, a representative of the Latino Task Force, an organization created to inform the city’s Latinx population about Covid resources, told our neighborhood paper Mission Local that

“Small public health clinics such as this one ‘will say they want to do more outreach, but I actually think they don’t want to.’ He believes they chose a low-trafficked street like Bartlett to stay under the radar. ‘They don’t want to blow the spot up, because it does not have a large capacity.’”

What do any of these very local sights have to do with a crumbling empire? They’re signs that some of the same factors that fractured the Roman empire back in 476 CE (and others since) are distinctly present in this country today — even in California, one of its richest states. I’m talking about phenomena like gross economic inequality; over-spending on military expansion; political corruption; deep cultural and political fissures; and, oh yes, the barbarians at the gates. I’ll turn to those factors in a moment, but first let me offer a brief defense of the very suggestion that U.S. imperialism and an American empire actually exist.

Imperialism? What’s That Supposed to Mean?

What better source for a definition of imperialism than the Encyclopedia Britannica, that compendium of knowledge first printed in 1768 in the country that became the great empire of the nineteenth and first part of the twentieth centuries? According to the Encyclopedia, “imperialism” denotes “state policy, practice, or advocacy of extending power and dominion, especially by direct territorial acquisition or by gaining political and economic control of other areas.” Furthermore, imperialism “always involves the use of power, whether military or economic or some subtler form.” In other words, the word indicates a country’s attempts to control and reap economic benefit from lands outside its borders.

In that context, “imperialism” is an accurate description of the trajectory of U.S. history, starting with the country’s expansion across North America, stealing territory and resources from Indian nations and decimating their populations. The newly independent United States would quickly expand, beginning with the 1803 Louisiana Purchase from France. That deal, which effectively doubled its territory, included most of what would become the state of Louisiana, together with some or all of the present-day states of New Mexico, Texas, Arkansas, Missouri, Oklahoma, Kansas, Colorado, Iowa, Nebraska, Wyoming, Minnesota, North and South Dakota, Montana, and even small parts of what are today the Canadian provinces of Alberta and Saskatchewan.


Of course, France didn’t actually control most of that land, apart from the port city of New Orleans and its immediate environs. What Washington bought was the “right” to take the rest of that vast area from the native peoples who lived there, whether by treaty, population transfers, or wars of conquest and extermination. The first objective of that deal was to settle land on which to expand the already hugely lucrative cotton business, that economic engine of early American history fueled, of course, by slave labor. It then supplied raw materials to the rapidly industrializing textile industry of England, which drove that country’s own imperial expansion.

U.S. territorial expansion continued as, in 1819, Florida was acquired from Spain and, in 1845, Texas was forcibly annexed from Mexico (with much of California seized in the war that began a year later). All of those acquisitions accorded with what newspaper editor John O’Sullivan would soon call the country’s manifest — that is, clear and obvious — destiny to control the entire continent.

Eventually, such expansionism escaped even those continental borders, as the country went on to gobble up the Philippines, Hawaii, the Panama Canal Zone, the Virgin Islands, Puerto Rico, Guam, American Samoa, and the Mariana Islands, the last five of which remain U.S. territories to this day. (Inhabitants of the nation’s capital, where I grew up, were only partly right when we used to refer to Washington, D.C., as “the last colony.”)

American Doctrines from Monroe to Truman to (G.W.) Bush

U.S. economic, military, and political influence has long extended far beyond those internationally recognized possessions and various presidents have enunciated a series of “doctrines” to legitimate such an imperial reach.

Monroe: The first of these was the Monroe Doctrine, introduced in 1823 in President James Monroe’s penultimate State of the Union address. He warned the nations of Europe that, while the United States recognized existing colonial possessions in the Americas, it would not permit the establishment of any new ones.

President Teddy Roosevelt would later add a corollary to Monroe’s doctrine by establishing Washington’s right to intercede in any country in the Americas that, in the view of its leaders, was not being properly run. “Chronic wrongdoing,” he said in a 1904 message to Congress, “may in America, as elsewhere, ultimately require intervention by some civilized nation.” The United States, he suggested, might find itself forced, “however reluctantly, in flagrant cases of such wrongdoing or impotence, to the exercise of an international police power.” In the first quarter of the twentieth century, that Roosevelt Corollary would be used to justify U.S. occupations of Cuba, the Dominican Republic, Haiti, and Nicaragua.

Truman: Teddy’s cousin, President Franklin D. Roosevelt, publicly renounced the Monroe Doctrine and promised a hands-off attitude towards Latin America, which came to be known as the Good Neighbor Policy. It didn’t last long, however. In a 1947 address to Congress, the next president, Harry S. Truman, laid out what came to be known as the Truman Doctrine, which would underlie the country’s foreign policy at least until the collapse of the Soviet Union in 1991. It held that U.S. national security interests required the “containment” of existing Communist states and the prevention of the further spread of Communism anywhere on Earth.

It almost immediately led to interventions in the internal struggles of Greece and Turkey and would eventually underpin Washington’s support for dictators and repressive regimes from El Salvador to Indonesia. It would justify U.S.-backed coups in places like Iran, Guatemala, and Chile. It would lead this country into a futile war in Korea and a disastrous defeat in Vietnam.

That post-World War II turn to anticommunism would be accompanied by a new kind of colonialism. Rather than directly annexing territories to extract cheap labor and cheaper natural resources, under this new “neocolonial” model, the United States — and soon the great multilateral institutions of the post-war era, the World Bank and the International Monetary Fund — would gain control over the economies of poor nations. In return for aid — or loans often pocketed by local elites and repaid by the poor — those nations would accede to demands for the “structural adjustment” of their economic systems: the privatization of public services like water and utilities (usually taken over by American or multinational corporations) and the defunding of human services like health and education. Such “adjustments,” in turn, allowed the recipients to service the loans, extracting scarce hard currency from already deeply impoverished nations.

Bush: You might have thought that the fall of the Soviet empire and the end of the Cold War would have provided Washington with an opportunity to step away from resource extraction and the seemingly endless military and CIA interventions that accompanied it. You might have imagined that the country then being referred to as the “last superpower” would finally consider establishing new and different relationships with the other countries on this little planet of ours. However, just in time to prevent even the faint possibility of any such conversion came the terrorist attacks of 9/11, which gave President George W. Bush the chance to promote his very own doctrine.

In a break from postwar multilateralism, the Bush Doctrine outlined the neoconservative belief that, as the only superpower in a now supposedly “unipolar” world, the United States had the right to take unilateral military action any time it believed it faced an external threat of any imaginable sort. The result: almost 20 years of disastrous “forever wars” and a military-industrial complex deeply embedded in our national economy. Although Donald Trump’s foreign policy occasionally feinted in the direction of isolationism in its rejection of international treaties, protocols, and organizational responsibilities, it still proved itself a direct descendant of the Bush Doctrine. After all, it was Bush who first took the United States out of the Anti-Ballistic Missile Treaty and rejected the Kyoto Protocol to fight climate change.

His doctrine instantly set the stage for the disastrous invasion and occupation of Afghanistan, the even more disastrous Iraq War, and the present-day over-expansion of the U.S. military presence, overt and covert, in practically every corner of the world. And now, to fulfill Donald Trump’s Star Trek fantasies, even in outer space.

An Empire in Decay

If you need proof that the last superpower, our very own empire, is indeed crumbling, consider the year we’ve just lived through, not to mention the first few weeks of 2021. I mentioned above some of the factors that contributed to the collapse of the famed Roman empire in the fifth century. It’s fair to say that some of those same things are now evident in twenty-first-century America. Here are four obvious candidates:

Grotesque Economic Inequality: Ever since President Ronald Reagan began the Republican Party’s long war on unions and working people, economic inequality has steadily increased in this country, punctuated by terrible shocks like the Great Recession of 2007-2008 and, of course, by the Covid-19 disaster. We’ve seen 40 years of tax reductions for the wealthy, stagnant wages for the rest of us (including a federal minimum wage that hasn’t changed since 2009), and attacks on programs like TANF (welfare) and SNAP (food stamps) that literally keep poor people alive.

The Romans relied on slave labor for basics like food and clothing. This country relies on super-exploited farm and food-factory workers, many of whom are unlikely to demand more or better because they came here without authorization. Our (extraordinarily cheap) clothes are mostly produced by exploited people in other countries.

The pandemic has only exposed what so many people already knew: that the lives of the millions of working poor in this country are growing ever more precarious and desperate. The gulf between rich and poor widens by the day to unprecedented levels. Indeed, as millions have descended into poverty since the pandemic began, the Guardian reports that this country’s 651 billionaires have increased their collective wealth by $1.1 trillion. That’s more than the $900 billion Congress appropriated for pandemic aid in the omnibus spending bill it passed at the end of December 2020.

An economy like ours, which depends so heavily on consumer spending, cannot survive the deep impoverishment of so many people. Those 651 billionaires are not going to buy enough toys to dig us out of this hole.

Wild Overspending on the Military: At the end of 2020, Congress overrode Trump’s veto of the annual National Defense Authorization Act, which provided a stunning $741 billion to the military this fiscal year. (That veto, by the way, wasn’t in response to the vast sums being appropriated in the midst of a devastating pandemic, but to the bill’s provisions for renaming military bases currently honoring Confederate generals, among other extraneous things.) A week later, Congress passed that omnibus pandemic spending bill and it contained an additional $696 billion for the Defense Department.

All that money for “security” might be justified, if it actually made our lives more secure. In fact, our federal priorities virtually take food out of the mouths of children to feed the maw of the military-industrial complex and the never-ending wars that go with it. Even before the pandemic, more than 10% of U.S. families regularly experienced food insecurity. Now, it’s a quarter of the population.

Corruption So Deep It Undermines the Political System: Suffice it to say that the man who came to Washington promising to “drain the swamp” has presided over one of the most corrupt administrations in U.S. history. Whether it’s been blatant self-dealing (like funneling government money to his own businesses); employing government resources to forward his reelection (including using the White House as a staging ground for parts of the Republican National Convention and his acceptance speech); tolerating corrupt subordinates like Secretary of Commerce Wilbur Ross; or contemplating a self-pardon, the Trump administration has set the bar high indeed for any future aspirants to the title of “most corrupt president.”

One problem with such corruption is that it undermines the legitimacy of government in the minds of the governed. It makes citizens less willing to obey laws, pay taxes, or act for the common good by, for example, wearing masks and socially distancing during a pandemic. It rips apart social cohesion from top to bottom.

Of course, Trump’s most dangerous corrupt behavior — one in which he’s been joined by the most prominent elected and appointed members of his government and much of his party — has been his campaign to reject the results of the 2020 general election. The concerted and cynical promotion of the big lie that the Democrats stole that election has so corrupted faith in the legitimacy of government that up to 68% of Republicans now believe the vote was rigged to elect Joe Biden. At “best,” Trump has set the stage for increased Republican suppression of the vote in communities of color. At worst, he has so poisoned the electoral process that a substantial minority of Americans will never again accept as free and fair an election in which their candidate loses.

A Country in Ever-Deepening Conflict: White supremacy has infected the entire history of this country, beginning with the near-extermination of its native peoples. The Constitution, while guaranteeing many rights to white men, proceeded to codify the enslavement of Africans and their descendants. In order to maintain that enslavement, the southern states seceded and fought a civil war. After a short-lived period of Reconstruction in which Black men were briefly enfranchised, white supremacy regained direct legal control in the South, and flourished in a de facto fashion in the rest of the country.

In 1858, three years before that civil war began, Abraham Lincoln addressed the Illinois Republican State Convention, reminding those present that

“‘A house divided against itself cannot stand.’ I believe this government cannot endure, permanently half slave and half free. I do not expect the Union to be dissolved — I do not expect the house to fall — but I do expect it will cease to be divided. It will become all one thing, or all the other.”

More than 160 years later, the United States clearly not only remains but has become ever more divided. If you doubt that the Civil War is still being fought today, look no farther than the Confederate battle flags proudly displayed by members of the insurrectionary mob that overran the Capitol on January 6th.

Oh, and the barbarians? They are not just at the gate; they have literally breached it, as we saw in Washington when they burst through the doors and windows of the center of government.

Building a Country From the Rubble of Empire

Human beings have long built new habitations quite literally from the rubble — the fallen stones and timbers — of earlier ones. Perhaps it’s time to think about what kind of a country this place — so rich in natural resources and human resourcefulness — might become if we were to take the stones and timbers of empire and construct a nation dedicated to the genuine security of all its people. Suppose we really chose, in the words of the preamble to the Constitution, “to promote the general welfare, and to secure the blessings of liberty to ourselves and our posterity.”

Suppose we found a way to convert the desperate hunger for ever more, which is both the fuel of empires and the engine of their eventual destruction, into a new contentment with “enough”? What would a United States whose people have enough look like? It would not be one in which tiny numbers of the staggeringly wealthy made hundreds of billions more dollars and the country’s military-industrial complex thrived in a pandemic, while so many others went down in disaster.

This empire will fall sooner or later. They all do. So, this crisis, just at the start of the Biden and Harris years, is a fine time to begin thinking about what might be built in its place. What would any of us like to see from our front windows next year?

Copyright 2021 Rebecca Gordon

Follow TomDispatch on Twitter and join us on Facebook. Check out the newest Dispatch Books, John Feffer’s new dystopian novel Frostlands (the second in the Splinterlands series), Beverly Gologorsky’s novel Every Body Has a Story, and Tom Engelhardt’s A Nation Unmade by War, as well as Alfred McCoy’s In the Shadows of the American Century: The Rise and Decline of U.S. Global Power and John Dower’s The Violent American Century: War and Terror Since World War II.

Via Tomdispatch.com

It’s almost 20 Years since 9/11; Can we finally End our Constant, Disastrous Wars? https://www.juancole.com/2020/12/finally-constant-disastrous.html Fri, 18 Dec 2020 05:01:51 +0000 https://www.juancole.com/?p=195031 ( Tomdispatch.com) – It was the end of October 2001. Two friends, Max Elbaum and Bob Wing, had just dropped by. (Yes, children, believe it or not, people used to drop in on each other, maskless, once upon a time.) They had come to hang out with my partner Jan Adams and me. Among other things, Max wanted to get some instructions from fellow-runner Jan about taping his foot to ease the pain of plantar fasciitis. But it soon became clear that he and Bob had a bigger agenda for the evening. They were eager to recruit us for a new project.

And so began War Times/Tiempo de Guerras, a free, bilingual, antiwar tabloid that, at its height, distributed 100,000 copies every six weeks to more than 700 antiwar organizations around the country. It was already clear to the four of us that night — as it was to millions around the world — that the terrorist attacks of September 11th would provide the pretext for a major new projection of U.S. military power globally, opening the way to a new era of “all-war-all-the-time.” War Times was a project of its moment (although the name would still be apt today, given that those wars have never ended). It would be superseded in a few years by the explosive growth of the Internet and the 24-hour news cycle. Still, it represented an early effort to fill the space where a peace movement would eventually develop.

All-War-All-the-Time — For Some of Us

We were certainly right that the United States had entered a period of all-war-all-the-time. It’s probably hard for people born since 9/11 to imagine how much — and how little — things changed after September 2001. By the end of that month, this country had already launched a “war” on an enemy that then-Secretary of Defense Donald Rumsfeld told us was “not just in Afghanistan,” but in “50 or 60 countries, and it simply has to be liquidated.”

Five years and two never-ending wars later, he characterized what was then called the war on terror as “a generational conflict akin to the Cold War, the kind of struggle that might last decades as allies work to root out terrorists across the globe and battle extremists who want to rule the world.” A generation later, it looks like Rumsfeld was right, if not about the desires of the global enemy, then about the duration of the struggle.

Here in the United States, however, we quickly got used to being “at war.” In the first few months, even interstate bus and train travelers often encountered a new and absurd kind of “security theater,” one that airline passengers still endure. I’m referring to those long, snaking lines in which people first learned to remove their belts and coats, later their hats and shoes, as ever newer articles of clothing were recognized as potential hiding places for explosives. Fortunately, the arrest of the Underwear Bomber never led the Transportation Security Administration to the obvious conclusion about the next article of clothing travelers should have to remove. We got used to putting our three-ounce containers of liquids (No more!) into quart-sized baggies (No bigger! No smaller!).

It was all-war-all-the-time, but mainly in those airports. Once the shooting wars started dragging on, if you didn’t travel by airplane much or weren’t deployed to Afghanistan or Iraq, it was hard to remember that we were still in war time at all. There were continuing clues for those who wanted to know, like the revelations of CIA torture practices at “black sites” around the world, the horrors of military prisons like the ones at Bagram Air Force Base in Afghanistan, Abu Ghraib in Baghdad, and the still-functioning prison complex at Guantánamo Bay, Cuba. And soon enough, of course, there were the hundreds and then thousands of veterans of the Iraq and Afghan wars taking their places among the unhoused veterans of earlier wars in cities across the United States, almost unremarked upon, except by service organizations.

So, yes, the wars dragged on at great expense, but with little apparent effect in this country. They even gained new names like “the long war” (as Donald Trump’s Secretary of Defense James Mattis put it in 2017) or the “forever wars,” a phrase now so common that it appears all over the place. But apart from devouring at least $6.4 trillion through September 2020 that might otherwise have been invested domestically in healthcare, education, infrastructure, or addressing poverty and inequality, and apart from creating increasingly militarized domestic police forces armed ever more lethally by the Pentagon, those forever wars had little obvious effect on the lives of most Americans.

Of course, if you happened to live in one of the places where this country has been fighting for the last 19 years, things are a little different. A conservative estimate by Iraq Body Count puts violent deaths among civilians in that country alone at 185,454 to 208,493 and Brown University’s Costs of War project points out that even the larger figure is bound to be a significant undercount:

“Several times as many Iraqi civilians may have died as an indirect result of the war, due to damage to the systems that provide food, health care, and clean drinking water, and as a result, illness, infectious diseases, and malnutrition that could otherwise have been avoided or treated.”

And that’s just Iraq. Again, according to the Costs of War Project, “At least 800,000 people have been killed by direct war violence in Iraq, Afghanistan, Syria, Yemen, and Pakistan.”

Of course, many more people than that have been injured or disabled. And America’s post-9/11 wars have driven an estimated 37 million people from their homes, creating the greatest human displacement since World War II. People in this country are rightly concerned about the negative effects of online schooling on American children amid the ongoing Covid-19 crisis (especially poor children and those in communities of color). Imagine, then, the effects on a child’s education of losing her home and her country, as well as one or both parents, and then growing up constantly on the move or in an overcrowded, under-resourced refugee camp. The war on terror has truly become a war of generations.

Every one of the 2,977 lives lost on 9/11 was unique and invaluable. But the U.S. response has been grotesquely disproportionate — and worse than we War Times founders could have imagined that October night so many years ago.

Those wars of ours have gone on for almost two decades now. Each new metastasis has been justified by George W. Bush’s and then Barack Obama’s use of the now ancient 2001 Authorization for the Use of Military Force (AUMF), which Congress passed in the days after 9/11. Its language actually limited presidential military action to a direct response to the 9/11 attacks and the prevention of future attacks by the same actors. It stated that the president

“…is authorized to use all necessary and appropriate force against those nations, organizations, or persons he determines planned, authorized, committed, or aided the terrorist attacks that occurred on September 11, 2001, or harbored such organizations or persons, in order to prevent any future acts of international terrorism against the United States by such nations, organizations or persons.”

Despite that AUMF’s limited scope, successive presidents have used it to justify military action in at least 18 countries. (To be fair, President Obama realized the absurdity of his situation when he sent U.S. troops to Syria and tried to wring a new authorization out of Congress, only to be stymied by a Republican majority that wouldn’t play along.)

In 2002, in the run-up to the Iraq War, Congress passed a second AUMF, which permitted the president to use the armed forces as “necessary and appropriate” to “defend U.S. national security against the continuing threat posed by Iraq.” In January 2020, Donald Trump used that second authorization to justify the murder by drone of Qasem Soleimani, an Iranian general, along with nine other people.

Trump Steps In

In 2016, peace activists were preparing to confront a Hillary Clinton administration that we expected would continue Obama’s version of the forever wars — the “surge” in Afghanistan, the drone assassination campaigns, the special ops in Africa. But on Tuesday, November 8, 2016, something went “Trump” in the night and Donald J. Trump took over the presidency with a promise to end this country’s forever wars, which he had criticized relentlessly during his campaign. That, of course, didn’t mean we should have expected a peace dividend anytime soon. He was also committed to rebuilding a supposedly “depleted” U.S. military. As he said at a 2019 press conference,

“When I took over, it was a mess… One of our generals came in to see me and he said, ‘Sir, we don’t have ammunition.’ I said, ‘That’s a terrible thing you just said.’ He said, ‘We don’t have ammunition.’ Now we have more ammunition than we’ve ever had.”

It’s highly unlikely that the military couldn’t afford to buy enough bullets when Trump entered the Oval Office, given that publicly acknowledged defense funding was then running at $580 billion a year. He did, however, manage to push that figure to $713 billion by fiscal year 2020. That December, he threatened to veto an even larger appropriation for 2021 — $740 billion — but only because he wanted the military to continue to honor Confederate generals by keeping their names on military bases. Oh, and because he thought the bill should also change liability rules for social media companies, an issue you don’t normally expect to see addressed in a defense appropriations bill. And, in any case, Congress passed the bill with a veto-proof majority.

As Pentagon expert Michael Klare pointed out recently, while it might seem contradictory that Trump would both want to end the forever wars and to increase military spending, his actions actually made a certain sense. The president, suggested Klare, had been persuaded to support the part of the U.S. military command that has favored a sharp pivot away from reigning post-9/11 Pentagon practices. For 19 years, the military high command had hewed fairly closely to the strategy laid out by Secretary of Defense Donald Rumsfeld early in the Bush years: maintaining the capacity to fight ground wars against one or two regional powers (think of that “Axis of Evil” of Iraq, North Korea, and Iran), while deploying agile, technologically advanced forces in low-intensity (and a couple of higher-intensity) counterterrorism conflicts. Nineteen years later, whatever its objectives may have been — a more-stable Middle East? Fewer and weaker terrorist organizations? — it’s clear that the Rumsfeld-Bush strategy has failed spectacularly.

Klare points out that, after almost two decades without a victory, the Pentagon has largely decided to demote international terrorism from rampaging monster to annoying mosquito cloud. Instead, the U.S. must now prepare to confront the rise of China and Russia, even if China has only one overseas military base and Russia, economically speaking, is a rickety petro-state with imperial aspirations. In other words, the U.S. must prepare to fight short but devastating wars in multiple domains (including space and cyberspace), perhaps even involving the use of tactical nuclear weapons on the Eurasian continent. To this end, the country has indeed begun a major renovation of its nuclear arsenal and announced a new 30-year plan to beef up its naval capacity. And President Trump rarely misses a chance to tout “his” creation of a new Space Force.

Meanwhile, did he actually keep his promise and at least end those forever wars? Not really. He did promise to bring all U.S. troops home from Afghanistan by Christmas, but acting Defense Secretary Christopher Miller only recently said that we’d be leaving about 2,500 troops there and a similar number in Iraq, with the hope that they’d all be out by May 2021. (In other words, he dumped those wars in the lap of the future Biden administration.)

In the meantime, in these years of “ending” those wars, the Trump administration actually loosened the rules of engagement for air strikes in Afghanistan, leading to a “massive increase in civilian casualties,” according to a new report from the Costs of War Project. “From the last year of the Obama administration to the last full year of recorded data during the Trump administration,” writes its author, Neta Crawford, “the number of civilians killed by U.S.-led airstrikes in Afghanistan increased by 330 percent.”

In spite of his isolationist “America First” rhetoric, in other words, President Trump has presided over an enormous buildup of an institution, the military-industrial complex, that was hardly in need of major new investment. And in spite of his anti-NATO rhetoric, his reduction by almost a third of U.S. troop strength in Germany, and all the rest, he never really violated the post-World War II foreign policy pact between the Republican and Democratic parties. Regardless of how they might disagree about dividing the wealth domestically, they remain united in their commitment to using diplomacy when possible, but military force when necessary, to maintain and expand the imperial power that they believe to be the guarantor of that wealth.

And Now Comes Joe

On January 20, 2021, Joe Biden will become the president of a country that spends as much on its armed forces, by some counts, as the next 10 countries combined. He’ll inherit responsibility for a nation with a military presence in 150 countries and special-operations deployments in 22 African nations alone. He’ll be left to oversee the still-unfinished, deeply unsuccessful, never-ending war on terror in Iraq, Syria, Afghanistan, Yemen, and Somalia and, as publicly reported by the Department of Defense, 187,000 troops stationed outside the United States.

Nothing in Joe Biden’s history suggests that he or any of the people he’s already appointed to his national security team have the slightest inclination to destabilize that Democratic-Republican imperial pact. But empires are not sustained by inclination alone. They don’t last forever. They overextend themselves. They rot from within.

If you’re old enough, you may remember stories about the long lines for food in the crumbling Soviet Union, that other superpower of the Cold War. You can see the same thing in the United States today. Once a week, my partner delivers food boxes to hungry people in our city, those who have lost their jobs and homes, because the pandemic has only exacerbated this country’s already brutal version of economic inequality. Another friend routinely sees a food line stretching over a mile, as people wait hours for a single free bag of groceries.

Perhaps the horrors of 2020 — the fires and hurricanes, Trump’s vicious attacks on democracy, the death, sickness, and economic dislocation caused by Covid-19 — can force a real conversation about national security in 2021. Maybe this time we can finally ask whether trying to prop up a dying empire actually makes us — or indeed the world — any safer. This is the best chance in a generation to start that conversation. The alternative is to keep trudging mindlessly toward disaster.

Rebecca Gordon, a TomDispatch regular, teaches at the University of San Francisco. She is the author of American Nuremberg: The U.S. Officials Who Should Stand Trial for Post-9/11 War Crimes and is now at work on a new Dispatch book on the history of torture in the United States.


Copyright 2020 Rebecca Gordon

Via Tomdispatch.com

In a Looking-Glass World, Our Work is Just beginning https://www.juancole.com/2020/11/looking-glass-beginning.html Fri, 06 Nov 2020 05:01:55 +0000 https://www.juancole.com/?p=194263 By Rebecca Gordon | –

( Tomdispatch.com) – In the chaos of this moment, it seems likely that Joe Biden will just squeeze into the presidency and that he’ll certainly win the popular vote, Donald Trump’s Mussolini-like behavior and election night false claim of victory notwithstanding. Somehow, it all brings another moment in my life to mind.

Back in October 2016, my friends and I frequently discussed the challenges progressives would face if the candidate we expected to win actually entered the Oval Office. There were so many issues to worry about back then. The Democratic candidate was an enthusiastic booster of the U.S. armed forces and believed in projecting American power through its military presence around the world. Then there was that long record of promoting harsh sentencing laws and the disturbing talk about “the kinds of kids that are called superpredators — no conscience, no empathy.”

In 2016, the country was already riven by deep economic inequality. While Hillary Clinton promised “good-paying jobs” for those struggling to stay housed and buy food, we didn’t believe it. We’d heard the same promises so many times before, and yet the federal minimum wage was still stuck where it had been ever since 2009, at $7.25 an hour. Would a Clinton presidency really make a difference for working people? Not if we didn’t push her — and hard.

The candidate we were worried about was never Donald Trump, but Hillary Clinton. And the challenge we expected to confront was how to shove that quintessential centrist a few notches to the left. We were strategizing on how we might organize to get a new administration to shift government spending from foreign wars to human needs at home and around the world. We wondered how people in this country might finally secure the “peace dividend” that had been promised to us in the period just after the Cold War, back when her husband Bill became president. In those first (and, as it turned out, only) Clinton years, what we got instead was so-called welfare reform whose consequences are still being felt today, as layoffs drive millions into poverty.

We doubted Hillary Clinton’s commitment to addressing most of our other concerns as well: mass incarceration and police violence, structural racism, economic inequality, and most urgent of all (though some of us were just beginning to realize it), the climate emergency. In fact, nationwide, people like us were preparing to spend a day or two celebrating the election of the first woman president and then get down to work opposing many of her anticipated policies. In the peace and justice movements, in organized labor, in community-based organizations, in the two-year-old Black Lives Matter movement, people were ready to roll.

And then the unthinkable happened. The woman we might have loved to hate lost that election and the white-supremacist, woman-hating monster we would grow to detest entered the Oval Office.

For the last four years, progressives have been fighting largely to hold onto what we managed to gain during Barack Obama’s presidency: an imperfect healthcare plan that nonetheless insured millions of Americans for the first time; a signature on the Paris climate accord and another on a six-nation agreement to prevent Iran from pursuing nuclear weapons; expanded environmental protections for public lands; the opportunity for recipients of Deferred Action for Childhood Arrivals — DACA — status to keep on working and studying in the U.S.

For those same four years, we’ve been fighting to hold onto our battered capacity for outrage in the face of continual attacks on simple decency and human dignity. There’s no need to recite here the catalogue of horrors Donald Trump and his spineless Republican lackeys visited on this country and the world. Suffice it to say that we’ve been living like Alice in Through the Looking Glass, running as hard as we can just to stand still. That fantasy world’s Red Queen observes to a panting Alice that she must come from

“A slow sort of country! Now, here, you see, it takes all the running you can do to keep in the same place. If you want to get somewhere else, you must run at least twice as fast as that!”

It wasn’t simply the need to run faster than full speed just in order to stay put that made Trump World so much like Looking-Glass Land. It’s that, just as in Lewis Carroll’s fictional world, reality has been turned inside out in the United States. As new Covid-19 infections reached an all-time high of more than 100,000 in a single day and the cumulative death toll surpassed 230,000, the president in the mirror kept insisting that “we’re rounding the corner” (and a surprising number of Americans seemed to believe him). He neglected to mention that, around that very corner, a coronaviral bus is heading straight toward us, accelerating as it comes. In a year when, as NPR reported, “Nearly 1 in 4 households have experienced food insecurity,” Trump just kept bragging about the stock market and reminding Americans of how well their 401k’s were doing — as if most people even had such retirement accounts in the first place.

Trump World, Biden Nation, or Something Better?

After four years of running in place, November 2016 seems like a lifetime ago. The United States of 2020 is a very different place, at once more devastated and more hopeful than at least we were a mere four years ago. On the one hand, pandemic unemployment has hit women, especially women of color, much harder than men, driving millions out of the workforce, many permanently. On the other, we’ve witnessed the birth of the #MeToo movement against sexual harassment and of the Time’s Up Legal Defense Fund, which has provided millions of dollars for working-class women to fight harassment on the job. In a few brief years, physical and psychological attacks on women have ceased to be an accepted norm in the workplace. Harassment certainly continues every day, but the country’s collective view of it has shifted.

Black and Latino communities still face daily confrontations with police forces that act more like occupying armies than public servants. The role of the police as enforcers of white supremacy hasn’t changed in most parts of the country. Nonetheless, the efforts of the Black Lives Matter movement and of the hundreds of thousands of people who demonstrated this summer in cities nationwide have changed the conversation about the police in ways no one anticipated four years ago. Suddenly, the mainstream media are talking about more than body cams and sensitivity training. In June 2020, the New York Times ran an op-ed entitled “Yes, We Mean Literally Abolish the Police,” by Mariame Kaba, an organizer working against the criminalization of people of color. Such a thing was unthinkable four years ago.

In the Trumpian pandemic moment, gun purchases have soared in a country that already topped the world by far in armed citizens. And yet young people — often led by young women — have roused themselves to passionate and organized action to get guns off the streets of Trump Land. After a gunman shot up Emma Gonzalez’s school in Parkland, Florida, she famously announced, “We call BS” on the claims of adults who insisted that changing the gun laws was unnecessary and impossible. She led the March for Our Lives, which brought millions onto the streets in this country to denounce politicians’ inaction on gun violence.

While Donald Trump took the U.S. out of the Paris climate agreement, Greta Thunberg, the 17-year-old Swedish environmental activist, crossed the Atlantic in a carbon-neutral sailing vessel to address the United Nations, demanding of the adult world “How dare you” leave it to your children to save an increasingly warming planet:

“You have stolen my dreams and my childhood with your empty words. And yet I’m one of the lucky ones. People are suffering. People are dying. Entire ecosystems are collapsing. We are in the beginning of a mass extinction, and all you can talk about is money and fairy tales of eternal economic growth. How dare you!”

“How dare you?” is a question I ask myself every time, as a teacher, I face a classroom of college students who, each semester, seem both more anxious about the future and more determined to make it better than the present.

Public attention is a strange beast. Communities of color have known for endless years that the police can kill them with impunity, and it’s not as if people haven’t been saying so for decades. But when such incidents made it into the largely white mainstream media, they were routinely treated as isolated events — the actions of a few bad apples — and never as evidence of a systemic problem. Suddenly, in May 2020, with the release of a hideous video of George Floyd’s eight-minute murder in Minneapolis, Minnesota, systematic police violence against Blacks became a legitimate topic of mainstream discussion.

The young have been at the forefront of the response to Floyd’s murder and the demands for systemic change that have followed. This June in my city of San Francisco, where police have killed at least five unarmed people of color in the last few years, high school students planned and led tens of thousands of protesters in a peaceful march against police violence.

Now that the election season has reached its drawn-out crescendo, there is so much work ahead of us. With the pandemic spreading out of control, it’s time to begin demanding concerted federal action, even from this most malevolent president in history. There’s no waiting for Inauguration Day, no matter who takes the oath of office on January 20th. Many thousands more will die before then.

And isn’t it time to turn our attention to the millions who have lost their jobs and face the possibility of losing their housing, too, as emergency anti-eviction decrees expire? Isn’t it time for a genuine congressional response to hunger, not by shoring up emergency food distribution systems like food pantries, but by putting dollars in the hands of desperate Americans so they can buy their own food? Congress must also act on the housing emergency. The Centers for Disease Control and Prevention’s “Temporary Halt in Residential Evictions To Prevent the Further Spread of Covid-19” only lasts until December 31st and it doesn’t cover tenants who don’t have a lease or written rental agreement. It’s crucial, even with Donald Trump still in the White House as the year begins, that it be extended in both time and scope. And now Senate Republican leader Mitch McConnell has said that he won’t even entertain a new stimulus bill until January.

Another crucial subject that needs attention is pushing Congress to increase federal funding to state and local governments, which so often are major economic drivers for their regions. The Trump administration and McConnell not only abandoned states and cities, leaving them to confront the pandemic on their own just as a deep recession drastically reduced tax revenues, but — in true looking-glass fashion — treated their genuine and desperate calls for help as mere Democratic Party campaign rhetoric.

“In Short, There Is Still Much to Do”

My favorite scene in Gillo Pontecorvo’s classic 1966 film The Battle of Algiers takes place at night on a rooftop in the Arab quarter of that city. Ali La Pointe, a passionate recruit to the cause of the National Liberation Front (FLN), which is fighting to throw the French colonizers out of Algeria, is speaking with Ben M’Hidi, a high-ranking FLN official. Ali is unhappy that the movement has called a general strike in order to demonstrate its power and reach to the United Nations. He resents the seven-day restriction on the use of firearms. “Acts of violence don’t win wars,” Ben M’Hidi tells Ali. “Finally, the people themselves must act.”

For the last four years, Donald Trump has made war on the people of this country and indeed on the people of the entire world. He’s attacked so many of us, from immigrant children at the U.S. border to anyone who tries to breathe in the fire-choked states of California, Oregon, Washington, and most recently Colorado. He’s allowed those 230,000 Americans to die in a pandemic that could have been controlled and thrown millions into poverty, to mention just a few of his “war” crimes. Finally, the people themselves must act.

On that darkened rooftop in an eerie silence, Ben M’Hidi continues his conversation with La Pointe. “You know, Ali,” he says. “It’s hard enough to start a revolution, even harder to sustain it, and hardest of all to win it.” He pauses, then continues, “But it’s only afterwards, once we’ve won, that the real difficulties begin. In short, there is still much to do.”

It’s hard enough to vote out a looking-glass president. But it’s only once we’ve won, whether that’s now or four years from now, that the real work begins. There is, indeed, still much to do.

Rebecca Gordon, a TomDispatch regular, teaches at the University of San Francisco. She is the author of American Nuremberg: The U.S. Officials Who Should Stand Trial for Post-9/11 War Crimes and is now at work on a new Dispatch book on the history of torture in the United States.


Copyright 2020 Rebecca Gordon

Via Tomdispatch.com
