Reclaiming Patriotism for the Left


Surrendering national pride to champions of a blood-and-soil vision abdicates the fight for the soul and meaning of the American project.

The resurgence of blood-and-soil nationalism around the world seems to prove that appeals to nationhood are too racist, too tribal and too dangerous to be of value. Yet surrendering patriotism to champions of the ethno-state abdicates the fight for the soul and meaning of the American project.

The American left, from the center of the Democratic Party to its insurgent challengers, needs a dose of national vision. One of the core lessons of Trumpian politics is that Americans are starved for a meaningful politics of what it means to be American. Getting rid of the vainglorious Trump administration is only a partial solution. The causes of his rise remain.

Call what is needed a reinvigoration of “civic nationalism” or “civic republicanism” (a reference to the ancient political ideal, not the party). This is a revival of the “bond of common faith,” the “bond of common goal,” as Robert Kennedy once put it, which needs constructive outlets if what is left of American democracy is to survive.

In recent decades, progressive forces in the United States have split between two positions, both of which surrender a robust and hopeful sense of national citizenship. On one track can be found a cosmopolitan economic elite that embraces a multicultural world order shaped largely by the politics of corporate globalization. On the other track are radical critics of the racism and imperialism of the American state who often support local community and transnational solidarity but maintain a deep cynicism, even despair, about the American project. Both groups have abdicated the national story to their shared political enemies. What remains is a fervent hybrid of nationalism and anti-statism, an echo of the rebel yell.

The American past, according to the historian Gary Gerstle in his book “American Crucible,” can be understood as a struggle between “two powerful and contradictory ideals” — a civic national vision and a racialized one. Yet the dissolution of a progressive civic dimension has left us with an unchallenged ethno-racial nationalism.

Globalization has further complicated the problem. In a dizzying world of oppressive economic and political inequality, global trade, immigration and technological disruption, voters seek grounding not in technocratic detail but in place, in time, in tradition and, above all, in the shared fate, history and meaning of the nation.

The unhealed wounds of the 2008 financial crisis may have laid the way for Donald Trump, but the full mosaic of the American working class has long been looking desperately for routes to make America great again. As globalization expanded, its members pounded foreign cars with sledgehammers, sponsored protective tariffs, promoted “Buy American” campaigns, tried to defeat Nafta, tried to organize unions and fought against undocumented migrant labor. But the plants closed anyway, domestic and foreign capital moved around, mass migrations happened, attacks on worker protections proceeded at a relentless pace, and the increasingly complicated world of national politics seemed more focused on Davos than Peoria.

Before the 1960s, dissenting and progressive movements regularly invoked nationalist and patriotic themes. The 19th-century Knights of Labor — one of the more inclusive labor organizations in American history — couldn’t get enough of the Fourth of July and the Declaration of Independence. Teddy Roosevelt advocated his “New Nationalism” as a counterbalance to the seemingly unchecked power of the robber barons. The socialist leader Eugene V. Debs drew on American traditions to frame his radical critiques of corporate power. The labor upheavals of the 1930s openly expressed faith in a “working-class Americanism.” Even the American Communist Party cloaked itself in “Americanism” and the words and visage of Abraham Lincoln. In his efforts to reconfigure class power, Franklin Roosevelt did not attempt to speak for workers or the poor but simply said that taxing the rich was “the American thing to do.”

In the midst of the Cold War, when Paul Robeson was questioned by the House Committee on Un-American Activities about his association with the African-American radical Ben Davis, he replied, “I say that he is as patriotic an American as there can be, and you gentlemen belong with the Alien and Sedition Acts, and you are the nonpatriots, and you are the un-Americans, and you ought to be ashamed of yourselves.”

Reviving this older stream of dissent rests on reclaiming the active interests and lost authority of the nation’s citizens and its fading democratic values. This would replace “my country right or wrong” with the centuries-long struggle, as the Rev. Dr. Martin Luther King Jr. put it, “to be true to what you said on paper.” This is the position from which voting rights, civil rights, immigrant rights and economic rights can be fought for: with a vision of what is acceptably American and what is not. Decent people will rise to the challenge.

The nation is the only “imagined community,” as Benedict Anderson put it, where everything from mass transit to health care to wealth distribution to a green economy can find traction. A rejuvenated national vision would transcend the backward-looking — and often reactionary — search for an America in arrested decay that has too often informed politics since Ronald Reagan first promised to make America great again.

Civic patriotism must also be an aspirational story of struggle and inclusion. The narcissistic and racist politics of right-wing nationalism must be challenged with an expansive and inclusive civic vision about hope and potential. It’s what Barack Obama spoke of at the 50th anniversary of the Selma march. Standing before the Edmund Pettus Bridge, he asked, “What greater form of patriotism is there than the belief that America is not yet finished, that we are strong enough to be self-critical, that each successive generation can look upon our imperfections and decide that it is in our power to remake this nation to more closely align with our highest ideals?”

To be sure, the rhetoric of nationalism can be dangerous in a place with a history of settler colonialism, slavery, anti-immigrant hysteria and territorial expansion. Any civic framing risks fomenting exclusion by drawing lines between those who are in and those who are out — an especially profound problem in an era of mass migration. Yet when the American left abandons any vision of social patriotism because of the racist ugliness it has come to symbolize, it concedes the American story to the voices of exclusion and avarice.

The pragmatist philosopher Richard Rorty made many of these arguments 20 years ago in a book, “Achieving Our Country.” That book became famous after the 2016 election for having predicted the rise of a “strongman” to fill the void in national politics. He feared that indulging in cultural politics rather than emphasizing the material interests of American working people, and surrendering the struggle to shape the national vision where that can happen, would lead to such a catastrophe. While his nightmare of the nationalist demagogue has come to pass, few people are talking about the foundation of his predictions.

Patriotism may well be the last refuge of the scoundrel, but as a pragmatist like Mr. Rorty would tell you, it is too powerful and too important to leave to the scoundrels. Voters are in search of a place of vision for average Americans, a place of idealism in an age of cynicism, a place of unity in a time of fracture and a place where policy can be embedded in something greater than technocracy.

While commentators are getting worked up over the revival of “socialism,” an increasing number of insurgent blue-collar Democrats across the country are looking to recapture a sense of nation. The dark-horse candidate from Kansas, the Army veteran James Thompson, for instance, promises to “Fight for America.”

As we approach midterm elections, we urgently need to hear these messages in good faith and rise to their challenge.

Jefferson Cowie is a professor of history at Vanderbilt University and the author, most recently, of “The Great Exception: The New Deal and the Limits of American Politics.”

We Can Imagine And Create A World Without Military Enemies And Wars

By Robert C. Koehler, Common Wonders, June 3, 2018

America does what it wants.

This is obvious, except it’s also monstrously unnerving. Let’s at least add some quote marks: “America” does what it wants — this secretly defined, self-obsessed, unelected entity that purports to be the United States of America, all 325 million of us, but is, in fact, a narrowly focused amalgam of generals, politicians and corporate elites who value only one thing: global dominance, from now to eternity.

Indeed, they’re capable of imagining nothing else, which is the truly scary part. Until this changes, “peace” is a feel-good delusion and “disarmament” (nuclear and otherwise) is the butt of a joke. The American empire may be collapsing, but the war games continue.

So I realized with a sudden start as I read Nick Turse’s analysis of a collection of U.S. military documents, which the TomDispatch website got hold of via the Freedom of Information Act. The documents contained a detailed description of the 33rd annual Joint Land, Air, and Sea Strategic Special Program, “an elaborate war game,” Turse explains, “carried out in 2016 by students and faculty from the U.S. military’s war colleges, the training grounds for its future generals and admirals.”

The war game was wrapped around a fantasy future of “dystopian dangers,” set in 2020, in which, “as the script for the war game put it, ‘lingering jealousy and distrust of American power and national interests have made it politically and culturally difficult for the United States to act unilaterally.’”

In other words, as Turse explains, quoting the war game’s summary, the threat to America’s near-future security is completely a matter of maintaining its global hegemony in the face of scientific and military advances “by both state and non-state actors” that “have increasingly constricted U.S. freedom of action.”

There’s nothing particularly surprising here, yet something jolted me into a new level of shock and awe, you might say, about the deep state apparatus that controls the national direction. There’s nothing in this controlling consciousness devoted to creating — or imagining — a world without nuclear weapons or a world free of war and poverty. That’s just not part of the future “America” has any interest in envisioning. The next war is utterly unquestioned. “Us vs. them” is utterly unquestioned. There will always be enemies. What would we do without them?

While the invisible state may fear losing its global dominance, it seems to be completely in control of its domestic dominance.

And peace is out of the picture, at least the evolving concept of that word: peace that transcends militarism and is not based on armed enforcement. As long as the generals and war profiteers have it their way, peace is merely the lull between wars or, even more cynically, that brief pause while the combatants reload.

In other words, we are moving into the future committed — financially, politically, ideologically — to continuing to do what has failed in the past: wage war, dominate, win.

And in countless ways, we are losing. The empire is collapsing. The consequences of armed dominance are eating us alive and destroying Planet Earth. But no matter. As long as we’re not aware that we’re causing our own destruction, we (I mean “we”) can continue to do so, in the process reaping not merely profit but a sense of purpose.

“Two years after the war game was conducted,” Turse writes, “in a time of almost metronomic domestic mass killings, President Trump continues to spotlight the supposedly singular danger posed by ‘inadequately vetted people’ in the U.S., although stovetops and ovens, hot air balloons, and burning pajamas are far more deadly to Americans.”

Apparently our national sense of identity would collapse without an enemy, and the enemy du jour is the terrorist (no longer the communist, no longer the “savage”). So we’re not only fighting endless wars across Africa and the Middle East, we’re intensifying our deportation of “illegals” and amping up “border security,” all in order to keep America safe.

One reason the nation’s leaders are able to keep waging wars that do not, in fact, keep America safe is that the harm they cause is almost totally borne by the other. And as long as the messy details are seldom in the news, Americans need not let their awareness of government policy stray beyond the clichés of patriotism.

Regarding Mexican border security, for instance, the Trump administration has implemented a policy of separating children from parents seeking asylum in order to send a Keep Out message to other would-be immigrants. The cruelty of such a policy has been magnified by a recent ACLU report, based on documents obtained through the Freedom of Information Act, showing widespread abuse of children in U.S. custody.

And in point of fact, the abuses are pre-Trump. The documents cover 2009-2014, during the Obama years. Dominance and racism may be more blatant under the current president than they’ve been for a while, but they have always been part of national policy.

According to the report: “The documents show numerous cases involving federal officials’ verbal, physical and sexual abuse of migrant children; the denial of clean drinking water and adequate food; failure to provide necessary medical care; detention in freezing, unsanitary facilities; and other violations of federal law and policy and international law. The documents provide evidence that U.S. officials were aware of these abuses as they occurred, but failed to properly investigate, much less to remedy, these abuses.”

Abuses listed in the report include children kicked in the ribs, punched in the head, shot with a stun gun (causing a boy “to fall to the ground, shaking, with his eyes rolling back in his head”) and run over with a patrol vehicle. A pregnant minor “was denied medical attention when she reported pain, which preceded a stillbirth.” A 16-year-old girl was subjected to a search in which they “forcefully spread her legs and touched her private parts so hard that she screamed.”

My God, it was like they placed these children in Gitmo!

These are the war games we play that aren’t games, but real-world actions. From cruelty at the border to nuclear testing, a philosophy of dominance over the enemy creates nothing but a poisoned planet and endless war. Paradoxically, a primary qualification for being a national leader is not knowing this.

‘We Are Climbing Rapidly Out of Humankind’s Safe Zone’: New Report Warns Dire Climate Warnings Not Dire Enough

By Jon Queally, staff writer, Common Dreams, August 20, 2018

“Climate change is now reaching the end-game, where very soon humanity must choose between taking unprecedented action, or accepting that it has been left too late and bear the consequences.”

But this doesn’t mean: “be scared.” This means: “Act. Demand action. With everything we got.”


Offering a stark warning to the world, a new report out Monday argues that the tendency of the world’s scientific community—trapped in otherwise healthy habits of caution and due diligence—to downplay the potentially irreversible and cataclysmic impacts of climate change is itself a threat that should no longer be tolerated if humanity is to be motivated to make the rapid and far-reaching transition away from fossil fuels and other emissions-generating industries.

In the new report—titled What Lies Beneath: The Understatement of Existential Climate Risk (pdf)—authors David Spratt and Ian Dunlop, researchers with the National Centre for Climate Restoration (Breakthrough), an independent think tank based in Australia, argue that the existential threats posed by the climate crisis have still not penetrated the collective psyche of humanity and that world leaders, even those demanding aggressive action, have not shown the kind of urgency or imagination that the scale of the pending catastrophe presents.

While the report states that “a fast, emergency-scale transition to a post-fossil fuel world is absolutely necessary to address climate change,” it bemoans the fact that this solution continues to be excluded from the global policy debate because it is considered by the powerful as “too disruptive.” However, the paper argues, it is precisely this lack of imagination and political will that could doom humanity’s future.

As Spratt and Dunlop summarize at Renew Economy, their paper analyzes why:

  • Human-induced climate change is an existential risk to human civilisation: an adverse outcome that will either annihilate intelligent life or permanently and drastically curtail its potential, unless dramatic action is taken.
  • The bulk of climate research has tended to underplay these risks, and exhibited a preference for conservative projections and scholarly reticence.
  • IPCC reports tend toward reticence and caution, erring on the side of “least drama,” and downplaying the more extreme and more damaging outcomes, and are now becoming dangerously misleading with the acceleration of climate impacts globally.
  • This is a particular concern with potential climatic “tipping points,” the passing of critical thresholds which result in step changes in the climate system. Under-reporting on these issues is contributing to the “failure of imagination” in our understanding of, and response to, climate change.

“Climate change is now reaching the end-game,” reads the forward to the report by Hans Joachim Schellnhuber, head of the Potsdam Institute for Climate Impact Research, “where very soon humanity must choose between taking unprecedented action, or accepting that it has been left too late and bear the consequences.”

“It is no longer possible to follow a gradual transition path to restore a safe climate,” write Spratt and Dunlop in an op-ed published in the Guardian on Monday. “We have left it too late; emergency action, akin to a war footing, will eventually be accepted as inevitable. The longer that takes, the greater the damage inflicted upon humanity.”

At the center of their argument, the pair explain, is that while the global scientific community—including the vital work of the UN-sponsored Intergovernmental Panel on Climate Change (IPCC)—has been at the forefront of warning humanity about the processes and dangers of human-caused global warming, there has been simply too much “reticence and caution” that has led researchers to downplay the most “extreme and damaging outcomes” that lurk beneath their publicly stated findings and pronouncements. 

While this has been understandable historically, given the pressure exerted upon the IPCC by political and vested interests, it is now becoming dangerously misleading with the acceleration of climate impacts globally. What were lower probability, higher-impact events are now becoming more likely.

This is a particular concern with potential climatic tipping points – passing critical thresholds which result in step changes in the climate system – such as melting polar ice sheets (and hence increasing sea levels), permafrost and other carbon stores, where the impacts of global warming are nonlinear and difficult to model with current scientific knowledge.

The extreme risks which these tipping points represent justify strong precautionary risk management. Under-reporting on these issues is irresponsible, contributing to the failure of imagination that is occurring today in our understanding of, and response to, climate change.

“Either we act with unprecedented speed,” Spratt and Dunlop conclude, “or we face a bleak future.”

The progressive insurgency is only just beginning

Opinion by Katrina vanden Heuvel, Columnist, August 14, 2018

Democrats are still in the early stages of a big debate on direction. 

How do you cover an insurgency like the one now roiling the Democratic Party? To date, the mainstream media’s treatment would give regular readers a severe case of whiplash. The primaries had barely begun when some in the media announced the virtual demise of the movement spawned by Sen. Bernie Sanders. Then, when Alexandria Ocasio-Cortez eviscerated Joseph Crowley, the fourth-ranking Democrat in the House and possible heir to House Minority Leader Nancy Pelosi, in a New York primary, the New York Times headlined, “Democrats brace as storm brews far to their left,” warning that “a new generation of confrontational progressives has put Democrats at the precipice of a sweeping transition.” Then, when some of the candidates whom Ocasio-Cortez and Sanders stumped for lost in Kansas and Michigan, Politico declared “Down goes socialism” (not bothering to tell us when “socialism” had been up); and The Washington Post concluded the “liberal insurgency hits a wall.”

It’s worth sorting this out. There surely is a powerful reform movement building on the left. It is spearheaded by activists inspired not only by the Sanders campaign but also by movements such as Black Lives Matter and #MeToo, the plight of the “dreamers” and growing environmental activism. What is surprising — and should be exciting to Democrats — is that much of the energy of a new generation of activists is focused on electoral politics, and largely on remaking the Democratic Party rather than leaving it.

Fears about this movement dividing or weakening the Democratic Party are overblown. President Trump rouses and unites progressives. The White House plans to put Trump on the stump this fall, hoping to rescue the Republican majority by nationalizing the election. That provocation will help heal any wounds resulting from bruising primary battles among Democrats.

The upheaval in the party is a long-overdue response to the failure of the Democratic establishment. The policy failure is expressed in stagnant wages, rising insecurity and corrosive calamities — inequality, corruption and climate, to name a few — that continue to worsen. The political failure is undeniable, with the loss of the White House to the most unpopular candidate in modern times, as well as control of Congress and a thousand seats in state legislatures across the country.

To date, the reform movement has made its greatest gains in the war of ideas. This shouldn’t be surprising. The reforms the activists are championing are bold and striking and address real needs: Medicare for all, tuition-free public college, $15 minimum wage, universal pre-kindergarten, a jobs guarantee, a commitment to rebuild the United States, a challenge to the corruptions of big-money politics, criminal-justice reform and a fierce commitment to liberty and justice for all.

These ideas aren’t “radical.” They enjoy broad popular support. Not surprisingly, they are increasingly championed not simply by progressives such as Sanders and Elizabeth Warren but also by more-mainstream liberals, including Kirsten Gillibrand, Kamala Harris and Cory Booker as they potentially gear up for the 2020 presidential race.

Almost without exception, the leaders of the reform movement — from Ocasio-Cortez and Warren to Sanders and Ben Jealous — dismiss the much-ballyhooed tension between “identity politics” and economic populism. That supposed choice was driven by the Wall Street wing of the party, hoping to use social liberalism to cover for a neoliberal economics that doesn’t work for working people. Insurgent candidates of all genders, races and sexual orientations have no problem championing social progress and economic populism.

Electorally, insurgent candidates have fared remarkably well given the odds. They are, almost by definition, fresh and inexperienced. They face opponents who start with more money, more-experienced operatives, and often, greater name recognition. Outside groups with deep pockets line up against them. Many are seeking to build small-donor, volunteer-driven campaigns from the ground up.

The victories in House primaries — Ocasio-Cortez, Kara Eastman in Nebraska, Rashida Tlaib in Michigan, Katie Porter in California — and many more are impressive. Harder for the national media to cover is the remarkable surge of insurgent candidates in down-ballot state and local races. One that did get attention was the race in Missouri for St. Louis County prosecutor, where Wesley Bell ousted the 27-year incumbent who had failed to get indictments in the 2014 police shooting of Michael Brown in Ferguson.

Moreover, movement challengers often drive the debate even when they are defeated. The media too often assumes that if the movement candidate has lost, a “moderate” has won. In the Michigan gubernatorial primary, however, Sanders and Ocasio-Cortez stumped for Abdul El-Sayed, a remarkable candidate who ended with 30 percent of the vote. The victor, “establishment favorite” Gretchen Whitmer, was hardly a conservative Democrat. A strong advocate for working people, she ran with the support of the United Automobile Workers. She backs a state bank to rebuild Michigan, a $15 minimum wage, statewide universal preschool and legalization of marijuana. Similarly, Brent Welder narrowly lost his Kansas congressional primary despite the support of Sanders and Ocasio-Cortez. The victor — Sharice Davids — is an openly gay Native American, running as a feminist on populist economics. The Sanders candidate may have lost, but the reform movement keeps on building.

The media needs to focus less on horse-race coverage and more on what’s building and what’s left behind. The insurgency in the Democratic Party isn’t on its deathbed, nor is it about to sweep out the old. It is only just beginning. Democrats are still in the early stages of a big debate on direction. Insurgent candidates are only starting to build the capacity — in ideas, in small donors, in supporting institutions — to run serious challengers. But there is new energy and a new generation that is demanding change. That reality is forcing more-established Democrats to adjust. In the face of Trump venom, Republican reaction and establishment Democratic failure, that surely is a good thing.


Democrats Must Reclaim the Center … by Moving Hard Left

By NICK HANAUER, Aug 14, 2018

America needs a centrist party that actually represents the economic center, not just zillionaires like me.

Every time Democrats lose a presidential election, blue America promptly collapses into civil war—and never more so than in the aftermath of 2016. Progressive Democrats, buoyed by a number of high-profile victories, insist that if the party is to have any hope of fending off Trumpism, it must decisively move to the political left by embracing the populist messaging and agenda of insurgent outsiders like Bernie Sanders and Alexandria Ocasio-Cortez. Establishment Democrats (egged on by eye-rolling pundits and concern-trolling never-Trumpers) dismiss that idea as electoral suicide, contending that now more than ever is the time for the party to reclaim the political center by championing an agenda that pragmatically appeals to voters on both sides of the aisle.

And you know what? They’re absolutely right. All of them. The Democratic Party must reclaim the political center. And the only way to do that is by boldly moving toward the so-called “radical” left.

If this strikes you as counterintuitive, you’re not alone. By respectively attempting to purge the center or marginalize the left, progressive and establishment Democrats alike have displayed a willful ignorance of where and what the center actually is. This is not mere wordplay. Over the past several decades, Democrats have allowed a mistaken and self-destructive definition of centrism to become party orthodoxy. It continues to undermine party unity at a time when a unified Democratic Party is more essential than ever.

In fact, there are two kinds of political centers: There’s the ideological center—the one that Democrats are waging a civil war over. And there’s the majoritarian center—the one where most of the people are. If Democrats hope to be a majority party, it’s the majoritarian center they need to embrace. And to understand the difference between these two strains of centrism, it’s important to understand exactly what the center is measuring.

Imagine lining up every person in America on a yardstick, with the poorest person standing to the far-left edge of the stick (zero inches) and the wealthiest person standing to the far right (36 inches). Assuming that people are equally spaced, and that there is no correlation between wealth and weight—if you could balance that yardstick on the tip of your finger, the fulcrum would fall on the 18-inch mark, the exact center of the yardstick, with exactly half of all Americans standing to the left, and the other half standing to the right. Clustered on and near that 18-inch mark are the median American families—the middle-middle class—the majoritarian center of the American electorate, at least from an economic perspective.

Now imagine that very same yardstick with every American standing in their very same spots—only this time, rather than balancing people, we are balancing their personal wealth, stacked up in $100 bills. But because 2 percent of Americans (of which I am one) own 50 percent of the nation’s wealth, to balance this yardstick you’d now have to slide your finger nearly all the way over, beyond the 35-inch mark, just inside the far-right edge. This fulcrum balances the interests of capital, not people. And unfortunately, this is the yardstick of our current ideological center—a centrism informed by the bad economic theories that have guided the policies of both parties for more than 30 years.
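The balance point in this metaphor is just a weighted average: each person's position on the yardstick, weighted by the wealth standing there. A minimal sketch of the arithmetic (the wealth shares below are illustrative assumptions for the sketch, not figures from the essay):

```python
# Balance point of the 36-inch yardstick: a wealth-weighted average of positions.
# People are evenly spaced from poorest (0 in.) to richest (36 in.).

def fulcrum(shares):
    """shares: list of (fraction_of_people, fraction_of_wealth), ordered
    poorest to richest. Treats each group's wealth as concentrated at the
    midpoint of the segment of the yardstick that group occupies, and
    returns the balance point in inches."""
    pos = 0.0      # left edge (inches) of the current population segment
    moment = 0.0   # running sum of wealth share * position
    for people, wealth in shares:
        width = 36.0 * people
        moment += wealth * (pos + width / 2.0)
        pos += width
    return moment

# Everyone weighted equally (the "people" yardstick): balances at 18 inches.
print(fulcrum([(1.0, 1.0)]))

# A stylized skewed distribution: bottom half holds ~2% of wealth, the next
# 48% holds ~48%, and the top 2% holds ~50%. The fulcrum slides far right
# of the 18-inch mark, toward the wealthy edge.
print(fulcrum([(0.50, 0.02), (0.48, 0.48), (0.02, 0.50)]))
```

With these assumed shares the balance point lands around the 31-inch mark; the more wealth is concentrated at the top, the closer the fulcrum creeps to the far-right edge, which is the essay's point.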

This precarious balancing act helps explain why policies that would clearly benefit the majoritarian center are so often rejected as ideologically “far left;” for a centrism that seeks to balance the interests of capital is a centrism that seeks to balance the interests of the very wealthiest Americans against those of everybody else. It’s this sort of “one dollar, one vote” logic that led to the Citizens United Supreme Court decision—a logic that threatens to subvert American democracy itself. For a system that justifies the wealthiest 2 percent purchasing the same political influence as the other 98 percent, isn’t really a democracy at all. I’m not saying that self-described “centrist” Democrats are any more greedy or corrupt than their progressive colleagues, but if they’re honest with themselves, they should recognize how much they have internalized this orthodox ideological bias. Indeed, this is what they mean by “pragmatic centrism”: an economic policy agenda that necessarily balances the interests of business (the few) versus the interests of labor (the many), in an attempt to best serve the interests of all. Yet as pragmatic as such an approach might at first appear, when viewed from a majoritarian perspective, the ideological center consistently fails to hold.

Take, for example, the minimum wage, a stereotypically lefty policy if ever there was one. But is it really so lefty? In fact, the federal minimum wage is extraordinarily popular, with 71 percent of Americans supporting a raise to at least $10 an hour—even according to a partisan survey conducted by Republican pollster Frank Luntz on behalf of the minimum-wage-hating National Restaurant Association! Ouch. In a country as politically polarized as ours, I’d say that 71 percent support for anything is about as majoritarian and centrist a policy as you’re likely to get.

But what about a $15 minimum wage? Surely, the higher we raise the minimum wage the more extremist the policy becomes, right? Again, not from a majoritarian perspective. According to a 2017 Pew Research Center poll, a majority of registered voters—52 percent—favor raising the federal minimum wage from $7.25 to $15 an hour (support far stronger than the 41 percent approval rating currently enjoyed by President Trump). But more importantly, a $15 minimum wage would benefit far more workers. At $7.25 an hour, the current federal minimum wage provides a floor under only 1.3 percent of all wage and salary workers—a cohort one might fairly characterize as occupying the far left of our economic yardstick. By comparison, a hike to $15 an hour would directly or indirectly benefit 29.2 percent of workers. And with half of all American jobs paying less than $18 an hour, a $20 minimum wage would directly cover a majority of workers, while indirectly pushing wages higher for many millions more. From a majoritarian perspective—a perspective that asks, “Who does it benefit?”—the higher you raise the minimum wage, the more centrist the policy becomes!

And the same holds true for many other policies routinely caricatured as “far left.” After 40 years of erosion, the current $23,660 overtime threshold now guarantees time-and-a-half overtime pay to only 8 percent of workers, but a return to a 1970s-level threshold would cover 66 percent. Fewer than 30 percent of college graduates manage to get through school without accumulating often crushing levels of student debt, but tuition-free public college would offer an affordable higher education to every qualified student. And with an eligibility age of 65, only 15 percent of Americans enjoy the privilege of purchasing affordable health insurance through Medicare, but “Medicare for All” would deliver exactly what its name implies. All three of these proposals enjoy majority support, while directly benefiting the majority of Americans. So to mischaracterize these policies as “lefty” rather than “centrist” would be to abuse those words. Small wonder that “socialists” like Sanders and Ocasio-Cortez have gained so much traction with mainstream voters. “This race is about people versus money,” says Ocasio-Cortez. “We’ve got people, they’ve got money.” That’s as clear a declaration of majoritarian centrism as voters might hope to hear.

You self-described “pragmatic” Democrats who publicly fret that such policies would be far too costly to taxpayers or to employers should remember that there is nothing pragmatic about losing elections. You can’t be the grownup in the room if you’re not in the room. That’s why you never hear Republicans worrying about how to pay for their $1.5 trillion tax cut for the rich: It’s a losing argument. And besides, your trickle-down instincts are wrong! The U.S. economy was never as strong, nor its middle class as secure, as during the three decades when the real minimum wage and the overtime threshold and public subsidies for higher education were at their peak. Medicare for All wouldn’t burden employers; it would relieve them of their costliest employee benefit. And if we do need to raise taxes on the wealthy in order to rebuild the middle class, so what? There is simply no correlation between top tax rates and growth. Stand up for the middle class for a change and you might be rewarded. But there is absolutely nothing to gain—economically or electorally—by aping a trickle-down narrative that just isn’t true. On economic issues, the Democratic Party has long embraced an ideological definition of centrism that simply has nothing to do with the center. And both the party and the nation have suffered as a result.


There once was a time when both parties vied to occupy the majoritarian center, an era when American politics was more a struggle over means than over ends—until, after three decades of unprecedented and broad-based post-war prosperity, the Republican Party lurched violently to the right, and the age of New Deal centrism came to a close. Supply-side tax cuts, attacks on unions, a crusade against “big government” and other tactics of the Reagan revolution helped put us on the road to a new Gilded Age. And while Republicans certainly led the way, we wouldn’t have gotten here as quickly had Democrats not kept driving in the same direction every time we managed to get our hands on the wheel.

Sure, we drove a bit slower and made a few detours before delivering the working class to a neoliberal paradise of billionaires and paupers. And we offered an occasional helping hand to the millions of Americans we left behind in a ditch. But while we tried to strike a more compassionate balance than our GOP counterparts, our deference to the tenets of neoliberal orthodoxy kept our economic agenda firmly tilted toward the interests of the super-rich.

Under Bill Clinton, the wealthy got financial deregulation and capital gains tax cuts worth hundreds of billions of dollars, while our most vulnerable citizens got a higher Earned Income Tax Credit and CHIP—benefits worth just tens of billions. Under Barack Obama, the Wall Street titans who had used their regulatory freedom to crash the global economy got bailouts and bonuses and a monetary policy that inflated their assets, while some in the middle class got affordable (or, at least, less unaffordable) health insurance, and millions of homeowners were left to drown in their underwater mortgages.

This isn’t to say that the Affordable Care Act and EITC expansion weren’t worthwhile programs. I’m a Democrat for a reason. But “we suck less than the Republicans” just doesn’t cut it, politically or economically. Reagan’s class war left the economic center in ruins. Restoring shared prosperity required nothing less than a Marshall Plan for the middle class. Instead, centrist Democrats let them eat charter schools.

In the golden age of New Deal liberalism, organized labor delivered shared prosperity for the middle class and electoral success for the Democratic Party. A mountain of studies has shown that strong, private-sector unions reduce inequality and raise middle-class wages for workers who belong to them and for those who don’t. Other studies have shown that when unions decline in a state, the Democratic Party’s share of that state’s vote declines along with them. Thus, upon returning to power in 1992, it should have been a no-brainer for Democrats to rewrite the nation’s labor laws to make it easier for workers to organize—and harder for bosses to stop them.

But they didn’t. In the 1980s, as Reaganism was ascendant, “centrist” Democrats started blaming much of their party’s struggles on organized labor: By doing the bidding of that “special interest group,” centrists argued, Democrats had alienated the middle. So, instead of taking its policy cues from a labor movement it dismissed as corrupt, lazy and market-distorting, Bill Clinton’s Democratic Party let Wall Street set its agenda. The Robert Rubin wing of the White House believed that working-class Americans didn’t need collective bargaining rights to force their employers to pay a living wage, or redistributive programs to guarantee them a fair share of after-tax income. No, what the economic center really needed was for government to wage war on the deficits, trade barriers and financial regulations that were holding back economic growth. Labor law reform was out; welfare reform, NAFTA and deregulated derivatives markets were in.

Some of these measures might very well have contributed to the late-1990s growth spurt—but they also set the stage for the 2008 crisis while accelerating the decline of American manufacturing and with it the labor movement. And though median wages did rise during the Clinton expansion, so did economic inequality. And unlike this brief spike in wages, inequality has been relentlessly rising ever since.

Centrist Democrats weren’t blind to this inequality. They just refused to believe that it was a product of the economic rules they helped write. Working people weren’t falling behind because markets were structured to funnel all the rewards of growth to the top, centrists told themselves; the middle class was falling behind because it lacked the skills to compete in the new “knowledge economy.” After all, wages were rising for highly educated Americans—you know, like centrist Democratic politicians—so middle-class Americans just needed to become more like them. Companies said they were desperate to give high-paying jobs to American workers, if only they could find workers who were qualified to do the jobs. None of this meant that centrist Democrats were blaming the middle class for its struggles. They didn’t expect working Americans to “pull themselves up by their bootstraps” without any help from Uncle Sam (they weren’t Republicans, after all). They just thought that the median worker needed a better education or retraining, not a modicum of bargaining power with her employer. It wasn’t rapacious economic elites who were preventing workers from getting reasonable wages and benefits. It was the damn teachers unions.

Of course, minting more college graduates didn’t reduce inequality; it just produced a new class of extremely well-read baristas with crushing college debt. The “skills gap” was always a lie told by corporations who just wanted to pay less for high-skill labor. Still, the myth survived well into the Obama presidency, when a unified Democratic government once again declined to modernize labor law, while selling education reform as an elixir for inequality.

It’s been three decades since centrist Democrats abandoned the majoritarian economic center, and the consequences for the middle class have been devastating. Since 1980, the bottom 80 percent of American workers have effectively been bypassed by economic growth while absorbing most of the costs of public disinvestment in housing, education and the social safety net. After-tax corporate profits have doubled from approximately 5 percent of GDP to 10 percent—about a trillion dollars a year—while wages as a share of GDP have fallen by about the same amount. Meanwhile, the richest 1 percent of Americans went from collecting 9 percent of personal income to about 22 percent today. Taken together, these changes amount to a shift of more than $2 trillion a year from middle-class paychecks to the bank accounts of corporations and the very rich.
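The “more than $2 trillion a year” figure can be roughly reproduced from the shares just quoted. The GDP and personal-income totals below are my own round-number assumptions (approximate late-2010s values), not figures from the article:

```python
# Back-of-the-envelope check of the shift, using assumed round numbers:
gdp = 20e12                # US GDP, roughly $20 trillion (assumption)
personal_income = 17.5e12  # total US personal income, roughly (assumption)

# Corporate profits rose from ~5% to ~10% of GDP, with the wage share
# falling by about the same amount:
profit_shift = (0.10 - 0.05) * gdp  # ≈ $1.0 trillion per year

# The top 1% went from collecting ~9% of personal income to ~22%:
income_shift = (0.22 - 0.09) * personal_income  # ≈ $2.3 trillion per year

print(f"profit-share shift:  ${profit_shift / 1e12:.1f}T per year")
print(f"top-1% income shift: ${income_shift / 1e12:.1f}T per year")
```

The two channels overlap, since corporate profits largely flow to the same top earners, so they cannot simply be summed; but the income-share channel alone already exceeds $2 trillion a year.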

Some of my fellow filthy-rich capitalists would like you to believe the middle class has actually benefited from having us gobbling up more and more of America’s annual income gains. After all, they claim, we “job creators” know best how to productively invest wealth: The more capital we get to control, the more economic growth we’ll be able to produce—and the benefits will trickle down!

Yeah, right. Contrary to popular wisdom, America hasn’t enjoyed drastically higher economic growth since 1980 than more egalitarian Western countries. And a moment’s glance at how the “job creators” are currently investing their windfall illustrates why: Right now, roughly 55 percent of corporate profits (about $1 trillion, or 5 percent of GDP) are going into stock buybacks—the signal marker of corporate malfeasance and self-dealing—while an additional 37 percent go to dividends. This means that 92 percent of the profits that American businesses make this year will be spent on enriching the small, elite fraction of the population that owns significant amounts of corporate stock. Meanwhile, at a time of so-called “full employment,” inflation-adjusted wages for the bottom 80 percent of American workers are actually declining.

After the corporate elite slices off its giant share of the income pie, the median American family—those standing near the 18-inch halfway mark of our majoritarian yardstick—is left with about $59,000 a year. Had inequality held constant since 1980, that figure would be $86,000. Had middle-class incomes grown with productivity (as they had in the previous three decades), the median American family would be earning over $100,000 a year.

Let me underscore this point: America owes the median family a raise of somewhere between $25,000 and $40,000. Per year. This—not the fictional entitlement crisis—is the inconvenient economic truth that elites in both parties lack the political courage to confront. And only by enacting policies that right this wrong can Democrats lay claim to being a truly centrist party.
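The size of that raise follows directly from the median-income counterfactuals quoted above; the article rounds the resulting range slightly, to $25,000 and $40,000:

```python
# Derived from the figures quoted above:
actual_median = 59_000            # what the median family earns today
constant_inequality = 86_000      # if inequality had held at 1980 levels
productivity_tracking = 100_000   # if incomes had tracked productivity
                                  # (the article says "over $100,000")

low_raise = constant_inequality - actual_median     # $27,000 per year
high_raise = productivity_tracking - actual_median  # $41,000 per year

print(f"owed raise: ${low_raise:,} to ${high_raise:,} per year")
```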

Of course, Democrats never quite gave up their belief in redistribution. But because they stubbornly stayed beholden to their ideological center, these redistributive measures were always too little, too late—and of little help to the majoritarian center. The Affordable Care Act expanded Medicaid to millions of low-income Americans—but it condemned the median family to complicated, costly and uncertain health insurance exchanges. Similarly, Obama proposed raising the minimum wage, but at first only to $9 an hour, and eventually to $10.10—a change that would have raised the wages of just a small percentage of workers at the very bottom of the income distribution, many of them nonvoters. Attending to the needs of the most vulnerable is a fine thing, of course. But it is impossible to build a winning coalition out of the bottom 20 percent of the income distribution. The rich get tax cuts, the poor get scraps and the middle is left to fend for itself.

Is it any wonder then that so many working- and middle-class voters could see little practical difference between the policies of Democrat Hillary Clinton (one of the more qualified presidential candidates ever to win a major party nomination) and those of a lying, racist, vainglorious, authoritarian, know-nothing like Republican Donald Trump? Clinton was correctly seen as the leader of a Democratic establishment whose “centrist” policies had long served to undermine the legitimate interests of the middle class. So why not take a flyer on Trump? It is in this way that three decades of Democratic ideological centrism helped lead our nation down the neoliberal road to Trumpdom.


All is not lost. Well, not quite yet. There is still an opportunity for a unified Democratic Party to retake Congress in November and remove Trump from office in 2020 (if not sooner). But being anti-Trump is not enough. To build an electoral majority Democrats must come together and embrace an economic policy agenda that boldly and decisively reclaims the majoritarian center. And this will require sacrifices from progressives and centrists alike.

For you centrist Democrats who have long dominated the party establishment, it is time for you to admit that a “pragmatically centrist” agenda that neither enjoys majority support nor serves a majority of voters is neither pragmatic nor centrist. In fact, it’s suicidal. Indeed, when Starbucks CEO Howard Schultz earnestly insists that Democrats must “go after entitlements” (“Medicare for Fewer” rather than “Medicare for All”), his only real chance of uniting voters is in opposition to Howard Schultz. And don’t you dare think for a moment that you somehow know better than voters what’s good for them, because “Econ 101!” or something. Econ 101 is bullshit—at least in the way that it’s been relentlessly misapplied to public policy these past 40 years. And as for trickle-down economics—“a rising tide lifts all boats” and all that—well, that’s a demonstrable con: Tax cuts for the rich don’t create growth (if they did, Trump’s $1.5 trillion tax giveaway wouldn’t have resulted in declining wages and record stock buybacks). A higher minimum wage doesn’t kill jobs (if it did, the job market wouldn’t be booming in Seattle and San Francisco and New York and in every other city or state that has recently hiked its local wage floor). Deregulation isn’t a magical potion of market efficiency (unless, by “efficient,” you mean efficiently wiping out the savings of millions of working- and middle-class families through a predatory lending and derivative-fueled economic collapse).

As for progressive Democrats, it’s time for us to stop trashing the very notion of centrism itself. The “centrist” wing of our party (and to be clear, it’s a wing, not the center) isn’t uniformly evil or corrupt. They’re not bad people. They’re just wrong. They believed what economists told them, and then tried to govern accordingly. But they’re still Democrats, and as such, we all broadly share the same inclusive values and goals. Moreover, by relentlessly reviling the center, progressives needlessly cede it. Which is stupid. The Congressional Progressive Caucus is already the largest Democratic caucus in Congress. So you know what that makes them? The center!

Free from the elitist constraints of ideological centrism and refocused on the wants and needs of the majoritarian center, a unified Democratic Party has an opportunity to build an electoral wave strong enough to swamp the gerrymandered seawalls of the Republican-controlled Congress. And what would a truly centrist Democratic agenda look like? A $15 minimum wage, a restored overtime threshold, affordable public college, Medicare for All, paid family leave, crucial infrastructure investments, modern labor laws, and substantially higher taxes on wealthy corporations and individuals would be a good start. If that sounds like the platform of lefties like Ocasio-Cortez, it’s because it is. But when ideologically “lefty” ideas are both broadly popular and broadly economically beneficial, they occupy the majoritarian center from which electoral majorities are built.

Democrats need to stop balancing the economic interests of the top 2 percent against the interests of everyone else and start focusing on the needs of the majoritarian center—the 80 percent of families who have been left behind by 40 years of trickle-down economics. Raise wages now. That’s the kind of pragmatic centrism the majority of Americans truly want, and that our economy needs.

Nick Hanauer is a Seattle-based entrepreneur and venture capitalist, and the founder of Civic Ventures, a public-policy incubator.

A License to Speculate

by Tom Habib·Wednesday, August 15, 2018

The word hypothesis is in essence a license to speculate while subjecting our truth claims to the criteria of empiricism. As Integralists, we love to speculate about various lines of consciousness and states of experience. It’s fine to conjecture, speculate, or draw on one’s experience and observation, as long as you frame it as such and do not present it as fact. Speculation from experience is indeed the initial stage in developing knowledge, which would not advance without venturing into the possibilities. Although we may strongly believe a speculation to be true, there is a difference between belief or opinion and the criteria of evidence (yes, established by modernity) that establish a fact. Many folks in our community struggle with, or are oblivious to, this distinction today. In empirical parlance, a belief is held as a hypothesis. This conceptual distinction has many benefits.

As soon as you use the word hypothesis, you have kept the scientifically trained folks in the discussion and perhaps open to possibilities they would otherwise dismiss. The self-inflicted wound of dismissal was the point of the first blog, Modern & Post-modern Dialectics. Second, by holding a conjecture loosely, that is, as a hypothesis, you stay open to corrections and new information as a line of thought develops. Finally, you have set a higher standard of proof than mere belief and group consensus. This is just a good Integral mindset. This nod to orange levels of knowing can be easily incorporated into our awareness and strategically stated by everyone in our community.

  • IEC 2020 has set a goal of strengthening the Academic & Scientific Track of its outstanding program. As part of this initiative, a series of blogs will be written on this subject and what it might contribute. It is time for Integral to better hold the dialectical tension of the truth claims of modernity alongside the freedom granted by post-modern sensibilities (McIntosh, 2012).

McIntosh, S. (2012). Evolution’s Purpose: An Integral Interpretation of the Scientific Story of Our Origins. New York: SelectBooks, Inc.

Neoliberalism Drives Climate Breakdown, Not Human Nature

By Alex Randall, August 8, 2018

Many zoos have an exhibit like this: a wall with a hatch, and under the hatch words like “Do you want to see the most dangerous animal in the world?”. Of course everyone does, and before they open the hatch they speculate as to what the animal behind the hatch will be. A lion? A crocodile? However, when you open the hatch there is a mirror, and you see yourself staring back. You are the most dangerous animal in the world.

Of course this is nonsense. Not everyone who opens that hatch and sees themselves looking back is equally dangerous. We are not all equally responsible for the destruction of the world’s ecosystems. Some humans who open the hatch probably are responsible for a great deal of destruction. Others are not. Many people bear the brunt of someone else’s destruction.

The idea that all humanity is equally and collectively responsible for climate change – or any other environmental or social problem – is extremely weak. In a basic and easily calculable way, not everyone is responsible for the same quantity of greenhouse gases. People in the world’s poorest countries produce roughly one hundredth of the emissions of the richest people in the richest countries. Through the chance of our births and the lifestyles we lead, we are not all equally responsible for climate change.


But we are not all equally responsible in a more fundamental way. Some people, through the power they wield, have stood in the way of halting climate change. Not because they were stubborn or incompetent or failed to understand the seriousness of the problem, but because they acted in pursuit of a fundamental re-organising of our economies during the 1970s and 80s. And this shake-up militated against the kinds of policies and government intervention that might have halted – or at least slowed – climate change.

This is the point that is missed in ‘Losing Earth’, the New York Times’ 30,000-word feature on climate change. The piece charts the failure of the US government to act on climate change between 1979 and 1989. During this period we knew enough about the issue to act, but didn’t. The piece sets out to explain this failure.

‘Losing Earth’ presents the failure as one of political tragedy. Politicians and policy makers simply couldn’t agree. Not because of the undue influence of lobbyists, but because – as humans and politicians – they could not look far enough into the future. They could not take political risks now, in return for the long term safety of the planet.

As humans we cannot engage with complex long term problems. We favour short term comfort over long term safety, even when this is illogical. Our political systems are set up to favour short term political wins. Our politicians think only as far ahead as the next election. This failure to stop climate change was no one’s fault, ‘Losing Earth’ argues. It happened because we’re human, and because our electoral systems aren’t geared up for this kind of problem.

But is this really why the US didn’t act on climate change during the 1980s?

The late 70s and 80s were also a time when the economies of most developed countries underwent a fundamental restructuring.

Since the end of the Second World War the economies of Europe and the US had been growing steadily. Ordinary people had been taking home an ever-growing slice of this new economic growth. In the US, unionised workforces were consistently negotiating better pay and conditions. In Europe people also began to see the benefits of nationalised healthcare and house building.

The very richest people in society had also been getting richer as developed economies grew. But the slice of the pie they were keeping was shrinking. In 1940 the wealthiest 0.1% kept about 20% of all the money earned, while the poorest 90% (almost everyone) kept about the same share. By the mid 70s the slice kept by the 0.1% had dipped to around 7%, while the slice kept by the 90% had climbed to over 30%. The US economy was still vastly unequal, but it was becoming more equal. Many working people were gaining, at the expense of the very rich.

We should not pretend that the gains of working people were evenly shared. These figures disguise cruel inequalities amongst the 90% shaped by race, religion, gender and geography.

By the middle of the 1970s it was clear to the wealthiest in society that something had to change. More and more of the spoils of economic growth were going into the pockets of ordinary people. Across the Western world, governments were taxing growing profits and spending them on housing, healthcare and education – mainly for the benefit of ordinary people.

The economy, and people’s expectations of it, needed a shake-up. Crucially, a shake-up that reversed the growing trend of economic equality. A shake-up that would return the 0.1% to the position they had been in during the 1930s and 40s, when they were keeping a much greater cut of all the money that was earned.

To do this they turned to a collection of political ideas that had been largely ignored since their formation in the 1920s. These ideas, and the economies shaped by them, have come to be known as neoliberalism.


These ideas held that the role of the state should shrink. Government – neoliberals believed – stood in the way of prosperity. The size of the state should be reduced, the number of people on the public payroll should go down. Areas that had been the domain of government – healthcare, house building, transport, energy – should no longer be. Instead these should become the domain of private enterprise.

Markets should decide what receives investment and what does not. If there is demand (say) for new energy generation then the price of electricity should provide the signal for power companies to build it and profit from doing so. The government should step back and let the market decide what happens.

In addition, regulation and corporate taxes of all kinds should be stripped back. This – they argued – would drive more investment. Environmental regulation controlling pollution simply prevented businesses from providing energy to people cheaply, they argued. Taxes on polluting substances did the same. Stripping these away would give people what they wanted. In place of regulation they proposed consumer choice. If people wanted non-polluting products – if that mattered to them – they would pay extra for them. And businesses would respond to this demand by providing them.

The ideology and the practice of neoliberalism were not always consistent. While the ideology demanded the withdrawal of the state, many private businesses continued to demand (and receive) vast government subsidies. In the US during the 1980s the government continued to sponsor billions of dollars worth of research into fossil fuel extraction.



The impact of these changes on the overall economy was also well understood by those who proposed them. As responsibility for infrastructure, energy, housing and the other usual domains of the state moved to the private sector, so did the money. These became new areas in which to make profit. The lack of regulation, lower taxes and subsidies meant making these profits was easier.

The wealthiest 0.1% began to see their share of society’s wealth increasing. Starting around 1974 the economy swung around in favour of the richest. Their slice of all the money earned began to climb, while the slice taken home by the 90% began to fall. This trend has continued to the present day. In the US, levels of income inequality have returned to where they were before the Second World War. This was the drive behind this vast shake-up, and it worked.

The reshaping of the US economy took place during the period covered by ‘Losing Earth’. It was during the decade – 1979 to 1989 – that neoliberalism truly entered the political mainstream.

However in doing this, the US government had stripped itself of the tools it needed to address climate change – regulation of polluting businesses, taxation of carbon emissions and state investment in energy alternatives.

In order to address climate change the US (and other nations) needed to do things that were no longer politically possible. Fossil fuels needed to be taxed in order to reduce their consumption. Carbon emissions needed to be taxed, or capped. The government needed to invest heavily in renewable energy. Or it needed to force energy companies to do so through legislation.

These things might have been possible in previous decades, when governments saw this kind of investment and legislation as their job. But in the new neoliberal era, these kinds of interventions were impossible – especially for the US.

So the US government’s failure to act was not a political or human accident, as ‘Losing Earth’ holds. Rather, the economy of the US had very deliberately been re-shaped in order to return economic advantage to the very wealthiest people, who had been losing that advantage over several decades.

We did not lose the earth in the 1980s. Rather, the tools governments needed to act had been taken from them.

Bannon’s Deviant ‘Badge of Honor’

By Jason Stanley, New York Times, March 13, 2018

The tactic of subverting language to turn vice into virtue has a very dark past.

In a speech last weekend in France, Stephen Bannon, the former top adviser to President Trump, urged an audience of far-right National Front Party members to “let them call you racists, let them call you xenophobes.” He went on: “Let them call you nativists. Wear it as a badge of honor.”

On the face of it, Bannon’s advice is strange. After all, by any normative understanding, “racist,” “xenophobe” and “nativist” are negative words from both a moral and rational point of view. Their definitions, taken from any standard dictionary, will bear this out. Racism, xenophobia and nativism embody, in their very meanings, both irrationality and unfairness. Irrationality is considered to be a negative quality (except perhaps by Dadaists); so is unfairness.

For those of us who wish to understand the way Bannon is manipulating language here, and to what end, it is important to note what he is not doing. It is typical for far-right politicians who want to attract racist, xenophobic or nativist voters to attempt to provide at least the pretense of reasons, invariably shoddy ones, for animus against racial minorities, immigrants or foreigners.

In the United States, President Trump regularly connects individual crimes or criminal gangs with immigration in an effort to more broadly establish a link between immigrants and crime in the public consciousness. Though studies have shown that immigrants, both legal and illegal, are less likely to commit crimes than native-born Americans, Trump continues to make such claims (his statement that immigrants bring “tremendous amounts of crime” received a score of “four Pinocchios” from fact-checkers).

Similarly bizarre false claims are made by non-American far-right politicians, who also regularly engage in anti-Semitic innuendo. There is a natural next step to using innuendo and the manufacturing of reasons to justify racism and xenophobia — namely to drop the facade altogether. Bannon is urging the adoption of an irrational bias against racial minorities, immigrants and foreigners, one that does not require reasons, even bad ones, to support it. And he recommends presenting such irrationality as virtuous.

Accepting Bannon’s advice requires rejecting empathy for already embattled groups. Some might view that as acceptable, preferring to weigh only statistical arguments in deciding what to do about, for example, immigration policy. But taking Bannon’s advice also requires rejecting any recognizable practice of giving plausible reasons for holding a view or position. To proudly identify as a xenophobe is to identify as someone who is not interested in argument. It is to be irrationally fearful of foreigners, and proudly so. It means not masking one’s irrationality even from oneself.

Bannon’s rhetorical move of transforming vices based on irrational prejudice into virtues is not without historical precedent. Hitler devotes the second chapter of “Mein Kampf” to explaining how his time in Vienna as a young man transformed him into a “fanatical anti-Semite.” To be fanatical is, by definition, to be irrational. To be an anti-Semite is to have irrational prejudice against Jews. Hitler presents his transformation into a fanatical anti-Semite positively. It is, in Hitler’s rhetoric, not only a good thing to harbor irrational hatred of Jewish people. One should do so fanatically. Such fanatical irrationality is, in Hitler’s rhetoric, virtuous.

Of course, rhetoric and policies are two different things. No recent far-right movement in Europe or the United States has enacted the sort of genocidal policies that the Nazis did, and no such comparison is intended. But history has shown that the sort of subversion of language that Bannon has engaged in is often deeply intertwined with what a government will do, and what its people will allow. Bannon’s own cheer to the National Front members — “The tide of history is with us and it will compel us to victory after victory after victory” — shows clearly enough that he does not mean his efforts to end in mere speech.

In the 20th-century scholar Victor Klemperer’s “Language of the Third Reich” there is a chapter titled “Fanatical.” In it, Klemperer reports that the Nazis regularly inverted the meanings of ordinary words, in just the way that Bannon recommends: by turning vices or negative ideals into virtues. (This point is well-known enough to have been the basis of a famous comedy sketch, “Are we the baddies?”) To be sure, it was not only the Nazis who chose to invert the valence of words that were negative because of their connection with unthinking irrationality; some American abolitionists called their own beliefs in the moral wrongness of slavery fanatical. But it was the Nazis who most thoroughly and efficiently turned the ungrounded hatred and fear of minority groups into an explicit virtue.

Performing such inversions is an attempt to change the ideologies and behaviors of large groups of people. It is done to legitimate extreme, inhumane treatment of minority populations (or perhaps, to render such treatment no longer in need of legitimation). In this country, we are familiar with it from the criminal justice system’s treatment of black Americans, in some of the “get tough on crime” rhetoric that fed racialized mass incarceration in Northern cities, or the open racism sometimes connected to Southern white identity or “heritage.” Its aim is to create a population seeking leaders who are utterly ruthless and cruel, intolerant, irrational and unyielding in the face of challenges to the cultural and political dominance of the majority racial or religious group. It normalizes fascism.

At the end of his chapter, Klemperer assures us that even when the practice of treating negative ideals as if they were virtues becomes routine, people nevertheless retain a clear understanding that this is indeed an inversion. He notes that as soon as World War II ended, ordinary Germans returned to using the word “fanatical” as a negative. This observation is crucial. It means that we can remind even those inclined to take Bannon’s advice that while language can be manipulated, the attempt to change its meaning, and the shape of reality with it, is ultimately temporary.

How Revisionist History Works

By Cristen Conger, How Stuff Works

Photo caption: German students protest against the signing of the Treaty of Versailles in 1932. Reaction to the treaty after World War I marked the beginning of modern historical revisionism.

When you hear the word “square,” you need context to know whether it refers to the shape, the mathematical operation or a slang insult for a conventional person. The term “revisionist history” can be similarly vague when standing alone, since it usually connotes one of three distinct perspectives.

Let’s consider the legacy of Thomas Jefferson to understand how you can apply these different perspectives. People accept that Thomas Jefferson wrote the Declaration of Independence and served as the third president of the United States. But another biographical fact is that Jefferson had a slave mistress named Sally Hemings, with whom he fathered children. Despite people’s discomfort with that nugget of information, DNA evidence in the late 1990s confirmed it was true. So what did that discovery mean for revisionist historians?

Revisionism Through Social and Theoretical Lenses

Historians refer to the years immediately following World War II as the age of historical consensus [source: Foner and Garraty]. A strong sense of patriotism and unity dominated the historical framework during that time. Then that stability began to crack apart with the turmoil and uncertainty of the 1960s. No longer was the country basking in its victory in World War II. The combination of the protracted war in Vietnam and the struggle for equality during the Civil Rights movement radically changed the tone across the United States. Technicolor Uncle Sam and victory gardens were replaced by race riots and student protests. Revisionist historians understood that these events affected groups in different ways, which reshaped the overall narrative of U.S. history.

Revisionism as a Means of Correcting the Facts 

Recounting­ historical events through the centuries can be similar to playing a game of telephone. Th­e first person starts with something simple, like the meeting of Capt. John Smith and Pocahontas in Jamestown. By the time the message reaches the last person in the circle, it’s become primped and polished into a colonial love story. Revising history can untangle that string of miscommunication.

In the Disney version of the Pocahontas story, the Native American is a leggy, attractive woman who falls madly in love with Smith. Aside from the musical numbers, the plotline from the animated film isn’t too far from the history lesson that was taught in schools. But like the tale of George Washington and the cherry tree, that of Pocahontas and John Smith has been revised. Thanks to Smith’s journals and other written sources, we know now that the famous Native American was probably 11 years old when they met — there was no steamy romance or marriage between the couple. Instead, Pocahontas married a widower named John Rolfe and died around the age of 21 [source: LaRoe].

Revisionism as a Negative Term           

Photo caption: The inconsistent quality of revisionist theories, including those surrounding JFK’s assassination, contributes to the low credibility of historical revisionism.

In popular culture, revisionist history has become synonymous with telling lies or embellishing the truth. For instance, in 2003, President Bush used the term “revisionist historians” in reference to the media covering the war in Iraq. He claimed that certain reporters had wrongfully questioned the reasons for invading the Middle Eastern country and muddied the public’s opinion of the conflict. Some professional historians didn’t take kindly to Bush’s comment because it cast an unflattering light on the academic study of history. After all, they reasoned, all histories are revisionist at some point. A few years later, in 2006, Florida passed a law banning “revisionist and postmodernist history” from being taught in the state’s public schools [source: History News Network]. The language of Florida’s Education Omnibus Bill stated that students should learn facts, not “constructed” elements of American history — essentially equating revisionism with lies.

Why does revisionist history have a bad reputation? First, it is sometimes associated with highly contentious theories, such as Holocaust denial. Recall the public furor in response to Iranian President Mahmoud Ahmadinejad’s 2007 speech at Columbia University, when he stated that the Holocaust didn’t happen. Historians emphasize that people who deny the events of the Holocaust during World War II aren’t practicing revisionist history but rather negationism. Another revisionism-related scandal occurred recently in Japan, also concerning World War II. The chief of staff of Japan’s air force authored an essay asserting that Japan was bullied into attacking Pearl Harbor by the United States and only engaged in combat as a defensive measure [source: Economist].