No Wonder So Many Are Disillusioned by Our Politics — We’ve Got an 18th Century Political System

AlterNet / By Steven Hill, November 26, 2012

The following is an excerpt from the Introduction to 10 Steps to Repair American Democracy: A More Perfect Union, 2012 Election Edition by Steven Hill.

In 2008, an economic earthquake of historic proportions shook the world. That was followed by numerous aftershocks whose effects are still being felt years later. In the middle of the economic crash, a new audacity of hope arrived in the form of the first African American president elected in U.S. history. It was a jubilant moment showing America at its best, taking a giant step toward the dream of a multiracial society.

The very campaign of Barack Obama, which drew in unprecedented numbers of young people, seemed to augur a new era in politics that held out the potential for a badly needed transformation. Time magazine featured the face of President-elect Obama on its cover, photoshopped into a likeness of President Franklin Roosevelt complete with tipped cigarette holder and grey fedora. It seemed that a new New Deal was on the horizon for an America suffering from the ravages of a historic economic collapse.

Yet, within a short time, the Obama administration found itself flat-footed on nearly all policy fronts. Confronted by intractable challenges and difficult choices presented by the economic crisis, and hindered by a polarized Congress more interested in political brinkmanship and cheap, anti-majoritarian tactics like the filibuster than in governing, the Obama administration responded with timid proposals that failed to realize its promise. It turned out that America’s antiquated political system was so creaky and sclerotic that it was impervious to even the most talented of its politicians.

President Obama’s leadership failure came at a crucial moment. Without a politics capable of reining in the economy, Wall Street honchos at Goldman Sachs, Lehman Brothers and others had turned our banks and financial system into their personal casinos, to be bailed out by taxpayers when their bets tanked. The economic crash had been caused by a hyper-deregulated U.S. financial system that, without sufficient political and administrative oversight, had spun out of control. Wall Street’s brand of capitalism resulted in a privatizing of the gains and a socializing of the losses. And so it fell to the American political system, which had failed to rein in the runaway train to begin with, to re-regulate the economy and try to make the country safe again for capitalism.

Yet following President Obama’s inauguration in January 2009, not only did he continue many of the Bush administration’s policies but the people he chose as his cabinet members and regulators were industry insiders who were not going to change things fundamentally. Wall Street executives, at first cowed by their own incompetence and the sudden systemic instability that forced them to accept government handouts, rediscovered their bravado and began digging in against fundamental reform. The Obama administration and other key authorities, such as the New York Federal Reserve, stood back while Wall Street and the corrupted ratings agencies resurrected much of the ultra-complex trading system that had led to such a spectacular global collapse. New regulations eventually were passed, especially the Dodd-Frank legislation, but many financial experts felt that certain key defects were never adequately addressed. After the dust had settled, many of the “too big to fail” banks and financial institutions that remained were even bigger than before the crisis, having swallowed those that went belly up. Wall Street was back to its high-flying ways, and a smoldering anger rumbled across the nation as it became clear that Main Street had been swindled by Wall Street, and government had done little to protect everyday people.

The Weakening Pulse of Democracy

But this spectacular economic collapse was just the climax of a long downhill slide for America, the last remaining superpower of the Cold War era. The crash of the economy exposed an ugly truth: The forces that have drastically increased economic insecurity and risk for so many Americans are rooted in fundamental shifts over the last thirty years. These longer-term trends, which culminated in the Great Recession, mark one of the great social transformations of the postwar era. But how could such alarming concentrations of wealth be possible, more than eighty years after the lessons we thought we had learned from the Great Depression? How could there be so much poverty and inequality within a nation that has so much wealth and power? And how could those who wrecked our economy end up profiting so brazenly from the destruction they created? These are the questions on many lips and minds, not only in the United States but around the world, where a struggling Uncle Sam has lost much of its narrative appeal and sizzle as an international beacon.

There are many potential responses to those questions, but one factor looms larger than the rest: the economic collapse was preceded by a long-standing political collapse. Unfortunately, the American political system is desperately broken, mired in antiquated ways that have resulted in political paralysis in the face of urgent new challenges. Even today, policies that would exacerbate inequality, unemployment and economic sluggishness, such as tax cuts for the wealthy and the slashing of Social Security and Medicare, retain considerable momentum among political elites, despite their unpopularity in opinion polls. The political system has become unresponsive to “We the People.” At this point, the economic system — and those who dominate it — have captured the political system.

Understandably this has prompted great outrage and frustration among the American people. These passions crystallized into a right-wing populist movement known as the Tea Party, and later into a youth-inspired protest encampment in the belly of the beast called Occupy Wall Street (which eventually spread to other cities around the United States). Both of these populist insurrections struck a chord when they said, in effect, that it is time for Americans to take back our government. More than two hundred years after our national “birthquake,” government of, by, and for the people remains an unfulfilled promise.

Today, the antiquated American political system finds itself plagued by partisan polarization, a rigidly divided Congress, superficial debate, and paralysis in the face of new global challenges. Even before the economic crisis, the U.S. was beset by choiceless elections, out-of-control campaign spending, backward voter registration laws, a filibuster-gone-wild U.S. Senate, mindless media, even a partisan Supreme Court. Americans are increasingly frustrated and have tuned out, causing the middle to collapse and allowing the partisans and apparatchiks to take over.

The Way Forward

Fortunately, there is a better way than the status quo, one that can lead us toward a brighter national future. That better way involves making fundamental changes to our basic political and media institutions, and bringing them into the 21st century. It involves not only crafting laws to cope with new assaults on American democracy—such as the horrendous US Supreme Court ruling known as Citizens United, which has opened up the spigot for corporate donations and unlimited spending in our elections—but also figuring out how to transform our increasingly creaky and antiquated political institutions.

In 10 Steps to Repair American Democracy I provide a blueprint for renewing the American republic. I outline ten essential steps that can repair and modernize American democracy, including: ways to allow more people to register and vote and to make sure voting equipment counts our ballots properly; replacing winner-take-all elections with alternative methods like proportional voting and ranked choice voting (also known as instant runoff voting) that give better representation, promote higher voter participation and encourage constructive debate and coalition-building rather than scorched-earth tactics. I propose that presidents be elected by a national popular vote, and I advance reforms for the sclerotic Senate, which increasingly exaggerates the power of sparsely populated, conservative “red” states, and its arcane rules like the filibuster that undermine majority rule. I also propose reforms for the U.S. Supreme Court, which now resembles an unelected legislature of nine members that not only is badly out of step with mainstream America but where “five votes beats a reason any day.” And I promote the advantages of publicly financed elections, free media time for candidates, more vigorous public broadcasting and media reforms that will produce a more robust debate in the free marketplace of ideas.

Not only do I propose specific reforms, but I discuss ways to enact them and provide at the end of each chapter a list of organizations that already are working on these reforms. By parsing the big picture into smaller bite-size pieces suitable for activism, 10 Steps provides a roadmap for moving forward. 

As the New Yorker’s Hendrik Hertzberg says in the foreword, 10 Steps shows “that there actually is a way we can keep faith with our Founding Fathers. And it’s not to pretend that the particular political improvisations and compromises they came up with more than two centuries ago—brilliant and clever though they were in the context of their times—provide the answer to every question. No, the way to honor the Founders is not to worship them. It’s to imitate them. It’s to do what they did: diagnose what’s wrong; be fearless about innovation; learn from experience; design political mechanisms with a view to taking account of human imperfection and marshaling the self-interest of politicians for the common good. The question isn’t, ‘What, way back when, did Jefferson (and Madison and Hamilton) do? The question is, What would they do now?’ The answers begin here.”

So it’s time for American patriots to roll up our sleeves and get to work. I believe these are commonsense changes; most of them are already working in other nations and in some parts of the United States. I’m certain that the Founders and Framers of our nation, being the enlightened pragmatists that they were, would have applauded efforts to modernize their eighteenth-century political creation and make it into one that lives up to the lofty rhetoric and aspirations of their astonishing age. The challenge before us of remaking American democracy is an epic one. The brightness of our national future depends on our success. But Americans have risen to great challenges before, and I believe we will again.

Did the Dalai Lama Just Call for an End to Religion?

Well, not exactly—here is what the spiritual leader of Tibetan Buddhism actually told his four million friends on Facebook earlier this fall:

“All the world’s major religions, with their emphasis on love, compassion, patience, tolerance, and forgiveness can and do promote inner values. But the reality of the world today is that grounding ethics in religion is no longer adequate. This is why I am increasingly convinced that the time has come to find a way of thinking about spirituality and ethics beyond religion altogether.” 

It is easy to sympathize with the Dalai Lama’s frustration. After millennia of being preached at by priests and prophets, humanity is still addicted to war; we continue to lay waste to the planet’s fragile ecosystem; we torture animals, repress ethnic minorities, and ignore the plight of the poor.

Worse still, religion has often been in service of the very sins of intolerance that its prophets have railed against. Abortion clinics are bombed to support a “pro-life” agenda; religiously inspired hatred in the Middle East has fueled ongoing war, and religiously inspired hatred everywhere has led to countless horrors.

In the past, such moral failings, while contributing to human misery, did not put life itself at risk. But that has changed. Our once-marginal species is now the dominant life form on the planet, numbering over seven billion souls. Granted, there are still more microorganisms in a shovelful of prime agricultural soil than human beings on Earth. But bacteria don’t have brains, and the crux of the problem is that we do.

To call the brain a “problem,” of course, is only half of the story. The human mind has created art, science, philosophy, government, education, and the miracles of modern medicine. Religion, with its exalted ethical and spiritual teachings, is another example—whatever Richard Dawkins might say—of our human capacity for creating good.

The New Atheists are right, of course, when they fault religion for not living up to its own ideals. They would get no argument from the Dalai Lama on this. But His Holiness would be quick to point out that the moral principles themselves are not to blame—it’s our failure to act on them.

The Dalai Lama recommends a radical new approach: a religionless religion, if you will, stripped of myth, superstition, and narrow dogmatism, and focused on the practical work of transforming human behavior. He wants to incorporate the insights of the hard sciences as well as psychology, philosophy, and sociology into a broad-based new discipline to address our current moral crisis.

But can religion be rationalized into a pure system of ethics without losing its (historically) persuasive power?

Some have pointed to Buddhism itself as an example of just such a system. Western practitioners like to think of Buddhism as a methodology for self-cultivation rather than as a religion per se. But Tibetan Buddhism, with its pantheon of deities and arcane practices, certainly looks familiarly religious to those of us brought up on Western religious myths and symbols.

I suspect that His Holiness would agree that these religious elements are not a bad thing. Because religion, for all its faults, seems to have an unrivaled capacity to move us, and to motivate us.

Perhaps that has something to do with stories—we want to know how our private stories fit into the greater cosmic narrative. The Dalai Lama seems to be saying that religion needs to work harder to bridge the gap between the story that it tells and our actions in the world. It is not enough to provide believers with a comforting world view; religion should give people tools to act upon the sacred ideals that it preaches.

The way to accomplish this, according to the Dalai Lama, is spiritual practice. “We are now in the twenty-first century,” writes Tibet’s leading monk.

“The world is also facing a lot of new problems, most of which are man-made. The root cause of these man made problems is the inability of human being to control their agitated minds. How to control such a state of mind is taught by the various religions of this world.”

The Dalai Lama advocates prayer and meditation as an antidote to the mind’s capacity for mischief. But he insists that we need not limit ourselves to traditional spiritual techniques. He has written a book on the convergence of views between Buddhism and science, and he has helped to organize conferences where religious thinkers meet with scientists to explore their common ground. This is because, in his view, science can help religion to fine-tune its own methods. (Neurology has already gone a long way toward validating the reality of spiritual states by documenting, for example, changes in regions of the cerebral cortex of Cistercian monks during prayer similar to those observed in Buddhist monks during meditation.)

The Dalai Lama believes that the fundamental ethical discoveries of religion are scientifically verifiable. When we actually live religiously—and don’t just profess a set of beliefs—we become more forgiving, peaceful, tolerant, attentive and inspired. This in turn leads to profound psychological and physiological changes which can be studied—and even measured.

It is time, the Dalai Lama says, to take the discoveries of spirituality out of the monasteries and into the world. While mindfulness meditation has been introduced into schools, hospitals, and even corporate boardrooms as a technique to lower stress, improve concentration, and help resolve conflicts, Tibet’s religious leader is acutely aware that none of this is enough. “It is all too evident that our moral thinking simply has not been able to keep pace with such rapid progress in our acquisition of knowledge and power,” the Dalai Lama told a group of scientists in 2005. 

The bottom line is that taming the mind creates more peaceful and contented human beings. This is the crux of the Dalai Lama’s message—because, as his urgency suggests, we are running out of time to get it right.

Richard Schiffman is a spiritual author, poet and journalist. His work has appeared in the New York Times, the Washington Post, and the Christian Science Monitor and he is a regular blogger on The Huffington Post.

Class Wars of 2012

By PAUL KRUGMAN, New York Times, November 29, 2012

On Election Day, The Boston Globe reported, Logan International Airport in Boston was running short of parking spaces. Not for cars — for private jets. Big donors were flooding into the city to attend Mitt Romney’s victory party.

They were, it turned out, misinformed about political reality. But the disappointed plutocrats weren’t wrong about who was on their side. This was very much an election pitting the interests of the very rich against those of the middle class and the poor.

And the Obama campaign won largely by disregarding the warnings of squeamish “centrists” and embracing that reality, stressing the class-war aspect of the confrontation. This ensured not only that President Obama won by huge margins among lower-income voters, but that those voters turned out in large numbers, sealing his victory.

The important thing to understand now is that while the election is over, the class war isn’t. The same people who bet big on Mr. Romney, and lost, are now trying to win by stealth — in the name of fiscal responsibility — the ground they failed to gain in an open election.

Before I get there, a word about the actual vote. Obviously, narrow economic self-interest doesn’t explain everything about how individuals, or even broad demographic groups, cast their ballots. Asian-Americans are a relatively affluent group, yet they went for President Obama by 3 to 1. Whites in Mississippi, on the other hand, aren’t especially well off, yet Mr. Obama received only 10 percent of their votes.

These anomalies, however, weren’t enough to change the overall pattern. Meanwhile, Democrats seem to have neutralized the traditional G.O.P. advantage on social issues, so that the election really was a referendum on economic policy. And what voters said, clearly, was no to tax cuts for the rich, no to benefit cuts for the middle class and the poor. So what’s a top-down class warrior to do?

The answer, as I have already suggested, is to rely on stealth — to smuggle in plutocrat-friendly policies under the pretense that they’re just sensible responses to the budget deficit.

Consider, as a prime example, the push to raise the retirement age, the age of eligibility for Medicare, or both. This is only reasonable, we’re told — after all, life expectancy has risen, so shouldn’t we all retire later? In reality, however, it would be a hugely regressive policy change, imposing severe burdens on lower- and middle-income Americans while barely affecting the wealthy. Why? First of all, the increase in life expectancy is concentrated among the affluent; why should janitors have to retire later because lawyers are living longer? Second, both Social Security and Medicare are much more important, relative to income, to less-affluent Americans, so delaying their availability would be a far more severe hit to ordinary families than to the top 1 percent.

Or take a subtler example, the insistence that any revenue increases should come from limiting deductions rather than from higher tax rates. The key thing to realize here is that the math just doesn’t work; there is, in fact, no way limits on deductions can raise as much revenue from the wealthy as you can get simply by letting the relevant parts of the Bush-era tax cuts expire. So any proposal to avoid a rate increase is, whatever its proponents may say, a proposal that we let the 1 percent off the hook and shift the burden, one way or another, to the middle class or the poor.

The point is that the class war is still on, this time with an added dose of deception. And this, in turn, means that you need to look very closely at any proposals coming from the usual suspects, even — or rather especially — if the proposal is being represented as a bipartisan, common-sense solution. In particular, whenever some deficit-scold group talks about “shared sacrifice,” you need to ask, sacrifice relative to what?

As regular readers may know, I’m not a fan of the Bowles-Simpson report on deficit reduction that laid out a poorly designed plan that for some reason has achieved near-sacred status among the Beltway elite. Still, at least you can say this for Bowles-Simpson: When it talked about shared sacrifice, it started from a “baseline” that already assumed the end of the high-end Bush tax cuts. At this point, however, just about all the deficit scolds seem to want us to count the expiration of those cuts — which were sold on false pretenses, and were never affordable — as some kind of big giveback by the rich. It isn’t.

So keep your eyes open as the fiscal game of chicken continues. It’s an uncomfortable but real truth that we are not all in this together; America’s top-down class warriors lost big in the election, but now they’re trying to use the pretense of concern about the deficit to snatch victory from the jaws of defeat. Let’s not let them pull it off.

A Liberal Moment

By TIMOTHY EGAN, New York Times, November 29, 2012

Still hard to believe, I told a friend the other day while trying to fathom the election results, that pot is legal in my state, gays are free to marry, and a black man who vowed to raise taxes on the rich won a majority of the popular vote for president, back to back – the first time anyone has done that since Franklin Roosevelt’s second election in 1936.

And yet only one in four voters identified themselves as “liberal” in national exit polls. Conservatives were 35 percent, and moderates the plurality, at 41 percent. The number of voters who agreed to the “l” tag was up by three percentage points, for what it’s worth, from 22 percent in 2008.

What’s going on here, demography and democracy seem to be saying at the same time, is the advance of progressive political ideas by a majority that spurns an obvious label. Liberals have long been a distinct minority; liberalism, in its better forms, has been triumphant at key times since the founding of the Republic.

Abraham Lincoln’s push for the 13th Amendment, erasing the original sin of slavery from the land, was a liberal moment, as dramatized in Steven Spielberg’s new film. Teddy Roosevelt’s embrace of the income tax, eventually written into the Constitution after he left office, was a liberal moment. “No single device has done so much to secure the future of capitalism as this tax,” said John Kenneth Galbraith.

Women’s suffrage in 1920, Social Security in 1935, the Civil Rights Act of 1964 – all liberal moments. Ditto the creation of national parks, and laws against child labor and poisoning the environment, and for giving most Americans access to health care.

Democrats were the knuckle-draggers on race and populist economic reform in the 19th century, Republicans in the latter half of the 20th. The party identities change; the arc of enlightenment does not.

Which brings us to the fascinating self-portrait of the United States at the start of the second half of the Obama era. A tenuous center-left majority wants to restore some equality to the outsize imbalance between the very rich and the rest of us. If a tenuous president can lead that coalition, without overreaching, he might be remembered among the greats.

In its simplest form, this will involve raising taxes at the high end and reforming entitlements enough to ensure their continued success and sustainability. Much of that, an accountant could do. But it takes a gifted politician for the heavier lifting. That leader will have to make his still-fledgling health care act work and earn his premature Nobel Peace Prize on an issue like climate change. In the process, he could restore the good name to traditional liberalism.

For at least a generation’s time, liberals in this country have been afraid to call themselves liberal. Was it the excesses of their creed, from race-based preferential programs that went on far too long to crude speech censorship by the politically correct and humorless (one and the same) that soiled the brand? In blindly embracing, say, the teachers’ union in the face of overwhelming evidence that public education needs a jolt or in never questioning the efficacy of government programs, the left earned its years in exile.

Or was it the relentless campaign by the broadcasting and publishing empires of the far right, associating liberals with tyranny, spiritual vacuity and baby killing, that drove people from the label that could not speak its name? “Godless,” “Treason” and “Demonic” are actual Ann Coulter book titles, and a representative sample of the profitable cartooning of liberals.

Liberalism, in the broadest sense, is about expanding human rights and opportunity, while embracing science and reason. What do they call the secularists in Egypt today pushing for democracy over a theocracy? Liberals.

The Progressives of the early 20th century had an amazing run – direct elections of senators, regulation of monopolistic trusts, modernization of public schools, cleaning up the food supply – with only one major blooper: Prohibition.

The New Deal’s lasting legacy, Social Security, and its counterpart of the 1960s, Medicare, allowed millions of Americans to live out their lives in dignity. Those programs, attacked as socialistic abominations by the Fox News shills of their day, are now considered near sacrosanct by Americans of all political stripes.

Conservatives of the last decade lost their way by rejecting science, immigration reform and personal freedom, particularly in regard to choices made by women and gays. If you believe in climate change, finding a path to citizenship for millions of hard-working Hispanics and the right to marry the person you love, there is no place in the Republican Party of 2012 for you.

Their neo-con wing started a pair of disastrous wars that all but bankrupted the country. And for leaders, at least on television, the party put forth crackpots like Rick Santorum, Sarah Palin, Michele Bachmann and the morally elastic Newt Gingrich. This chorus promoted an orthodoxy that forced this year’s standard-bearer, Mitt Romney, to sound even more out of touch than he already was.

All political moments are ephemeral. This one could vanish in the blink of a donkey’s eye. But here it is: a chance to shore up a battered middle class, make the promise of health care expansion work and do something about a planet in peril. Huge tasks, of course, and fraught with risk. For now, the majority of Americans have Obama’s back. But should he fail, the same majority could become something much worse – a confederacy of cynics.

What Defines a Meme?

Our world is a place where information can behave like human genes and ideas can replicate, mutate and evolve
By James Gleick, Smithsonian magazine, May 2011

“What lies at the heart of every living thing is not a fire, not warm breath, not a ‘spark of life.’ It is information, words, instructions,” Richard Dawkins declared in 1986. Already one of the world’s foremost evolutionary biologists, he had caught the spirit of a new age. The cells of an organism are nodes in a richly interwoven communications network, transmitting and receiving, coding and decoding. Evolution itself embodies an ongoing exchange of information between organism and environment. “If you want to understand life,” Dawkins wrote, “don’t think about vibrant, throbbing gels and oozes, think about information technology.”
We have become surrounded by information technology; our furniture includes iPods and plasma displays, and our skills include texting and Googling. But our capacity to understand the role of information has been sorely taxed. “TMI,” we say. Stand back, however, and the past does come back into focus.
The rise of information theory aided and abetted a new view of life. The genetic code—no longer a mere metaphor—was being deciphered. Scientists spoke grandly of the biosphere: an entity composed of all the earth’s life-forms, teeming with information, replicating and evolving. And biologists, having absorbed the methods and vocabulary of communications science, went further to make their own contributions to the understanding of information itself.
Jacques Monod, the Parisian biologist who shared a Nobel Prize in 1965 for working out the role of messenger RNA in the transfer of genetic information, proposed an analogy: just as the biosphere stands above the world of nonliving matter, so an “abstract kingdom” rises above the biosphere. The denizens of this kingdom? Ideas.
“Ideas have retained some of the properties of organisms,” he wrote. “Like them, they tend to perpetuate their structure and to breed; they too can fuse, recombine, segregate their content; indeed they too can evolve, and in this evolution selection must surely play an important role.”
Ideas have “spreading power,” he noted—“infectivity, as it were”—and some more than others. An example of an infectious idea might be a religious ideology that gains sway over a large group of people. The American neurophysiologist Roger Sperry had put forward a similar notion several years earlier, arguing that ideas are “just as real” as the neurons they inhabit. Ideas have power, he said:
Ideas cause ideas and help evolve new ideas. They interact with each other and with other mental forces in the same brain, in neighboring brains, and thanks to global communication, in far distant, foreign brains. And they also interact with the external surroundings to produce in toto a burstwise advance in evolution that is far beyond anything to hit the evolutionary scene yet.
Monod added, “I shall not hazard a theory of the selection of ideas.” There was no need. Others were willing.
Dawkins made his own jump from the evolution of genes to the evolution of ideas. For him the starring role belongs to the replicator, and it scarcely matters whether replicators are made of nucleic acid. His rule is “All life evolves by the differential survival of replicating entities.” Wherever there is life, there must be replicators. Perhaps on other worlds replicators could arise in a silicon-based chemistry—or in no chemistry at all.
What would it mean for a replicator to exist without chemistry? “I think that a new kind of replicator has recently emerged on this very planet,” Dawkins proclaimed near the end of his first book, The Selfish Gene, in 1976. “It is staring us in the face. It is still in its infancy, still drifting clumsily about in its primeval soup, but already it is achieving evolutionary change at a rate that leaves the old gene panting far behind.” That “soup” is human culture; the vector of transmission is language, and the spawning ground is the brain.
For this bodiless replicator itself, Dawkins proposed a name. He called it the meme, and it became his most memorable invention, far more influential than his selfish genes or his later proselytizing against religiosity. “Memes propagate themselves in the meme pool by leaping from brain to brain via a process which, in the broad sense, can be called imitation,” he wrote. They compete with one another for limited resources: brain time or bandwidth. They compete most of all for attention. For example:
Ideas. Whether an idea arises uniquely or reappears many times, it may thrive in the meme pool or it may dwindle and vanish. The belief in God is an example Dawkins offers—an ancient idea, replicating itself not just in words but in music and art. The belief that Earth orbits the Sun is no less a meme, competing with others for survival. (Truth may be a helpful quality for a meme, but it is only one among many.)
Tunes. This tune has spread for centuries across several continents.
Catchphrases. One text snippet, “What hath God wrought?” appeared early and spread rapidly in more than one medium. Another, “Read my lips,” charted a peculiar path through late 20th-century America. “Survival of the fittest” is a meme that, like other memes, mutates wildly (“survival of the fattest”; “survival of the sickest”; “survival of the fakest”; “survival of the twittest”).
Images. In Isaac Newton’s lifetime, no more than a few thousand people had any idea what he looked like, even though he was one of England’s most famous men. Yet now millions of people have quite a clear idea—based on replicas of copies of rather poorly painted portraits. Even more pervasive and indelible are the smile of Mona Lisa, The Scream of Edvard Munch and the silhouettes of various fictional extraterrestrials. These are memes, living a life of their own, independent of any physical reality. “This may not be what George Washington looked like then,” a tour guide was overheard saying of the Gilbert Stuart portrait at the Metropolitan Museum of Art, “but this is what he looks like now.” Exactly.
Memes emerge in brains and travel outward, establishing beachheads on paper and celluloid and silicon and anywhere else information can go. They are not to be thought of as elementary particles but as organisms. The number three is not a meme; nor is the color blue, nor any simple thought, any more than a single nucleotide can be a gene. Memes are complex units, distinct and memorable—units with staying power.
Also, an object is not a meme. The hula hoop is not a meme; it is made of plastic, not of bits. When this species of toy spread worldwide in a mad epidemic in 1958, it was the product, the physical manifestation, of a meme, or memes: the craving for hula hoops; the swaying, swinging, twirling skill set of hula-hooping. The hula hoop itself is a meme vehicle. So, for that matter, is each human hula hooper—a strikingly effective meme vehicle, in the sense neatly explained by the philosopher Daniel Dennett: “A wagon with spoked wheels carries not only grain or freight from place to place; it carries the brilliant idea of a wagon with spoked wheels from mind to mind.” Hula hoopers did that for the hula hoop’s memes—and in 1958 they found a new transmission vector, broadcast television, sending its messages immeasurably faster and farther than any wagon. The moving image of the hula hooper seduced new minds by hundreds, and then by thousands, and then by millions. The meme is not the dancer but the dance.
For most of our biological history memes existed fleetingly; their main mode of transmission was the one called “word of mouth.” Lately, however, they have managed to adhere in solid substance: clay tablets, cave walls, paper sheets. They achieve longevity through our pens and printing presses, magnetic tapes and optical disks. They spread via broadcast towers and digital networks. Memes may be stories, recipes, skills, legends or fashions. We copy them, one person at a time. Alternatively, in Dawkins’ meme-centered perspective, they copy themselves.
“I believe that, given the right conditions, replicators automatically band together to create systems, or machines, that carry them around and work to favor their continued replication,” he wrote. This was not to suggest that memes are conscious actors; only that they are entities with interests that can be furthered by natural selection. Their interests are not our interests. “A meme,” Dennett says, “is an information-packet with attitude.” When we speak of fighting for a principle or dying for an idea, we may be more literal than we know.
Tinker, tailor, soldier, sailor…. Rhyme and rhythm help people remember bits of text. Or: rhyme and rhythm help bits of text get remembered. Rhyme and rhythm are qualities that aid a meme's survival, just as strength and speed aid an animal's. Patterned language has an evolutionary advantage. Rhyme, rhythm and reason—for reason, too, is a form of pattern. I was promised on a time to have reason for my rhyme; from that time unto this season, I received nor rhyme nor reason.
Like genes, memes have effects on the wide world beyond themselves. In some cases (the meme for making fire; for wearing clothes; for the resurrection of Jesus) the effects can be powerful indeed. As they broadcast their influence on the world, memes thus influence the conditions affecting their own chances of survival. The meme or memes comprising Morse code had strong positive feedback effects. Some memes have evident benefits for their human hosts (“Look before you leap,” knowledge of CPR, belief in hand washing before cooking), but memetic success and genetic success are not the same. Memes can replicate with impressive virulence while leaving swaths of collateral damage—patent medicines and psychic surgery, astrology and satanism, racist myths, superstitions and (a special case) computer viruses. In a way, these are the most interesting—the memes that thrive to their hosts’ detriment, such as the idea that suicide bombers will find their reward in heaven.
Memes could travel wordlessly even before language was born. Plain mimicry is enough to replicate knowledge—how to chip an arrowhead or start a fire. Among animals, chimpanzees and gorillas are known to acquire behaviors by imitation. Some species of songbirds learn their songs, or at least song variants, after hearing them from neighboring birds (or, more recently, from ornithologists with audio players). Birds develop song repertoires and song dialects—in short, they exhibit a birdsong culture that predates human culture by eons. These special cases notwithstanding, for most of human history memes and language have gone hand in glove. (Clichés are memes.) Language serves as culture’s first catalyst. It supersedes mere imitation, spreading knowledge by abstraction and encoding.
Perhaps the analogy with disease was inevitable. Before anyone understood anything of epidemiology, its language was applied to species of information. An emotion can be infectious, a tune catchy, a habit contagious. “From look to look, contagious through the crowd / The panic runs,” wrote the poet James Thomson in 1730. Lust, likewise, according to Milton: “Eve, whose eye darted contagious fire.” But only in the new millennium, in the time of global electronic transmission, has the identification become second nature. Ours is the age of virality: viral education, viral marketing, viral e-mail and video and networking. Researchers studying the Internet itself as a medium—crowdsourcing, collective attention, social networking and resource allocation—employ not only the language but also the mathematical principles of epidemiology.
One of the first to use the terms “viral text” and “viral sentences” seems to have been a reader of Dawkins named Stephen Walton of New York City, corresponding in 1981 with the cognitive scientist Douglas Hofstadter. Thinking logically—perhaps in the mode of a computer—Walton proposed simple self-replicating sentences along the lines of “Say me!” “Copy me!” and “If you copy me, I’ll grant you three wishes!” Hofstadter, then a columnist for Scientific American, found the term “viral text” itself to be even catchier.
Well, now, Walton’s own viral text, as you can see here before your eyes, has managed to commandeer the facilities of a very powerful host—an entire magazine and printing press and distribution service. It has leapt aboard and is now—even as you read this viral sentence—propagating itself madly throughout the ideosphere!
Hofstadter gaily declared himself infected by the meme meme.
One source of resistance—or at least unease—was the shoving of us humans toward the wings. It was bad enough to say that a person is merely a gene’s way of making more genes. Now humans are to be considered as vehicles for the propagation of memes, too. No one likes to be called a puppet. Dennett summed up the problem this way: “I don’t know about you, but I am not initially attracted by the idea of my brain as a sort of dung heap in which the larvae of other people’s ideas renew themselves, before sending out copies of themselves in an informational diaspora…. Who’s in charge, according to this vision—we or our memes?”
He answered his own question by reminding us that, like it or not, we are seldom “in charge” of our own minds. He might have quoted Freud; instead he quoted Mozart (or so he thought): “In the night when I cannot sleep, thoughts crowd into my mind…. Whence and how do they come? I do not know and I have nothing to do with it.”
Later Dennett was informed that this well-known quotation was not Mozart’s after all. It had taken on a life of its own; it was a fairly successful meme.
For anyone taken with the idea of memes, the landscape was changing faster than Dawkins had imagined possible in 1976, when he wrote, “The computers in which memes live are human brains.” By 1989, the time of the second edition of The Selfish Gene, having become an adept programmer himself, he had to amend that: “It was obviously predictable that manufactured electronic computers, too, would eventually play host to self-replicating patterns of information.” Information was passing from one computer to another “when their owners pass floppy discs around,” and he could see another phenomenon on the near horizon: computers connected in networks. “Many of them,” he wrote, “are literally wired up together in electronic mail exchange…. It is a perfect milieu for self-replicating programs to flourish.” Indeed, the Internet was in its birth throes. Not only did it provide memes with a nutrient-rich culture medium, it also gave wings to the idea of memes. Meme itself quickly became an Internet buzzword. Awareness of memes fostered their spread.
A notorious example of a meme that could not have emerged in pre-Internet culture was the phrase "jumped the shark." Loopy self-reference characterized every phase of its existence. To jump the shark means to pass a peak of quality or popularity and begin an irreversible decline. The phrase was thought to have been used first in 1985 by a college student named Sean J. Connolly, in reference to an episode of the television series "Happy Days" in which the character Fonzie (Henry Winkler), on water skis, jumps over a shark. The origin of the phrase requires a certain amount of explanation without which it could not have been initially understood. Perhaps for that reason, there is no recorded usage until 1997, when Connolly's roommate, Jon Hein, registered the domain name and created a web site devoted to its promotion. The web site soon featured a list of frequently asked questions:
Q. Did “jump the shark” originate from this web site, or did you create the site to capitalize on the phrase?
A. This site went up December 24, 1997, and gave birth to the phrase “jump the shark.” As the site continues to grow in popularity, the term has become more commonplace. The site is the chicken, the egg and now a Catch-22.
It spread to more traditional media in the next year; Maureen Dowd devoted a column to explaining it in the New York Times in 2001; in 2002 the same newspaper’s “On Language” columnist, William Safire, called it “the popular culture’s phrase of the year”; soon after that, people were using the phrase in speech and in print without self-consciousness—no quotation marks or explanation—and eventually, inevitably, various cultural observers asked, “Has ‘jump the shark’ jumped the shark?” Like any good meme, it spawned mutations. The “jumping the shark” entry in Wikipedia advised in 2009, “See also: jumping the couch; nuking the fridge.”
Is this science? In his 1983 column, Hofstadter proposed the obvious memetic label for such a discipline: memetics. The study of memes has attracted researchers from fields as far apart as computer science and microbiology. In bioinformatics, chain letters are an object of study. They are memes; they have evolutionary histories. The very purpose of a chain letter is replication; whatever else a chain letter may say, it embodies one message: Copy me. One student of chain-letter evolution, Daniel W. VanArsdale, listed many variants, in chain letters and even earlier texts: “Make seven copies of it exactly as it is written” (1902); “Copy this in full and send to nine friends” (1923); “And if any man shall take away from the words of the book of this prophecy, God shall take away his part out of the book of life” (Revelation 22:19). Chain letters flourished with the help of a new 19th-century technology: “carbonic paper,” sandwiched between sheets of writing paper in stacks. Then carbon paper made a symbiotic partnership with another technology, the typewriter. Viral outbreaks of chain letters occurred all through the early 20th century. Two subsequent technologies, when their use became widespread, provided orders-of-magnitude boosts in chain-letter fecundity: photocopying (c. 1950) and e-mail (c. 1995).
Inspired by a chance conversation on a hike in the Hong Kong mountains, information scientists Charles H. Bennett from IBM in New York and Ming Li and Bin Ma from Ontario, Canada, began an analysis of a set of chain letters collected during the photocopier era. They had 33, all variants of a single letter, with mutations in the form of misspellings, omissions and transposed words and phrases. “These letters have passed from host to host, mutating and evolving,” they reported in 2003.
Like a gene, their average length is about 2,000 characters. Like a potent virus, the letter threatens to kill you and induces you to pass it on to your “friends and associates”—some variation of this letter has probably reached millions of people. Like an inheritable trait, it promises benefits for you and the people you pass it on to. Like genomes, chain letters undergo natural selection and sometimes parts even get transferred between coexisting “species.”
Reaching beyond these appealing metaphors, the three researchers set out to use the letters as a “test bed” for algorithms used in evolutionary biology. The algorithms were designed to take the genomes of various modern creatures and work backward, by inference and deduction, to reconstruct their phylogeny—their evolutionary trees. If these mathematical methods worked with genes, the scientists suggested, they should work with chain letters, too. In both cases the researchers were able to verify mutation rates and relatedness measures.
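The distance measure that makes such reconstructions possible can be sketched in a few lines of Python. This is an illustrative toy, not the researchers' actual pipeline: it uses the normalized compression distance, which treats a general-purpose compressor as a rough stand-in for shared information content, so that mutated copies of a letter score as "close" and unrelated texts score as "far." The sample letter variants below are invented for demonstration.

```python
import zlib


def compressed_size(data: bytes) -> int:
    """Compressed length, a crude proxy for information content."""
    return len(zlib.compress(data, 9))


def ncd(a: str, b: str) -> float:
    """Normalized compression distance: near 0 for near-duplicates,
    approaching 1 (or slightly above, with real compressors) for
    unrelated texts."""
    x, y = a.encode(), b.encode()
    cx, cy, cxy = compressed_size(x), compressed_size(y), compressed_size(x + y)
    return (cxy - min(cx, cy)) / max(cx, cy)


# Toy "chain letter" variants: v2 is a mutated copy of v1 (misspellings,
# a transposition); v3 is an unrelated text.
v1 = "Make seven copies of this letter exactly as it is written and send them on."
v2 = "Make seven copies of this leter exactly as it is writen and send them on."
v3 = "And if any man shall take away from the words of the book of this prophecy."

# Mutated copies should sit closer together than unrelated texts.
print(ncd(v1, v2) < ncd(v1, v3))
```

Feeding a matrix of such pairwise distances into a standard clustering or tree-building routine yields a family tree of the letters, which is the shape of the analysis the paragraph above describes.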
Still, most of the elements of culture change and blur too easily to qualify as stable replicators. They are rarely as neatly fixed as a sequence of DNA. Dawkins himself emphasized that he had never imagined founding anything like a new science of memetics. A peer-reviewed Journal of Memetics came to life in 1997—published online, naturally—and then faded away after eight years partly spent in self-conscious debate over status, mission and terminology. Even compared with genes, memes are hard to mathematize or even to define rigorously. So the gene-meme analogy causes uneasiness and the genetics-memetics analogy even more.
Genes at least have a grounding in physical substance. Memes are abstract, intangible and unmeasurable. Genes replicate with near-perfect fidelity, and evolution depends on that: some variation is essential, but mutations need to be rare. Memes are seldom copied exactly; their boundaries are always fuzzy, and they mutate with a wild flexibility that would be fatal in biology. The term "meme" could be applied to a suspicious cornucopia of entities, from small to large. For Dennett, the first four notes of Beethoven's Fifth Symphony were "clearly" a meme, along with Homer's Odyssey (or at least the idea of the Odyssey), the wheel, anti-Semitism and writing. "Memes have not yet found their Watson and Crick," said Dawkins; "they even lack their Mendel."
Yet here they are. As the arc of information flow bends toward ever greater connectivity, memes evolve faster and spread farther. Their presence is felt if not seen in herd behavior, bank runs, informational cascades and financial bubbles. Diets rise and fall in popularity, their very names becoming catchphrases—the South Beach Diet and the Atkins Diet, the Scarsdale Diet, the Cookie Diet and the Drinking Man’s Diet all replicating according to a dynamic about which the science of nutrition has nothing to say. Medical practice, too, experiences “surgical fads” and “iatro-epidemics”—epidemics caused by fashions in treatment—like the iatro-epidemic of children’s tonsillectomies that swept the United States and parts of Europe in the mid-20th century. Some false memes spread with disingenuous assistance, like the apparently unkillable notion that Barack Obama was not born in Hawaii. And in cyberspace every new social network becomes a new incubator of memes. Making the rounds of Facebook in the summer and fall of 2010 was a classic in new garb:
Sometimes I Just Want to Copy Someone Else’s Status, Word for Word, and See If They Notice.
Then it mutated again, and in January 2011 Twitter saw an outbreak of:
One day I want to copy someone’s Tweet word for word and see if they notice.
By then one of the most popular of all Twitter hashtags (the “hashtag” being a genetic—or, rather, memetic—marker) was simply the word “#Viral.”
In the competition for space in our brains and in the culture, the effective combatants are the messages. The new, oblique, looping views of genes and memes have enriched us. They give us paradoxes to write on Möbius strips. “The human world is made of stories, not people,” writes the novelist David Mitchell. “The people the stories use to tell themselves are not to be blamed.” Margaret Atwood writes: “As with all knowledge, once you knew it, you couldn’t imagine how it was that you hadn’t known it before. Like stage magic, knowledge before you knew it took place before your very eyes, but you were looking elsewhere.” Nearing death, John Updike reflected on
A life poured into words—apparent waste intended to preserve the thing consumed.
Fred Dretske, a philosopher of mind and knowledge, wrote in 1981: “In the beginning there was information. The word came later.” He added this explanation: “The transition was achieved by the development of organisms with the capacity for selectively exploiting this information in order to survive and perpetuate their kind.” Now we might add, thanks to Dawkins, that the transition was achieved by the information itself, surviving and perpetuating its kind and selectively exploiting organisms.
Most of the biosphere cannot see the infosphere; it is invisible, a parallel universe humming with ghostly inhabitants. But they are not ghosts to us—not anymore. We humans, alone among the earth’s organic creatures, live in both worlds at once. It is as though, having long coexisted with the unseen, we have begun to develop the needed extrasensory perception. We are aware of the many species of information. We name their types sardonically, as though to reassure ourselves that we understand: urban myths and zombie lies. We keep them alive in air-conditioned server farms. But we cannot own them. When a jingle lingers in our ears, or a fad turns fashion upside down, or a hoax dominates the global chatter for months and vanishes as swiftly as it came, who is master and who is slave?

Adapted from The Information: A History, A Theory, A Flood, by James Gleick. Copyright © 2011 by James Gleick. Reprinted with the permission of the author.
James Gleick is the author of Chaos: Making a New Science, among other books. Illustrator Stuart Bradford lives in San Rafael, California.


Ron Paul’s Farewell Speech in Congress Lays Bare His Hatred for “Pure Democracy,” and Love of Oligarchy

Consortium News [1] / By Robert Parry [2] November 28, 2012

Rep. Ron Paul, an icon to the libertarian Right and to some on the anti-war Left, gave a farewell address to Congress that expressed his neo-Confederate interpretation of the Constitution and his anti-historical view of the supposedly good old days of laissez-faire capitalism.

In a near-hour-long rambling speech [3] on Nov. 14, Paul also revealed himself to be an opponent of “pure democracy” because government by the people and for the people tends to infringe on the “liberty” of businessmen who, in Paul’s ideal world, should be allowed to do pretty much whatever they want to the less privileged.

In Paul’s version of history, the United States lost its way at the advent of the Progressive Era about a century ago. “The majority of Americans and many government officials agreed that sacrificing some liberty was necessary to carry out what some claimed to be ‘progressive’ ideas,” said the 77-year-old Texas Republican. “Pure democracy became acceptable.”

Before then, everything was working just fine, in Paul’s view. But the reality was anything but wonderful for the vast majority of Americans. A century ago, women were denied the vote by law and many non-white males were denied the vote in practice. Uppity blacks were frequently lynched.

The surviving Native Americans were confined to oppressive reservations at the end of a long process of genocide. Conditions weren’t much better for the white working class. Many factory workers toiled 12-hour days and six-day weeks in very dangerous conditions, and union organizers were targeted for reprisals and sometimes death.

For small businessmen, life was treacherous, too, with the big monopolistic trusts overcharging for key services and with periodic panics on Wall Street rippling out across the country in bank failures, bankruptcies and foreclosures.

Meanwhile, obscenely rich Robber Barons, like John D. Rockefeller, Andrew Carnegie and J.P. Morgan, personally controlled much of the nation’s economy and manipulated the political process through bribery. They were the ones who owned the real “liberty.”

It took the Great Depression and its mass suffering to finally convince most Americans “that sacrificing some liberty was necessary,” in Paul’s curious phrasing, for them to gain a living wage, a measure of security and a little respect.

So, under President Franklin Roosevelt, laws were changed to shield working Americans from the worst predations of the super-rich. Labor standards were enacted; unions were protected; regulations were imposed on Wall Street; and the nation’s banks were made more secure to protect the savings of depositors.

Many social injustices also were addressed during Ron Paul’s dreaded last century. Women got the vote and their position in the country gradually improved, as it did for blacks and other minorities with the belated enforcement of the equal rights provisions of the 14th Amendment and passage of civil rights legislation.

The reforms from the Progressive Era, the New Deal and the post-World War II era also contributed to a more equitable distribution of the nation’s wealth, making the United States a richer and stronger country. The reforms, initiated by the federal government, essentially created the Great American Middle Class.

Paul’s Complaint

But in Paul’s view, the reformers should have left things the way they were – and he blames the reforms for today’s problems, although how exactly they’re connected is not made clear.

Paul said: “Some complain that my arguments make no sense, since great wealth and the standard of living improved for many Americans over the last 100 years, even with these new policies. But the damage to the market economy, and the currency, has been insidious and steady.

“It took a long time to consume our wealth, destroy the currency and undermine productivity and get our financial obligations to a point of no return. Confidence sometimes lasts longer than deserved. Most of our wealth today depends on debt.

“The wealth that we enjoyed and seemed to be endless, allowed concern for the principle of a free society to be neglected. As long as most people believed the material abundance would last forever, worrying about protecting a competitive productive economy and individual liberty seemed unnecessary.”

But Paul's blaming of "progressive" reforms of the last century for the nation's current economic mess lacks any logic. It is more a rhetorical trick than a rational argument, a sophistry holding that because one thing happened and then some bad things happened, the first thing must have caused the others.

The reality is much different. Without Theodore Roosevelt’s Progressive Era and Franklin Roosevelt’s New Deal, the direction of America’s capitalist system was toward disaster, not prosperity. Plus, the only meaningful “liberty” was that of a small number of oligarchs looting the nation’s wealth. (It would make more sense to blame the current debt problem on the overreach of U.S. imperialism, the rush to “free trade,” the unwise relaxing of economic regulations, and massive tax cuts for the rich.)

Besides his reactionary fondness for the Gilded Age, Paul also embraces an anti-historical attitude toward the Founding Era. He claimed that the Constitution failed not only because of the 20th Century’s shift toward “pure democracy” but because of a loss of moral virtue among the populace.

“Our Constitution, which was intended to limit government power and abuse, has failed,” Paul said. “The Founders warned that a free society depends on a virtuous and moral people. The current crisis reflects that their concerns were justified.”

However, there’s no compelling evidence that people were more moral in 1787 or in 1912 than they are today. Indeed, one could argue that many slave-owning Founders were far less moral than Americans are now, a time when tolerance of racial, gender and other differences is much greater.

And as for the late 19th and early 20th centuries, the pious morality of the Robber Barons included the cruel exploitation of their workers, the flaunting of obscene wealth amid widespread poverty, and the routine bribery of politicians. How that measures up to moral superiority is a mystery.

In his speech, Paul declared that “a society that boos or ridicules the Golden Rule is not a moral society,” but many of the Founders and the Robber Barons did not follow the Golden Rule either. They inflicted on others great pain and suffering that they would not want for themselves.

Misreading the Constitution

Paul’s historical incoherence extends to what the Framers were doing with the Constitution. He argues that they were seeking “to limit government” in 1787 when they drafted the Constitution. But that was not their primary intent. The Framers were creating a strong and vibrant central government to replace the weak and ineffective one that existed under the Articles of Confederation.

Of course, by definition, all constitutions set limits on the power of governments. That’s what constitutions do and the U.S. Constitution is no exception. However, if the Framers wanted a weak central government and strong states’ rights, they would not have scrapped the Articles of Confederation, which governed the United States from 1777 to 1787. The Articles made the states “independent” and “sovereign” and left the federal government as a supplicant.

The key point, which Paul and other right-wingers seek to obscure about the Constitution, is that it granted broad powers to the central government along with the mandate to address the nation’s “general Welfare.”

The key Framers of the Constitution, particularly George Washington and James Madison, were pragmatists who understood that a strong and effective central government was necessary to protect the independence of a large and sprawling nation. For that reason, they recognized that the Articles had been a failure, preventing the 13 states from functioning as a cohesive nation. Indeed, the Articles didn’t even recognize the United States as a government, but rather as a “league of friendship.”

General Washington, in particular, hated the Articles because they had left his Continental Army begging individual states for supplies during the Revolutionary War. And after the hard-won independence, Washington saw European powers exploiting the divisions among the states and regions to whittle away that independence.

The whole American enterprise was threatened by the principle of states’ rights because national coordination was made almost impossible. It was that recognition which led Madison, with Washington’s firm support, to seek first to amend the Articles and ultimately to throw them out.

When Madison was trying to get Virginia’s endorsement of an amendment to give the federal government power to regulate commerce, Washington wrote: “the proposition in my opinion is so self evident that I confess I am at a loss to discover wherein lies the weight of the objection to the measure.

“We are either a united people, or we are not. If the former, let us, in all matters of a general concern act as a nation, which have national objects to promote, and a national character to support. If we are not, let us no longer act a farce by pretending it to be.” [For more on this background, see Robert Parry’s America’s Stolen Narrative [4].]

On to Philadelphia

After Madison was stymied on his commerce proposal in the Virginia legislature, he and Washington turned their attention to a convention that was technically supposed to propose changes to the Articles of Confederation but, in secrecy, chose to dump them entirely.

When the convention convened in Philadelphia in spring 1787, it was significant that Madison's idea of the federal government regulating commerce came up on the first day of substantive debate.

As the Constitution took shape – and the Convention spelled out the sweeping “enumerated powers” to be granted to Congress – Madison’s Commerce Clause was near the top, right after the power to tax, to pay debts, to “provide for the common Defence and general Welfare,” and to borrow money – and even above the power to declare war. Yes, the Right’s despised Commerce Clause, which was the legal basis for many reforms of the 20th Century, was among the “enumerated powers” in Article 1, Section 8.

And gone was language from the Articles of Confederation that had declared the states "sovereign" and "independent." Under the Constitution, federal law was supreme and the laws of the states could be struck down by the federal courts.

Immediately, the supporters of the old system recognized what had happened. As dissidents from the Pennsylvania delegation wrote: “We dissent … because the powers vested in Congress by this constitution, must necessarily annihilate and absorb the legislative, executive, and judicial powers of the several states, and produce from their ruins one consolidated government.”

A movement of Anti-Federalists arose, led by the likes of Patrick Henry, to defeat the Constitution. They organized strong opposition in the states' ratifying conventions of 1788 but ultimately lost, after winning the concession from Madison to enact a Bill of Rights during the first Congress.

The inclusion of the Tenth Amendment, which reserves for the states and the people powers that the Constitution does not give to the federal government, is the primary hook upon which the modern Right hangs its tri-corner hat of anti-federal ideology.

But the amendment was essentially a sop to the Anti-Federalists with little real meaning because the Constitution had already granted broad powers to the federal government and stripped the states of their earlier dominance.

Remaking Madison

The Right’s “scholars” also make much of a few quotes from Madison’s Federalist Paper No. 45, in which he sought to play down how radical a transformation, from state to federal power, he had engineered in the Constitution. Rather than view this essay in context, the Right seizes on Madison’s rhetorical attempts to deflect the alarmist Anti-Federalist attacks by claiming that some of the Constitution’s federal powers were already in the Articles of Confederation, albeit in a far weaker form.

In Federalist Paper No. 45, entitled “The Alleged Danger From the Powers of the Union to the State Governments Considered,” Madison wrote: “If the new Constitution be examined with accuracy, it will be found that the change which it proposes consists much less in the addition of NEW POWERS to the Union, than in the invigoration of its ORIGINAL POWERS.” Today’s Right also trumpets Madison’s summation, that “the powers delegated by the proposed Constitution to the federal government are few and defined. Those which are to remain in the State governments are numerous and indefinite.”

But it should be obvious that Madison is finessing his opposition. Whether or not some shadow of these federal powers existed in the Articles of Confederation, they were dramatically enhanced by the Constitution. In No. 45, Madison even plays down his prized Commerce Clause, acknowledging that “The regulation of commerce, it is true, is a new power; but that seems to be an addition which few oppose, and from which no apprehensions are entertained.”

However, in Federalist Paper No. 14, Madison made clear how useful the Commerce Clause could be as he envisioned national construction projects.

“[T]he union will be daily facilitated by new improvements,” Madison wrote. “Roads will everywhere be shortened, and kept in better order; accommodations for travelers will be multiplied and meliorated; an interior navigation on our eastern side will be opened throughout, or nearly throughout the whole extent of the Thirteen States.

“The communication between the western and Atlantic districts, and between different parts of each, will be rendered more and more easy by those numerous canals with which the beneficence of nature has intersected our country, and which art finds it so little difficult to connect and complete.”

Founding Pragmatism

The Framers also understood that the country would not remain locked in a late 18th Century world. Though they could not anticipate all the changes that would arise over more than two centuries, they incorporated broad powers into the Constitution so the country, through its elected representatives, could adapt to changing times.

The true genius of the Framers was their pragmatism, both for good and ill, in the cause of protecting American independence and unity. On the for-ill side, many representatives in Philadelphia recognized the evils of slavery but accepted a compromise allowing the states to count African-American slaves as three-fifths of a person for the purpose of representation in Congress.

On the for-good side, the Framers recognized that the American system could not work without a strong central government with the power to enforce national standards, so they created one. They transferred national sovereignty from the 13 “independent” states to “We the people.” And they gave the central government the authority to provide for the “general Welfare.”

Yet, the fight over America’s founding principles didn’t end with the Constitution’s ratification in 1788. Faced with a growing emancipation movement – and losing ground to the industrial North – the Southern slave states challenged the power of the federal government to impose its laws on the states. President Andrew Jackson fought back against Southern “nullification” of federal law in 1832 and the issue of federal supremacy was fought out in blood during the Civil War from 1861-65.

Even after the Civil War, powerful regional and economic forces resisted the imposition of federal law, whether intended to benefit freed slaves or to regulate industry. In the latter third of the 19th Century, as Jim Crow laws turned blacks into second-class citizens, John D. Rockefeller, Andrew Carnegie and J.P. Morgan created industrial monopolies that rode roughshod over working-class Americans.

For different reasons, the South’s agrarian oligarchs and the North’s industrial oligarchs wanted the federal government to stay out of their affairs – and they largely succeeded by wielding immense political power until the 20th Century.

Then, in the face of widespread abuses, President Theodore Roosevelt went after the “trusts,” President Franklin Roosevelt responded to the Great Depression with the New Deal, and post-World War II presidents and federal courts began the process of overturning racial segregation.

The Right’s Emergence

In reaction to those changes – federal regulation of the economy and rejection of overt racial discrimination – the modern American Right emerged as a sometimes uneasy coalition between the “free-marketeers” and the neo-Confederates, sharing a mutual hatred of modern liberalism.

Those two groups also drew in other constituencies harboring resentments against liberals, such as the Christian Right – angered over Supreme Court prohibitions on compulsory prayers in public schools and abortion rights for women – and war hawks, drawn from the ranks of military contractors and neoconservative ideologues.

These right-wing movements recognized the importance of propaganda and thus – in the 1970s – began investing heavily in an infrastructure of think tanks and ideological media that would develop supportive narratives and disseminate those storylines to the American people.

It was especially important to convince Americans that the New Deal and federal interference in “states’ rights” were a violation of the Founders’ core principles. Thus, the Right could pretend that it was standing up for the U.S. Constitution and the Left was out of step with American “liberty.”

So, right-wing “scholars” transformed the purpose of the Constitutional Convention and recreated James Madison in particular. Under the Right’s revisionist history, the Constitution was drafted to constrain the power of the federal government and to ensure the supremacy of states’ rights. A few Madison quotes were cherry-picked from the Federalist Papers and the significance of the Tenth Amendment was exaggerated.

The success of the pseudo-history can’t be overstated. From the Tea Party, which arose in angry determination to “take back our country” from the first African-American president, to the hip libertarians who turned the quirky Ron Paul into a cult figure, there was a certainty that they were channeling the true vision of the American Founders.

A large segment of the American Left also embraced Ron Paul because his ideology included a rejection of imperial military adventures and a disdain for government intrusion into personal lives (although he is a devout “right-to-lifer” who would deny women the right to have an abortion).

Paul’s mix of libertarianism and anti-imperialism has proven especially attractive to young white men. He is viewed by some as a principled prophet, predicting chaos because the nation has deviated from the supposed path of “liberty.”

However, as his farewell address revealed, his ideology is a jumble of anti-historical claims and emotional appeals. For instance, he posed unserious questions like “Why can’t Americans decide which type of light bulbs they can buy?” – apparently oblivious to the need for energy conservation and the threat of global warming.

In the end, Ron Paul comes across as little more than a political crank whose few good ideas are overwhelmed by his neo-Confederate thinking and his sophistry about the inherent value of free-market economics.




Obama’s challenge: Thinking big

By David Ignatius, Washington Post,  November 2, 2012

“Why We Can’t Solve Big Problems” says the provocative headline in the current issue of MIT Technology Review. This package ought to go in President Obama’s reading pile as he ponders his January inaugural address and second-term agenda.

Jason Pontin, editor-in-chief of the MIT review, introduces his theme by recalling the high age of space exploration — the incredible decade in which the United States, from a standing start, achieved President John F. Kennedy’s promise to put a man on the moon by the end of the 1960s.

“The strongest emotion at the time of the moon landings was of wonder at the transcendent power of technology,” writes Pontin. That sense of awe has diminished, if not disappeared. There hasn’t been a human being on the moon since 1972. And as Pontin writes, “big problems that people had imagined technology would solve, such as hunger, poverty, malaria, climate change, cancer, and the diseases of old age, have come to seem intractably hard.”

The point of Pontin’s exercise, as you might have guessed, is to say that these big problems are, in fact, solvable, if the United States and other advanced countries will widen their ambitions, their public research budgets and their willingness to take risks.

The MIT review gathers a series of manifestos for big-think ideas that are feasible, now. The list includes plans for: carbon capture to slow climate change; genomic medicine to target the array of cellular malfunctions that go under the heading of “cancer”; solar grids to bring electricity to the world’s poorest people; robotic manufacturing and online education to mass produce knowledge and good engineering techniques; a new assault on Alzheimer’s and other forms of dementia; and, yes, a mission to Mars.

Why aren’t these big ideas funded today? Pontin identifies one important factor as the decline in spending for energy research and development, which has fallen from 10 percent of total R&D spending in 1979 to just 2 percent today.

A second, more interesting cause is what Pontin says is a tendency among venture capitalists and other investors to look for small tweaks rather than big, disruptive technology breakthroughs. He quotes Bruce Gibney, a venture capitalist at the San Francisco-based Founders Fund, who offers a harsh explanation: “In the late 1990s, venture portfolios began to reflect a different sort of future. . . . Venture investing shifted away from funding transformational companies and toward companies that solved incremental problems or even fake problems. . . . VC has ceased to be the funder of the future, and instead has become a funder of features, widgets, irrelevances.”

Investors would respond that they’re still looking for the big ideas, so long as they are attached to a reasonable business model. (Indeed, the person who alerted me to the MIT discussion is Pradeep Ramamurthy, a former Obama administration official who now works for a private equity firm called Abraaj Capital.)

Here’s where Obama can make a difference in setting expectations about the future. As he reminded us so often during the presidential campaign, the past four years were largely about rebuilding the damage of the recession and managing orderly retreats from costly foreign wars. This was a period of low expectations, low returns on investment and low tolerance for risk. The president’s own cautious style was a mirror for that of Wall Street investors, who, whatever they might claim, were thinking even smaller than the president.

Can America think bigger during the next four years — not in the usual terms of expansive foreign policy but in terms of rebuilding its economic and technological mastery? It’s likely that Obama will get a budget deal that builds a sound macro-economic foundation for growth, but how will he build on it?

Here’s where a new White House partnership with business can be crucial: It would signal to the country that the president and the leaders of the nation’s biggest finance, tech and manufacturing companies are all going in the same direction. By the end of Obama’s term, America will be approaching energy self-sufficiency and will be a low-cost producer for products that use energy. It’s not crazy, given these fundamentals, to talk about an American revival.

But thinking big about the American economy will require stronger political vision. Except for occasional glimmers, Obama hasn’t shown the quality of sustained, strategic leadership that would make him a transformational president. His team won a political victory that was a piece of genius. Can the White House translate that momentum into a real agenda for governing and growth?

Meet The Radical Republicans Chairing Important House Committees

By Zack Beauchamp on Nov 28, 2012



House Speaker John Boehner (R-OH) has announced the new House committee leaders: a full slate of white men. While many of these Congressmen are holding on to positions they’ve already got, there are a few new faces sitting in the Chairperson’s seat. What follows is ThinkProgress’ guide to the views of five of the new committee chairs on the issues they’ll be in charge of, which range from climate change to immigration to financial regulation:

Lamar Smith (Texas) — Science, Space and Technology – Like his predecessor, Rep. Smith is a climate change skeptic. Smith refers to supporters of the scientific consensus as “global warming alarmists” and has criticized the media for not giving equal time to warming skeptics. His official website does say warming is occurring, but does not, as the consensus does, cite human activity as the cause. Unsurprisingly, Smith received significant donations from both Koch Industries and the oil and gas sector in his most recent campaign. The new House point man on technology is also the author of the terrible Stop Online Piracy Act (SOPA) and opposes potentially life-saving embryonic stem cell research.

Jeb Hensarling (Texas) — Financial Services – Rep. Hensarling will be the point Republican on anything relating to the financial sector, but his candidacy was underwritten by Wall Street: banks donated more than seven times as much as the next largest industry to Hensarling’s reelection campaign. Perhaps unsurprisingly, Hensarling wants to take down the Dodd-Frank regulations and thinks taxing the financial industry is “frankly ludicrous.” Hensarling has also called Social Security, Medicare, and Medicaid “cruel Ponzi schemes.”

Ed Royce (California) — Foreign Affairs -  Rep. Royce has a questionable history with respect to people from diverse cultures and backgrounds: last year, he told an anti-Muslim rally that multiculturalism “has paralyzed too many of our citizens to make the critical judgement we need to make to prosper as a society.” He also appears on lead Islamophobic propagandist Frank Gaffney’s radio show, proposed a national version of Arizona’s “papers, please” immigration law, and allegedly sent mailers accusing his Taiwanese-American opponent in the 2012 election of being funded by Chinese Communists.

Michael McCaul (Texas) — Homeland Security -  Rep. McCaul, Congress’ richest member, seems primed to carry on his predecessor Peter King’s hardline legacy. McCaul enthusiastically endorsed King’s hearings on Islamic terrorism that, according to the Southern Poverty Law Center, “demonized” Muslims. He’s also a drug warrior who proposed legislation designating Mexican cartels “foreign terrorist organizations,” a move that infuriated the Mexican government and would have given the DEA access to enhanced counterterrorism powers. McCaul has also celebrated Arizona’s discriminatory “show me your papers” immigration law and compared President Obama to King George III.

Bob Goodlatte (Virginia) — Judiciary -  Rep. Goodlatte, like Rep. Royce, is staunchly anti-immigrant, opposing a pathway to citizenship and calling the DREAM act “ripe for fraud.” The Judiciary Committee has principal jurisdiction on immigration. Moreover, Goodlatte holds fringe views on the Constitution: he believes that Social Security and Medicare are unconstitutional, and that the federal minimum wage may be.

Moral values and the fiscal cliff

By Jonathan Haidt and Hal Movius, Washington Post, November 27, 2012



The fiscal cliff negotiations remind us of the long-running game show “Beat the Clock.” Couples had to perform a stunt, such as tying their shoelaces together using only their left hands, before a large clock ticked down to zero. The host would often introduce a twist at the last minute, something like, “Oh, and one more thing, you have to do this while members of the audience throw tomatoes at you.”

President Obama and House Speaker John Boehner must do something far harder than tying their shoelaces together before the clock ticks down to January 1. They have to reach a deal themselves, and then convince majorities in the House and Senate to go along. Oh, and one more thing, they have to do this while being pilloried by their respective bases. What can they do to improve their odds of beating the clock? Moral psychology can help.

Human beings are “super-cooperators,” the only species on the planet that can form cohesive teams out of non-siblings. Part of our evolved mental toolkit for teamwork is our ability to make something sacred—a rock, a tree, a flag, a person or a principle—and then circle around it, literally or figuratively. It’s not just religions that do this. Sports teams, fraternities, political parties and nations at war all enhance their cohesiveness by generating heroes, taboos and pledges to uphold certain ideals or commitments.

But the psychology of sacredness makes it harder for negotiators to execute tradeoffs in a utilitarian way. When the Republican presidential candidates all said they would walk away from a deal that offered 10 dollars of spending cuts for each dollar of tax increases, they revealed that tax increases had become a form of sacrilege for the Republican Party—though the recent moves by several Republicans to disavow Grover Norquist’s tax pledge suggest this might be changing.

Sharing moral commitments helps teams to function cohesively, but it also blinds them to reality. They select arguments and narratives that support their preferred policies while denying facts that threaten or contradict their commitments. They sometimes vote for symbolism over substance, even when it harms their material interests or long-term goals. High-stakes negotiations are hard enough, but when sacred values are in play, the odds of success go way down. (Just ask the Israelis and Palestinians.)

So what can our political leaders do to convince their supporters to accept a deal averting the fiscal cliff?

First, they should negotiate—and describe their progress—only in terms of overall packages of options across spending and revenues. Taken alone, any single issue such as tax rates is likely to trigger diametrically opposed responses and invocations of moral duties. Yet taken together, each side can find specific moral victories. In this case, that could be reining in the growth of government, for Republicans, and making taxes more progressive, for Democrats.

Second, they should jointly call for shared sacrifice. When Winston Churchill became prime minister in 1940, he told Britons that he had nothing to offer except “blood, toil, tears and sweat.” In doing so he activated a powerful psychological mechanism that makes people willing to bear burdens and pay costs when the group’s survival is at stake, and when everyone is called on to pull together as a team.

If our leaders want to be statesmen rather than panderers, they need to do the same. Pledges to protect this or that group from all sacrifice are as counterproductive as pledges never to raise taxes. President Obama and Speaker Boehner should develop shared language to convey to the American people the severity of our problems and the need for all Americans to make some sacrifices.

They can also start using contingent agreements to break impasses. Each side has its own experts, facts and forecasts that yield different conclusions about, say, whether tax increases will slow growth. This invariably stalls policymaking before it even gets a real start. One way to break the stalemate is for negotiators to structure some of the key provisions in the form of “if…then…” statements. In the case of tax increases, an agreement might stipulate that if growth falls below a 2-percent rate for three consecutive quarters, certain revenue-increasing measures will be scaled back for a specified period.

It may seem counterintuitive, but our political leaders should avoid using the word “compromise” too often. When moral values are at stake, those who compromise may be seen as morally compromised. Compromise will be essential, but it would be more effective for each side to describe its determination to find common ground, and its flexibility and openness in finding novel ways to achieve its long-term goals.

Finally, when the clock has ticked down nearly to zero and an agreement is near, each side can calm partisan passions by invoking the virtue of humility. Benjamin Franklin weighed in on the last day of the constitutional convention with these words: “I cannot help expressing a wish that every member of the Convention who may still have objections to it, would with me, on this occasion doubt a little of his own infallibility, and to make manifest our unanimity, put his name to this instrument.”

The agreement ultimately reached on the fiscal cliff will not be as exalted as the Constitution, but it can be presented to Congress and the nation as a test of whether we the people are still able, 225 years later, to secure the blessings of liberty to ourselves and our posterity.

Jonathan Haidt is a professor of business ethics at the NYU-Stern School of Business, and is the author of The Righteous Mind: Why Good People Are Divided by Politics and Religion. Hal Movius is the president of Movius Consulting, and is the author of Built to Win: Creating a World-Class Negotiating Organization.

CEO Council Demands Cuts To Poor, Elderly While Reaping Billions In Government Contracts, Tax Breaks

By Christina Wilkie and Ryan Grim, 11/25/2012

WASHINGTON — The corporate CEOs who have made a high-profile foray into deficit negotiations have themselves been substantially responsible for the size of the deficit they now want closed.

The companies represented by executives working with the Campaign To Fix The Debt have received trillions in federal war contracts, subsidies and bailouts, as well as specialized tax breaks and loopholes that virtually eliminate the companies’ tax bills.

The CEOs are part of a campaign run by the Peter Peterson-backed Committee for a Responsible Federal Budget, which plans to spend at least $30 million pushing for a deficit reduction deal in the lame-duck session and beyond.

During the past few days, CEOs belonging to what the campaign calls its CEO Fiscal Leadership Council — most visibly, Goldman Sachs’ Lloyd Blankfein and Honeywell’s David Cote — have barnstormed the media, making the case that the only way to cut the deficit is to severely scale back social safety-net programs — Medicare, Medicaid, and Social Security — which would disproportionately impact the poor and the elderly.

As part of their push, they are advocating a “territorial tax system” that would exempt their companies’ foreign profits from taxation, netting them about $134 billion in tax savings, according to a new report from the Institute for Policy Studies titled “The CEO Campaign to ‘Fix’ the Debt: A Trojan Horse for Massive Corporate Tax Breaks” — money that could help pay off the federal budget deficit.

Yet the CEOs are not offering to forgo federal money or pay a higher tax rate, on their personal income or corporate profits. Instead, council recommendations include cutting “entitlement” programs, as well as what they call “low-priority spending.”

Many of the companies recommending austerity would be out of business without the heavy federal support they get, including Goldman Sachs and JPMorgan Chase, which both received billions in direct bailout cash, plus billions more indirectly through AIG and other companies taxpayers rescued.

Just three of the companies — GE, Boeing and Honeywell — were handed nearly $28 billion last year in federal contracts alone. A spokesman for Campaign To Fix The Debt did not respond to an email from The Huffington Post over the weekend.

The CEO council recommends two major avenues that it claims will produce “at least $4 trillion of deficit reduction.” The first is to “replace mindless, abrupt deficit reduction with thoughtful changes that reform the tax code and cut low-priority spending.” The second is to “keep debt under control over the long-term by focusing on the long-term growth of entitlement programs.”

CEOs are encouraged to present a Fix-The-Debt PowerPoint presentation to their “employee town hall [meetings and] company meetings.” To further help get the word out, the campaign borrowed a page from the CEOs who this fall wrote letters encouraging their employees to vote for Mitt Romney or face job cuts. This time, the CFD has created two templates for bosses to use at their companies.

But in the past week, in order to make their case to the millions of Americans who don’t work for them, CEOs fanned out across television to convince the rest of the country that slashing the social safety net is the only way to reduce the deficit.

In an interview aired Monday, Goldman Sachs chairman and CEO Lloyd Blankfein said Social Security “wasn’t devised to be a system that supported you for a 30-year retirement after a 25-year career.” The key to cutting Social Security, he said, was simply a matter of teaching people to expect less.

“You’re going to have to do something, undoubtedly, to lower people’s expectations of what they’re going to get,” Blankfein told CBS, “the entitlements, and what people think they’re going to get, because you’re not going to get it.”

Blankfein and Goldman Sachs don’t have to worry about lowering expectations. After receiving a $10 billion federal bailout in 2008, and paying it back a few years later, Goldman Sachs recently exceeded Wall Street analysts’ expectations by announcing $8.4 billion in third quarter revenues for 2012. On the heels of a great year, Blankfein is expected to take home an even larger salary than he did in 2011, when he made $16.1 million.

To understand the importance of banking profits to the members of the deficit council, one need look no further than the two top-ranking members of the Campaign To Fix The Debt’s steering committee, former New Hampshire Sen. Judd Gregg (R) and former Pennsylvania Gov. Ed Rendell, a Democrat. Gregg is currently employed as an international adviser to Goldman Sachs, while Rendell collects his paycheck from the boutique investment bank Greenhill & Co.

Following Blankfein’s evening news appearance on Monday, Cote, the Honeywell CEO, sat down with the same network on Tuesday, and said essentially the same thing that Blankfein did.

Cote ranked 11th on a list, compiled in a recent study by the Institute for Policy Studies, of executives who have saved the most from the Bush tax cuts. According to the IPS, Cote’s taxable compensation for 2011 was a bit more than $55 million, and he saved about $2.5 million thanks to the Bush tax cuts.

After mentioning a few scary-sounding deficit statistics, he suggested the government raise revenue by ending individual tax credits and deductions, which he said amounted to a $1 trillion “giveaway” in 2011. It was clear, however, that Cote hadn’t come on the show to talk about taxes.

“The big nut is going to have to be [cuts to] Medicare/Medicaid … especially with the baby boomer generation retiring. It’s going to literally crush the system.”

But while Cote strongly recommends cutting those benefits, when it comes to the tax obligations of corporations, he’s clear about what he wants: a corporate tax rate of zero.

“From a fairness perspective, nobody would be able to stand [a zero tax rate on corporate profits],” but if the U.S. really wanted to create jobs, he said this spring, “we would have the lowest rate possible.”

At Honeywell, Cote practices what he preaches. Between 2008 and 2010, the company avoided paying any taxes at all. Instead, it collected taxpayer-funded rebates of $34 million on profits totaling nearly $5 billion.

Part of what makes the lobbying blitz around the fiscal cliff so complex for CEOs on the Fiscal Leadership Council is that many of them need more than just low tax rates. They also need Congress and the White House to maintain current defense spending levels so they can continue winning enormous contracts.

In 2011, $40 billion of taxpayer money was divided among just nine member companies of the Campaign to Fix the Debt, led by defense giant Boeing, which alone raked in $22 billion in federal contracts, more than the other eight companies combined. For his efforts as CEO, Boeing’s Jim McNerney took home nearly $23 million in compensation last year.

But even as McNerney lends his name to the deficit commission, his company has quietly begun laying off U.S. workers ahead of defense cuts that are expected to be part of a deficit reduction deal. The company denies that federal spending has anything to do with the job cuts, but defense industry analysts aren’t convinced.

At least one faction of Boeing’s workforce is thriving: Boeing lobbyists in Washington have earned $12 million since January fighting proposed cuts to defense and aerospace projects.