What is a President? The CEO of Capitalism

Ongoing left debates regarding Bernie Sanders’ presidential campaign are frequently characterized by a shared premise. Whether arguing, for instance, that Sanders is dismissive of race or countering that his emphasis on economics necessarily entails anti-racism, both sides tend to assume that Sanders would be able to meaningfully advance his politics if he were to become president. That is, both sides generally presuppose the liberal notion of pluralism, which conceives of a neutral and malleable state that can be shaped and reshaped by those who govern it.

The history of the presidency illustrates a very different story, one in which the political party and personal inclinations of presidents (let alone candidates) are generally irrelevant to how they wield power. Presidents – whether Constitutional Law professor/community organizers or religious zealots with MBAs – historically have advanced the objective interests of the nation-state, prioritizing its international power and the profitability of its economy above all other considerations. Notwithstanding cogent left criticisms of Sanders, the key question is not whether Sanders is a phony but what, if elected president, he will in fact be sworn to do. In other words, what are presidents?

The Constitution was of course designed to replace the Articles of Confederation, whose preservation of revolutionary anti-monarchism (“The Spirit of 1776”) resulted in what the framers came to fear as a dangerously weak state. The decentralized Articles did not have an executive and instead placed power in the legislature (the “People’s Branch”) and the states. Not only did such decentralization preclude national coherence but it also prevented the national government from raising taxes and thereby armies, leaving it, among other things, unequipped to suppress mass debtor insurrections.

Encouraging state legislatures to eliminate debts through inflating state currencies and issuing “stay laws,” debtor insurrections horrified leaders who argued that revolutionary liberty had gone “too far.” Indeed, debtors’ repudiation of property rights (sometimes destroying debt records directly) reflected the growing power of Hamilton and Madison’s dreaded (if not oxymoronic) “majority faction,” which according to Madison threatened not merely the small creditor class but the “permanent and aggregate interests of the community” as well.

Significantly, the Framers discussed the threat of foreign invasion and the threat of domestic insurrection in the same vein. But while the former would clearly challenge the national character of the state, the latter – conducted by citizens after all – would not. That is, Madison and Hamilton’s nation-state is not a clean slate of pluralistically competing factions but has instead always been intrinsically defined by the general interests and demands – if not the personal economic interests of the founders – of the propertied class. Aggregating concrete competing interests into an imagined national community, the framers established antagonistic property relations as the cornerstone of the nation-state and, more specifically, guaranteed that the propertied few would be protected from the property-less many. Accordingly, the Framers designed a government that “multiplied” and “diffused” factions while “filtering” the “violent passions” of the masses through “insulated” and “responsible” “elites” in order to obstruct the majority’s inevitable “rage for paper money, for abolition of debts, for an equal division of property, or for any other improper or wicked project….”

Steward of the State

The Constitution not only centralized power but also eliminated the legislature’s dominance by establishing a bicameral Congress and a “separation of powers” that enabled the executive to become supreme. Article II granted the president a powerful veto, and its provision for unity and relative vagueness provided the executive with the tools for the “energy,” “decision, activity, secrecy, and dispatch” deemed necessary for “strong government.” Aghast at the power of the Constitution in general and the new executive in particular, Patrick Henry warned that the “tyranny of Philadelphia” would come to resemble the tyranny of King George.

Predictably, George Washington exploited Article II’s vagueness, invoking the “take care” clause to crush the Whiskey Rebellion and capitalizing on the omission of Article I’s qualifier “herein granted shall be vested in” to issue the Neutrality Proclamation. But it was not until Thomas Jefferson’s presidency that the objective character of the presidency became manifestly clear. It is indeed an emblematic irony of U.S. history that while the Jeffersonians won most of the early presidential elections, continental and international imperial pressure to expand led them to frequently implement Hamiltonian policies once in office. While Washington and Adams (one also thinks of the Alien and Sedition Acts) expressed Hamiltonian political orientations, Jefferson personified a diametrically opposed U.S. political tradition. Whereas Hamilton was a loose constructionist who advocated for a large national government and a strong executive that would pursue manufacturing following the British model of development, Jefferson was a strict constructionist who advocated for a small national government and weak executive that would pursue agrarianism following the French model of development. Yet, in spite of his lifelong principles, Jefferson in significant respects presided like a Hamiltonian, violating his strict constructionism via the Louisiana Purchase and the Fourth Amendment via his aggressive, albeit unsuccessful, Embargo Act.

Andrew Jackson continued this pattern, expanding the power of the executive as well as the national government notwithstanding his previous advocacy of small government and states’ rights. Beyond his unprecedentedly aggressive use of the veto (Jackson was the first president to veto policies he merely disliked rather than those he deemed unconstitutional), Jackson threatened to use military force against South Carolina if it did not yield to the national government during the Nullification Crisis. And it is notable that when Jackson did support states’ rights after Georgia violated the Supreme Court’s ruling in Worcester v. Georgia, it was in the name of expelling the Southeast’s Native Americans in order to clear the land for profitable exploitation through African American slave labor. That is, Jackson supported the states as long as they were pursuing nation-building rather than their own parochial interests.

And though the growth of the executive was neither even nor always linear, its long-term evolution has been characterized more than anything else by massive and bipartisan aggrandizement. Even periodic setbacks, such as the Congressional backlash against Nixon’s “imperial presidency,” proved to be ephemeral. Reagan merely danced around the War Powers Resolution in his illegal funding of the Contras, while Obama circumvented the WPR by declaring that his war on Libya wasn’t in fact a war. By the time of the George W. Bush Administration, the executive – usurping Congress via signing statements and the courts via military tribunals, among countless other encroachments – had expanded its power to an unprecedented degree. Contrary to liberal mythology, Bush was hardly an anomaly, as his response to 9/11 built upon Clinton’s attack on civil liberties following the Oklahoma City bombing, just as Obama’s “kill lists,” surveillance, and drone warfare have expanded Bush’s apparently permanent state of exception.

Manager of Capitalism

It is important to note that this expansion of executive power did not occur in a vacuum. On the contrary, executive aggrandizement has more often than not correlated with emergencies in general and capitalist crises in particular. As “steward” of the system, to use Theodore Roosevelt’s appellation, the modern president is devoted not only to expanding the power of the state vis-à-vis international competitors but also to maintaining the conditions for the capitalist economy with which it, in large measure, competes. Jackson aimed to open new arenas for capitalist accumulation not only through the primitive accumulation of Indian removal and chattel slavery but also through eliminating corrupt, monopolistic, and ossified economic institutions such as the Charles River Bridge Company and Biddle’s Bank.

Jackson’s incipient capitalism had become a mature and complex system producing enormous social and political problems by the turn of the century. In turn, Theodore Roosevelt radically expanded presidential power by inverting Jefferson’s interpretation of the Constitution: while Jefferson claimed that the president could do only what the Constitution explicitly permitted, Roosevelt claimed that the president could do anything that the Constitution did not explicitly forbid. As such, Roosevelt intervened in the Coal Strike of 1902 and threatened to seize and run the mines after failing to initiate arbitration meetings, while the Hepburn Act saw the U.S. issuing price controls for the first time.

Although progressives applauded the executive’s reinvention as a “trust-busting” “referee” after decades of pro-business policies, the presidency had in fact remained consistent in its relationship to capitalism. When nascent capitalism required primitive accumulation and (selective) laissez-faire, Jackson gave the system what it needed; when rampaging capitalism threatened to destroy its own social and economic bases during the Gilded Age, Theodore Roosevelt did the same.

Before (if at all) considering the interests of the people whom he nominally represents, the president must ensure that they constitute a ready and exploitable workforce in the case of economic expansion or that they do not threaten the state’s social and political stability in the case of depression. Indeed, the president (though typically not more myopic business leaders) has frequently recognized the danger of killing the golden goose during capitalist crises, a point made explicitly by that giant of the liberal imagination, FDR. As recounted by Neil Smith in The Endgame of Globalization, FDR explained his rationale for the New Deal to business leaders: “‘I was convinced we’d have a revolution’ in the US ‘and I decided to be its leader and prevent it. I’m a rich man too,’ he continued, ‘and have run with your kind of people. I decided a half loaf was better than none – a half for me and a half for you and no revolution.’” Such cynical calculations allow us to reconcile the “good FDR” of the New Deal with the “bad FDR” who interned Japanese-Americans and firebombed Tokyo, Dresden, and other urban centers.

Notwithstanding the limitations of the New Deal (which, among other things, offered selective social redistribution while preserving mass exploitation), the Keynesian rescue package had run out of gas by 1973. Amid renewed global competition and the increase in oil prices, profits contracted, but for the first time since the postwar “Golden Age of Capitalism” had begun, spending no longer mitigated the effects of the glut. According to Tony Judt, Labour Prime Minister James Callaghan had “glumly explained to his colleagues, ‘We used to think that you could just spend your way out of a recession…I tell you, in all candour, that that option no longer exists.’”

It was within this context that laissez-faire, now refashioned as neoliberalism, rose from the dead, as it provided the apparent solutions (e.g., privatization, tax cuts, and deregulation) that Keynesianism could not. Put differently, capitalism generated a second wind not only by moving investment from industry to finance but also by cannibalizing the apparatus that had helped rescue it from its previous crisis. The growing chasm separating postwar liberal politics from the post-1970s new economics gave rise to “new” liberals including Clinton, Blair, Schroeder, Obama, and Hollande, who, operating within an increasingly limited range of action, attempted to manage liberalism’s strategic retreat. In so doing, liberal politicians have frequently compensated for their exhausted economic programs by embracing cultural issues, a strategy that has been termed “Let them eat marriage.” While liberals accurately note that the monstrous right would be “even worse,” their warning is nevertheless dishonest insofar as it ignores that liberals are wedded to the political-economic system whose noxious effects produce such reactionaries in the first place.

Lest we conclude that this is a case of the domestic political cart leading the economic horse, it is crucial to reiterate that the collapse of economic liberalism has been a global phenomenon, whether expressed through Bill Clinton’s declaration that “the era of big government is over,” François Mitterrand’s assertion that “The French are starting to understand that it is business that creates wealth, determines our standard of living and establishes our place in the global rankings,” or anti-austerity Syriza’s ongoing implementation of austerity.

That is, even assuming it would be desirable, the New Deal is unlikely to return (although a new world war or some other catastrophe could indeed press the “restart” button on capitalist development, assuming there is anyone left to exploit). Given the enormous global economic and structural constraints delimiting the presidency, it is possible to argue that Barack Obama, demonstrating prodigious “activity,” has done a remarkable job in advancing his domestic and international agendas. Rather than being “weak” or a “sell-out,” Obama very well might be, as liberals stress, the best we can hope for – a possibility that more than anything else radically indicts the system itself.

Obama’s political victories on Iran, Cuba, healthcare, and gay marriage should not be compared to his failures. They should instead be compared to his other, far more reactionary, achievements including Afghanistan, Libya, Yemen, Pakistan, the Trans-Pacific Partnership, mass surveillance, and the prosecution of whistleblowers, policies regularly conducted with Hamiltonian “energy,” “decision,” “secrecy,” and “dispatch.” These latter policies are not inconsistent with Obama’s liberal successes. Their common denominator is the presidential articulation of the primacy of the nation-state – and thereby capital accumulation – above all other concerns. The voters’ concerns are considered only when they are serviceable to these paramount interests.

Given the enormous powerlessness of the voter, it is unsurprising that the injunction “hope” so often accompanies political campaigns. Bill Clinton was “The Man from Hope,” Obama campaigned on “Hope,” and, overseas, Syriza promised that “Hope is Coming.” Selecting who will rule without any ability to control the content of that rule, the voter casts the ballot as an act of faith. Investing political and emotional energy into nothing more than the good name of the system (election nights are always exercises in flag-waving celebration of a system that lets us choose our rulers), voters incorrectly argue that voting is better than doing nothing and condemn those who abstain. Yet, the disillusioned are not to blame for forces that they have no control over. And if the disillusioned do become interested in challenging the abuses of everyday life, it will not be through voting but through criticizing the system that voting acclaims. The opposite of hope is not despair. It is power.

Joshua Sperber teaches political science and history. He is the author of Consumer Management in the Internet Age. He can be reached at jsperber4@gmail.com