“I ain’t running for preacher,” Republican presidential candidate Phil Gramm snarled to religious right activists in 1995 when they urged him to run a campaign stressing moral themes. Several months later, despite Gramm’s fund-raising prowess, the Texas conservative finished a desultory fifth in the Iowa caucuses and quickly dropped out of the race. Since then, few candidates have made Gramm’s mistake. Serious contenders for the office recognize that the role and scope of the modern presidency cannot be so narrowly confined. Today’s candidates are running enthusiastically for national preacher — and much else besides.
In the revival tent atmosphere of Barack Obama’s campaign, the preferred hosanna of hope is “Yes we can!” We can, the Democratic front-runner promises, not only create “a new kind of politics” but “transform this country,” “change the world,” and even “create a Kingdom right here on earth.” With the presidency, all things are possible.
Even though Republican nominee John McCain tends to eschew rainbows and uplift in favor of the grim satisfaction that comes from serving a “cause greater than self-interest,” he too sees the presidency as a font of miracles and the wellspring of national redemption. A president who wants to achieve greatness, McCain suggests, should emulate Teddy Roosevelt, who “liberally interpreted the constitutional authority of the office” and “nourished the soul of a great nation.” President George W. Bush, when passing the GOP torch to his former rival in March, declared that the Arizona senator “will bring determination to defeat an enemy and a heart big enough to love those who hurt.” Hillary Clinton, meanwhile, suggests she is “ready on Day 1 to be commander in chief of our economy.”
The chief executive of the United States is no longer a mere constitutional officer charged with faithful execution of the laws. He is a soul nourisher, a hope giver, a living American talisman against hurricanes, terrorism, economic downturns, and spiritual malaise. He — or she — is the one who answers the phone at 3 a.m. to keep our children safe from harm. The modern president is America’s shrink, a social worker, our very own national talk show host. He’s also the Supreme Warlord of the Earth.
This messianic campaign rhetoric merely reflects what the office has evolved into after decades of public clamoring. The vision of the president as national guardian and spiritual redeemer is so ubiquitous it goes virtually unnoticed. Americans, left, right, and other, think of the “commander in chief” as a superhero, responsible for swooping to the rescue when danger strikes. And with great responsibility comes great power.
It’s difficult for 21st-century Americans to imagine things any other way. The United States appears stuck with an imperial presidency, an office that concentrates enormous power in the hands of whichever professional politician manages to claw his way to the top. Americans appear deeply ambivalent about the results, alternately cursing the king and pining for Camelot. But executive power will continue to grow, and threats to civil liberties will multiply, until citizens reconsider the incentives we have given to a post that started out so humble.
Minimum Leader
It wasn’t supposed to be this way. The modern vision of the presidency couldn’t be further from the Framers’ view of the chief executive’s role. In an age long before distrust of power was condemned as cynicism, the Founding Fathers designed a presidency of modest authority and limited responsibilities. The Constitution’s architects never conceived of the president as the man in charge of national destiny. They worked amid the living memory of monarchy, and for them the very notion of “national leadership” raised the possibility of authoritarian rule by a demagogue ready to create an atmosphere of crisis in order to enhance his power.
The constitutional office they designed gave the president an important role, but he’d have “no particle of spiritual jurisdiction,” the 69th essay of The Federalist Papers tells us. In Federalist No. 48, James Madison assured Americans that under the proposed Constitution the “executive magistracy is carefully limited, both in the extent and the duration of its powers.” Indeed, the very pseudonym the Federalist’s authors chose, “Publius,” says something about how hostile Founding-generation Americans were to the idea of one-man rule. Publius Valerius Poplicola, a hero of the Roman revolution of the late 6th century B.C., was famous in part for passing a law providing that anyone suspected of seeking kingship could be summarily executed.
Never were constitutional limitations more essential than when it came to using military power. Early Americans were no strangers to national security threats; in 1787 the U.S. was a small frontier republic on the edge of a continent occupied by periodically hostile great powers and Indian marauders. Yet the Constitution limited emergency powers and sharply rejected the idea that the president was above the law. “In no part of the Constitution,” Madison wrote in 1793, “is more wisdom to be found, than in the clause which confides the question of war or peace to the legislature, and not to the executive department.” In any other arrangement, “the trust and the temptation would be too great for any one man.” That sentiment crossed party lines. As Chief Justice John Marshall wrote in 1801, “the whole powers of war being by the Constitution of the United States vested in Congress, the acts of that body can alone be resorted to as our guides.”
Today Americans expect their president to pound Teddy Roosevelt’s “bully pulpit,” whipping the electorate into a frenzy to harness power against perceived threats. But the Framers viewed that sort of behavior as fundamentally illegitimate. In fact, the president wasn’t even supposed to be a popular leader. As presidential scholar Jeffrey K. Tulis has pointed out, in the Federalist the term leader is nearly always used pejoratively; the essays by Madison, Alexander Hamilton, and John Jay in defense of the Constitution begin and end with warnings about the perils of populist leadership. The first Federalist warns of “men who have overturned the liberties of republics” by “paying obsequious court to the people, commencing demagogues and ending tyrants,” and the last Federalist raises the specter of a “military despotism” orchestrated by “a victorious demagogue.”
Instead of stoking public demands for action, the chief magistrate was expected to resist “the transient impulses of the people” and use his veto to keep Congress within its constitutional bounds. That role didn’t require much speechifying. Early presidents rarely spoke directly to the public; from George Washington through Andrew Jackson, they averaged little more than three speeches a year, most of them ceremonial addresses. In his first year in office, by comparison, President Clinton delivered 600.
In the early State of the Union addresses to Congress, presidents knew better than to adopt an imperious tone. After his third SOTU, Washington wrote that “motives of delicacy” had deterred him from “introducing any topic which relates to legislative matters, lest it should be suspected that [I] wished to influence the question” before Congress. Yet the deference shown by Washington and his successor John Adams didn’t go quite far enough for our third president, Thomas Jefferson, who thought their practice of speaking before the legislature in person smacked of the British king’s “Speech From the Throne.” Jefferson instead inaugurated a new tradition of delivering the annual message in writing. For 112 years, that Jeffersonian tradition held sway, until the power-hungry Woodrow Wilson delivered his first State of the Union in person.
The 19th century did see presidents occasionally taking independent action of enormous consequence: Jefferson purchased Louisiana without congressional approval, Madison seized West Florida in 1810, Andrew Jackson governed as an irritable populist, and Abraham Lincoln expanded presidential power dramatically over the course of the cataclysmic Civil War. Yet taken as a whole, the 19th-century presidency was a pale shadow of the plebiscitary office we know today.
In a 2002 study tracking word usage through two centuries of SOTUs and inaugural addresses, political scientist Elvin T. Lim noted that in the first decades under the Constitution presidents rarely mentioned poverty, and the word help did not even appear until 1859. Nor did early presidents subscribe to the modern notion that it’s all “about the children”; they rarely even mentioned the little buggers. But Lim found that “Presidents Carter, Reagan, Bush, and Clinton made 260 of the 508 references to children in the entire speech database, invoking the government’s responsibility to and concern for children in practically every public policy area.”
George Washington did mention kids in his seventh annual message, lamenting “the frequent destruction of innocent women and children” by Indian raiders. But that was a far cry from Bill Clinton in 1997, who declared in the State of the Union that “we must also protect our children by standing firm in our determination to ban the advertising and marketing of cigarettes that endanger their lives.”
Wail to the Chief
A little-remembered vignette from the 1992 presidential race underscores how far we’ve traveled from the Framers’ unassuming “chief magistrate” — and how infantile our politics have become along the way. The scene was the campaign’s second televised debate, held in Richmond, Virginia; the format, a horrid Oprah-style arrangement in which a hand-picked audience of allegedly normal Americans got to lob questions at the candidates, who were perched on stools, trying to look warm and approachable. Up from the crowd popped a ponytailed social worker named Denton Walthall, who demanded to know what George H.W. Bush, Bill Clinton, and H. Ross Perot were going to do for us.
“The focus of my work as a domestic mediator is meeting the needs of the children that I work with…and not the wants of their parents,” Walthall said. “And I ask the three of you, how can we, as symbolically the children of the future president, expect the three of you to meet our needs, the needs in housing and in crime and you name it.”
One wonders how some of the more irascible presidents of old would have reacted to the sight of a grown man burbling about childish necessities to the prospective national father. Yet under the hot lights of the 1992 campaign, Ross Perot said he’d cross his heart and take Walthall’s pledge to meet America’s infantile needs, whatever those were. Bill Clinton, being Bill Clinton, pandered. And Bush 41 spluttered through his answer thusly:
“I mean I — I think, in general, let’s talk about these — let’s talk about these issues; let’s talk about the programs, but in the presidency a lot goes into it. Caring is…that’s not particularly specific; strength goes into it, that’s not specific; standing up against aggression, that’s not specific in terms of a program. So I, in principle, I’ll take your point and think we ought to discuss child care — or whatever else it is.” That wasn’t just an example of the Bush family’s famous locution problems; it’s hard not to stammer when faced with the limitless and bewildering demands the public places on the presidency.
How did we go from a reticent constitutional officer to the modern commander in chief, a figure who continually shifts back and forth between gushing empathy and military bluster, often within the same speech? As Tony Soprano might have put it, whatever happened to Calvin Coolidge, the strong, silent type?
There is no single explanation for the presidency’s growth. New communication technologies such as radio and television played a role, as did growing material progress, which made Americans less willing to suffer inconveniences and more receptive to the belief that public problems could be solved with collective action. Yet in each key period of the presidency’s growth, we see a familiar pattern: expansionist ideology meeting practical opportunity in the form of successive national crises.
The 100-Year Emergency
Much of what’s wrong with American government today can be traced to the Progressive Era, that period of reformist backlash against the Industrial Revolution that dominated the decades surrounding the turn of the 20th century. As the Progressives saw it, if the Constitution stood in the way of necessary reforms, then too bad for the Constitution. “We are the first Americans,” a young scholar named Woodrow Wilson wrote in 1885, “to hear our own countrymen ask whether the Constitution is still adapted to serve the purposes for which it was intended; the first to entertain any serious doubts about the superiority of our institutions as compared with the systems of Europe.”
The Progressives were “the nearest to presidential absolutists of any theorists and practitioners of the presidency,” wrote Raymond Tatalovich and Thomas S. Engeman in their 2003 book The Presidency and Political Science: Two Hundred Years of Intellectual Debate. For the new century’s reformers, power wielded for national greatness was benign, checks on such power perverse. The Progressives had no use for the restrained oratorical traditions of the 19th century; it was the president’s job to move the masses, unifying them behind calls for bold executive action.
Their model and embodiment was Teddy Roosevelt, whom the Progressive journalist and New Republic founder Herbert Croly described as a “sledgehammer in the cause of national righteousness.” When T.R. took the stage at the 1912 Progressive Party convention, he foreshadowed Obama’s quasi-religious fervor and McCain’s bellicosity, barking, “To you who strive in a spirit of brotherhood for the betterment of our Nation, to you who gird yourselves for this great new fight in the never-ending warfare for the good of humankind, I say in closing.…We stand at Armageddon, and we battle for the Lord!”
The most astute among the Progressives recognized that, given the American public’s congenital resistance to centralized rule, a sustained atmosphere of crisis would be necessary to sell the expansion of White House power. Two world wars and one Great Depression did the trick nicely. T.R.’s activist, celebrity presidency heralded the coming of a new sort of chief executive, one who would evermore be the center of national attention, the motive force behind American government. With his expanded power, Roosevelt busted trusts, carried a big stick throughout the Americas with a newly imperial U.S. Navy, and issued nearly as many executive orders as all of his predecessors combined. Woodrow Wilson then proved what Progressives had long hypothesized: that soaring rhetoric combined with the panicked atmosphere of war could concentrate massive social power in the hands of one person. Over the course of his presidency he helped create the Federal Reserve, nationalized railroads, and used the Espionage and Sedition Acts (along with more than 150,000 vigilantes) to carry out the most brutal campaign against dissent in U.S. history.
But it took FDR to eliminate the last remaining vestiges of the modest presidency. Roosevelt used Wilson’s Trading With the Enemy Act to shut down all U.S. banks in 1933, grabbed the power to approve or prescribe wages and prices for all trades and industries, and authorized the FBI to spy on suspected subversives. He changed the Supreme Court from a bulwark against presidential overreach to an enabler. By the end of his 12-year reign, FDR had firmly established the president as national protector and nurturer, one whose performance would be judged in terms of what political scientist Theodore Lowi has identified as the modern test of executive legitimacy: “service delivery.” In his 11th State of the Union address, FDR conjured up a second Bill of Rights, one whose guarantees would include “a useful and remunerative job” and the “right of every farmer to…a decent living.” Depression-era economic controls and war-driven centralization had turned the American system of government, in Lowi’s words, into “an inverted pyramid, with everything coming to rest on a presidential pinpoint.”
War was the health of the presidency during the long twilight struggle against the Soviet Union as well. “The worse matters get,” Harry Truman’s adviser Clark Clifford told him in 1948, “the more is there a sense of crisis. In times of crisis, the American citizen tends to back up his president.” During the Cold War, presidents used the all-purpose rationale of national security to justify spying on their political enemies. Richard Nixon might have been the most notorious abuser, with a series of dirty tricks and flagrant offenses that led to his downfall, but his predecessors also wielded the presidential bludgeon with gusto. When American steel companies raised prices in 1962, John F. Kennedy declared privately that “they fucked us, and now we’ve got to fuck them,” then (along with his attorney general, brother Bobby) ordered up wiretaps, Internal Revenue Service audits and early-morning raids on steel executives’ homes. During the 1964 presidential race, Lyndon Johnson used the CIA to obtain advance copies of Barry Goldwater’s campaign speeches, and the FBI to bug Goldwater’s plane.
In the pre-Watergate age of the heroic presidency, public trust in government was at its height, and mainstream scholars lauded the presidency as an earthly manifestation of the living God. As political scientist Herman Finer put it in 1960, the office was “the incarnation of the American people in a sacrament resembling that in which the wafer and the wine are seen to be the body and blood of Christ.” The president, Finer said, was “the offspring of a titan and Minerva husbanded by Mars.”
I Hate You; Don’t Leave Me
After Vietnam and Watergate, America’s intoxication with the imperial presidency ended with a crushing hangover. A newly aggressive press and assertive Congress produced serial revelations of the executive abuses that blind trust had enabled. In the bicentennial year of 1976, Idaho Sen. Frank Church’s Select Committee to Study Governmental Operations With Respect to Intelligence Activities summed up the damage:
“For decades Congress and the courts as well as the press and the public have accepted the notion that the control of intelligence activities was the exclusive prerogative of the Chief Executive and his surrogates. The exercise of this power was not questioned or even inquired into by outsiders. Indeed, at times the power was seen as flowing not from the law, but as inherent in the Presidency. Whatever the theory, the fact was that intelligence activities were essentially exempted from the normal system of checks and balances. Such executive power, not founded in law or checked by Congress or the courts, contained the seeds of abuse and its growth was to be expected.”
During the Eisenhower 1950s and the JFK/LBJ 1960s, the newly ascendant conservative movement coalescing around Barry Goldwater and William F. Buckley’s National Review was the most potent source of criticism of the imperial presidency. “Others hail the display of presidential strength…simply because they approve of the result reached by the use of power,” Goldwater wrote in his 1964 campaign manifesto. “This is nothing less than the totalitarian philosophy that the end justifies the means.”
But enticed by the long-awaited prospect of an “emerging Republican majority” and turned off by the journalistic and congressional attacks on Nixon, conservatives learned to stop worrying and love the executive branch. During the post-Watergate reform era, two senior Gerald Ford White House aides named Dick Cheney and Donald Rumsfeld fought tooth and nail against what they felt were dangerous shackles on the executive branch, supported by a conservative commentariat that refocused its ire on the Democratic Congress and the left-leaning press. “I didn’t like Nixon until Watergate,” National Review stalwart M. Stanton Evans once quipped.
Although Americans finally recovered their native skepticism toward power after Vietnam, Watergate, and the revelations of the Church committee, we never reduced our demands on the executive branch. The lesson we seemed to have learned from the legacy of abuses was to trust less, ask more. In 1998 the Pew Research Center noted that “public desire for government services and activism has remained nearly steady over the past 30 years.” Two years later, a report on a survey by NPR, the Kaiser Family Foundation, and Harvard’s John F. Kennedy School of Government put it pithily: “Americans distrust government, but want it to do more.” The spirit of Denton Walthall lived on in the years leading up to the terrorist attacks of September 11, 2001.
Superman Returns
The Bush administration’s extraconstitutional innovations in response to those attacks are by now all too familiar. John Yoo, David Addington, and other members of the president’s legal team constructed an alternative version of the national charter, a “neoconstitution” in which the president has unlimited power to launch war, wiretap without judicial scrutiny, and even seize American citizens on American soil and hold them for the duration of the War on Terror — in other words, indefinitely — without ever having to answer to a judge.
Conventional accounts of the post-9/11 imperial presidency emphasize the role of dedicated ideologues within the administration, men and women who had long believed that the post-Watergate backlash had swung the pendulum too far against executive power, jeopardizing national security. There’s good reason for that emphasis, but the “cabal of neocons” narrative risks obscuring the role that public demands have played in driving the centralization of power.
In his 2007 book The Terror Presidency, Jack Goldsmith, the former head of the Justice Department’s Office of Legal Counsel, describes the prevailing atmosphere within the executive branch after 9/11, one where the president’s men were acutely aware that all eyes were on the commander in chief. What is he doing to keep us safe? What more is he prepared to do?
Goldsmith, a dissenter from the Bush administration’s absolutist theories of executive power, often clashed with Dick Cheney’s deputy David Addington, the hardest-driving supporter of those theories. But Goldsmith understood why Addington was so unrelenting: “He believed presidential power was coextensive with presidential responsibility. Since the president would be blamed for the next homeland attack, he must have the power under the Constitution to do what he deemed necessary to stop it, regardless of what Congress said.”
That dynamic can lead to enhanced presidential power even in areas far removed from the War on Terror, as was demonstrated in the aftermath of Hurricane Katrina. In business or in government, responsibility without authority is every executive’s worst nightmare. That was the political reality facing the Bush administration in late summer 2005, when New Orleans was under water and desperate for assistance. As Colby Cosh of Canada’s National Post put it at the time, “the 49 percent of Americans who have been complaining for five years about George W. Bush being a dictator are now vexed to the point of utter incoherence because for the last fortnight he has failed to do a sufficiently convincing impression of a dictator.”
To be sure, the administration deserved plenty of blame for bungling the disaster relief tasks it had the power to carry out. But it soon became clear that the public held the Bush team responsible for performing feats above and beyond its legal authority. One almost had to feel sorry for Michael “Heckuva Job” Brown(ie), the disgraced former Federal Emergency Management Agency head, when he was obliged on Capitol Hill a month after the hurricane to inform an irate Rep. Chris Shays (R‑Conn.) that in our federalist system, the FEMA chief has no power to order mandatory evacuations, or to become “this superhero that is going to step in there and suddenly take everybody out of New Orleans.” “That is just talk,” Shays responded. “Were you in contact with the military?”
For a president beleaguered by public demands, seizing new powers can be an adaptive response. Small wonder, then, that the Bush administration promptly sought enhanced authority for domestic use of the military. Although few in the media noted the historical moment, the president received that authority. On October 17, 2006, the same day he signed the Military Commissions Act denying centuries-old habeas corpus rights to “enemy combatants,” the president also signed a defense authorization bill that contained gaping new exceptions to the Posse Comitatus Act of 1878, the federal law that restricts the president’s power to use the standing army to enforce order at home.
The new exceptions to the act gave the president power to use U.S. armed forces to “restore public order and enforce the laws” when confronted with “natural disasters,” “public health emergencies,” and “other…incidents” — a catchall phrase that radically expands the president’s ability to use troops against his own citizens. Under it, the president can, if he chooses, fight a federal War on Hurricanes, declaring himself supreme military commander in any state where he thinks conditions warrant it. That’s the kind of executive power grab that happens when the public demands that the president protect Americans from the hazards of cyclical bad weather.
2009 and Beyond
To understand is not to excuse: No president should have the powers President Bush has sought and seized during the last seven years. But after 9/11 and Katrina, what rationally self-interested chief executive would hesitate to centralize power in anticipation of crisis? That pressure would be hard to resist, even for a president devoted to the Constitution and respectful of the limited role the office was supposed to play in our system of government.
In the current presidential race, none of the major-party candidates comes close to fitting that description. Aside from the issue of torture, there’s very little daylight between John McCain and George W. Bush on matters of executive power. For her part, Hillary Clinton claims she played a key role in her husband’s undeclared war against Serbia in 1999. “I urged him to bomb,” she told Talk magazine that year. In 2003 she told ABC’s George Stephanopoulos: “I’m a strong believer in executive authority. I wish that, when my husband was president, people in Congress had been more willing to recognize presidential authority.”
Barack Obama has done more than any candidate in memory to boost expectations for the office, which were extraordinarily high to begin with. Obama’s stated positions on civil liberties may be preferable to McCain’s, but would it matter? If and when a car bomb goes off somewhere in America, would a President Obama be able to resist resorting to warrantless wiretapping, undeclared wars, and the Bush theory of unrestrained executive power? As a Democrat without military experience, publicly perceived as weak on national security, he’d have much more to prove.
As Jack Goldsmith put it in his 2007 book, “For generations the Terror Presidency will be characterized by an unremitting fear of attack, an obsession with preventing the attack, and a proclivity to act aggressively and preemptively to do so.…If anything, the next Democratic President — having digested a few threat matrices, and acutely aware that he or she alone will be wholly responsible when thousands of Americans are killed in the next attack — will be even more anxious than the current President to thwart the threat.”
Law professors Jack Balkin of Yale and Sanford Levinson of the University of Texas at Austin are both Democrats and civil libertarians, so they take no pleasure in their prediction that “the next Democratic President will likely retain significant aspects of what the Bush administration has done.” Indeed, they write in a 2006 Fordham Law Review article, future Democratic presidents “may find that they enjoy the discretion and lack of accountability created by Bush’s unilateral gambits.”
Throughout the 20th century more and more Americans looked to the central government to deal with highly visible public problems, from labor disputes to crime waves to natural disasters. And as responsibility flowed to the center, power accrued with it. If that trend continues, responses to matters of great public concern will be increasingly federal, increasingly executive, and increasingly military.
In the years to come, many Americans will find that the results of executive action are not to their liking. And if history is any guide, they’ll respond by vilifying the officeholder and looking for another man on horseback to set things right again.
In The Road to Serfdom, economist and political philosopher F.A. Hayek chastised the “socialists of all parties” for their belief that “it is not the system we need fear, but the danger it might be run by bad men.” Today’s “presidentialists of all parties” — a phrase that describes the overwhelming majority of American voters — suffer from a similar delusion. Our system, with its unhealthy, unconstitutional concentration of power, feeds on the atavistic tendency to see the chief magistrate as our national father or mother, responsible for our economic well-being, our physical safety, and even our sense of belonging. Relimiting the presidency depends on freeing ourselves from a mind-set one century in the making. One hopes that it won’t take another Watergate and Vietnam for us to break loose from the spellbinding cult of the presidency.