Ludwig von Mises Institute
No institution of modern life commands as much veneration as democracy. It comes closer than anything else to being the supreme object of adoration in a global religion. Anyone who denies its righteousness and desirability soon finds himself a pariah. One may get away with denouncing motherhood and apple pie, but not with speaking ill of democracy, which is now the principal icon of political and social life throughout the world. Many people are atheists, but few are antidemocrats.
Worship of this particular political arrangement has emerged relatively recently, however, and in earlier ages political philosophers were more apt to condemn democracy than to praise it. Aristotle, whose views carried great weight for millennia, did not recommend democracy highly. Along with many other criticisms of this type of government, he wrote in his Politics:
1313b: 32-41: The final form of democracy has characteristics of tyranny: women dominate in the household so that they can denounce their husbands, slaves lack discipline, and flatterers – demagogues – are held in honor. The people wish to be a monarch.
1295b: 39-1296a5: It is best for citizens in a city-state to possess a moderate amount of wealth because where some have a lot and some have none the result is the ultimate democracy or unmixed oligarchy. Tyranny can result from both these extremes. It is much less likely to spring from moderate systems of government.
1276a: 12-14: Some democracies, like tyrannies, rest on force and are not directed toward the common advantage.
1312b: 35-38: Ultimate democracy, like unmixed and final oligarchy, is really a tyranny divided [among a multitude of persons].
The founders of the United States of America had mixed views about democracy. Nearly all of them seem to have feared it more than they respected it. They recognized that concessions to fairly wide participation in politics might have to be made to placate the masses – who, after all, had served as cannon fodder in the recently concluded war of secession from the British Empire – but they designed a system in which voting would be hobbled and circumscribed, so that the common people would be kept from giving direct vent to their passions by seizing control of the government and using it to plunder the rich. The founders conspicuously feared “mob rule” and associated it with untrammeled democracy. All of the newly independent states required property-holding and other qualifications for voting, and, in practice, the franchise was limited in most places to a small minority of the population – a subset of the adult, white males. The Constitution of the United States does not contain the word democracy, although it stipulates certain protocols for the election of officials, and it relies instead on federalism and the separation of powers to preserve liberty.
Although democracy made giant ideological strides in the nineteenth century, a few writers had the courage to condemn it even well into the twentieth century. Among the most astute of them was Joseph A. Schumpeter. In Capitalism, Socialism, and Democracy, he posits as a point of departure for analysis the classical conception of democracy: “the democratic method is that institutional arrangement for arriving at political decisions which realizes the common good by making the people itself decide issues through the election of individuals who are to assemble in order to carry out its will.” He then proceeds to demolish the pretension that this conception makes sense.
If we are to argue that the will of the citizens per se is a political factor entitled to respect, it must first exist. That is to say, it must be something more than an indeterminate bundle of vague impulses loosely playing about given slogans and mistaken impressions.
Schumpeter calls attention to “the ordinary citizen’s ignorance and lack of judgment in matters of domestic and foreign policy” and adds, anticipating the rational ignorance concept of public choice theory, that “without the initiative that comes from immediate responsibility, ignorance will persist in the face of masses of information however complete and correct.”
Moreover, “even if there were no political groups trying to influence him, the typical citizen would in political matters tend to yield to extrarational or irrational prejudice and impulse.” Matters are even worse once we recognize the “opportunities for groups with an ax to grind,” who “are able to fashion and, within very wide limits, even to create the will of the people,” leaving political analysts to ponder “not a genuine but a manufactured will” that is “the product and not the motive power of the political process.”
Schumpeter conceded that, in the long run, the general public may come to hold a more perceptive view of the world and to reward or punish officeholders in its light when they cast their ballots, but this eventual adjustment itself has a fatal flaw, because history “consists of a succession of short-run situations that may alter the course of events for good”:
If all the people can in the short run be “fooled” step by step into something they do not really want, and if this is not an exceptional case which we could afford to neglect, then no amount of retrospective common sense will alter the fact that in reality they neither raise nor decide issues but that the issues that shape their fate are normally raised and decided for them.
Because “electorates normally do not control their political leaders in any way except by refusing to reelect them or the parliamentary majorities that support them,” the distinct possibility – nay, the great likelihood – exists that the voters will find themselves time after time concerned about a horse that has already fled the barn, never to be retrieved.
This bleak view of the political process under representative democracy becomes even bleaker once we recognize that office seekers typically either speak in vague, emotion-laden generalities or simply lie about their intentions. After taking office, they may act in complete disregard of their campaign promises, trusting that when they run for reelection, they will be able to concoct a plausible excuse for their infidelity and betrayal of trust. Thus, the voters remain permanently immersed in a fog of disinformation, emotional manipulation, and bald-faced mendacity. No matter what a candidate promises, the voters have no means of holding him to those promises or of punishing his misbehavior until it may be too late to matter. In many cases, unfortunately, the officeholders’ decisions give rise to irreversible consequences – outcomes that cannot possibly be undone ex post.
Garet Garrett had a similar vision of the uselessness of democracy as a means of making government accountable to the “will of the people” (or to anything else except the rulers’ own desires). Writing at midcentury, shortly after Schumpeter’s death, in an essay titled “Ex America,” Garrett posed the following hypothetical scenario:
Suppose a true image of the present world had been presented to them in 1900, the future as in a crystal ball, together with the question, “Do you want it?” No one can imagine that they would have said yes – that they could have been tempted by the comforts, the gadgets, the automobiles and all the fabulous satisfactions of midcentury existence, to accept the coils of octopean government, the dim-out of the individual, the atomic bomb, a life of sickening fear, the nightmare of extinction. Their answer would have been no, terrifically.
Having set the scene, he asked: “Then how do you account for the fact that everything that has happened to change their world from what it was to what it is has taken place with their consent?” To which he added: “More accurately, first it happened and then they consented.”
Garrett proceeded to list and to discuss briefly a series of cataclysmic, course-altering political events in the United States, including getting into World War I, launching the New Deal, getting into World War II, and joining the United Nations, noting that in each instance the people did not vote for the government’s action, yet “to all of this the people have consented, not beforehand but afterward.”
One might object at this point by asking, “What difference does it make whether the people consent beforehand or afterward, so long as they consent?” Indeed, Bruce Ackerman has written an entire book to argue precisely that the most profound constitutional changes in US history occurred not when the people formally amended the Constitution, but when the government acted outside its constitutional authority in a crisis and later received electoral and judicial validation of its actions. These de facto constitutional revolutions, he maintains, deserve our approbation and ought even to serve as models for future constitutional revolutions.
Ackerman’s view may be challenged by noting the frequency with which constitutional revolutionaries engineer the alleged ex post validation of their actions. People in power have the greatest ability to gerrymander the voting districts, bias the electoral rules, buy votes with taxpayers’ money, stuff the ballot boxes, and otherwise ensure that those in power – regardless of how they got there – remain in power. Similarly, people in power have the greatest ability to appoint new judges, alter judicial jurisdictions, and change the size or number of courts of appeal to ensure that those in power – regardless of how they got there – gain judicial vindication of their (heretofore unconstitutional) actions.
Despite the force of the preceding objections, Ackerman might refuse to consider them a knockout blow to his thesis. Sooner or later, he might insist, the people will be able to vote against policies they find offensive, and judges will be able to strike down laws that exceed the government’s true constitutional authority. The political winners cannot rig the game forever, so if the people and the judges never avail themselves of opportunities to express their aversion to the constitutional revolutionaries and their policies, we may presume that they actually approve of what has been done – in Garrett’s words, “first it happened and then they consented.”
In a sense, this interpretation may be correct, but I doubt that the sense I have in mind is one that Ackerman would welcome. If the people never avail themselves of the opportunity to overturn what was done initially without their consent, they may thereby reveal only that people who have been fed thin gruel for a long time get used to eating it and even come to consider it nutritious. In less metaphorical terms, my claim is that ideological change is often path-dependent: where a dominant ideology stands and where it is most likely to go in the future depend significantly on where it has been in the past.
Bearing in mind this aspect of political, social, and economic dynamics, we may come to understand better how, for example, in each decisive episode in the great transformation of America’s political economy between 1900 and 1950, “first it happened and then they consented,” and afterward the people looked back on these episodes not so much with regret as with pride and a sense that the nation had overcome great challenges. Moreover, the people subsequently elevated to the pantheon of “greatness” the presidents who had taken it upon themselves to plunge the nation into these cauldrons and endowed them with sainthood in the Church of Democracy – thus, Woodrow Wilson and Franklin D. Roosevelt, and earlier, in the same mold, Abraham Lincoln.
After World War I erupted in Europe in August 1914, the overwhelming majority of Americans preferred that their government remain neutral and not become engaged in the fighting. “Aversion to joining in the carnage,” writes Walter Karp, “was virtually unanimous.” President Wilson represented himself as striving above all to end the fighting and to resist the temptation to enter the war in reaction to various provocations by both warring sides. We may well doubt the sincerity of his avowals of neutrality, however. Thomas Fleming writes that “in an unguarded moment, Wilson confessed to a friend that he hoped for an Allied victory in the war but was not permitted by his public neutrality to say so.” There is no doubt, however, that the president and his election managers perceived that the best way for him to gain reelection in 1916 was by continuing to represent himself as a man of peace; hence, the campaign slogan “He kept us out of war.”
Yet, less than a month after beginning his second term, Wilson asked Congress for a declaration of war, resting his request on the astonishing ground that Americans had an absolute right to travel unmolested on the high seas on ships carrying munitions to a warring power. “Even after Wilson broke off relations with Germany in February 1917,” Karp writes, “an overwhelming majority of Americans still opposed entering the war. Even when the United States had already been at war for some months, a majority of Americans remained a sullen, silenced opposition, more profoundly alienated from their own government than any American majority has ever been before or since.” Karp concludes: “Representative government had failed them at every turn.” Democracy in action?
Probably no single event of the past century has been such a prodigious source of evils as the US entry into World War I and the Versailles Treaty that US entry made possible. The conquests of Bolshevism, Nazism, and Fascism and the manifold catastrophes known collectively as World War II, not to mention endless troubles in the Middle East, may arguably be traced directly to this source. In the United States, World War I prompted the government to embrace what contemporaries called “war socialism” (though it was, in more precise language, “war fascism” for the most part), which provided blueprints for an immense variety of government interventions in the economy and society, many of which continue to impoverish Americans and to crush their liberties ninety years later. The war could have such extreme and enduring consequences because it had also brought about abrupt ideological changes: many Americans became convinced by their perception of the wartime controls that the government was capable of successfully engaging in socio-economic engineering on a wide front. Thus, the war put the final nail into the coffin of nineteenth-century liberalism, at least in the eyes of the major political players. As Bernard Baruch, the wartime head of the War Industries Board, declared, “We helped inter the extreme dogmas of laissez faire, which had for so long molded American economic and political thought.”
Democracy’s next colossal failure in the United States occurred in 1932. By the time of the presidential election in November, the country had experienced more than three years of worsening economic performance: falling output, rising unemployment, increasing numbers of business failures, and growing numbers of homes and businesses lost to foreclosure or to seizure for failure to pay taxes. Not without plausible reasons, people blamed President Herbert Hoover for these dreadful developments and gave Franklin D. Roosevelt, the Democratic challenger, the benefit of the doubt.
Roosevelt campaigned on a platform that the old Grover Cleveland-style Democrats of the nineteenth century might have endorsed comfortably. As Jesse Walker summarizes it:
The very first plank calls for “an immediate and drastic reduction of governmental expenditures by abolishing useless commissions and offices, consolidating departments and bureaus, and eliminating extravagance to accomplish a saving of not less than twenty-five per cent in the cost of the Federal Government.” (It also asks “the states to make a zealous effort to achieve a proportionate result.”) Subsequent planks demand a balanced budget, a low tariff, the repeal of Prohibition, “a sound currency to be preserved at all hazards,” “no interference in the internal affairs of other nations,” and “the removal of government from all fields of private enterprise except where necessary to develop public works and natural resources in the common interest.” The document concludes with a quote from Andrew Jackson: “equal rights to all; special privilege to none.”
Having made these promises, Roosevelt swept to a lopsided victory at the polls.
Yet, the merest child knows that his New Deal, a huge hodgepodge of domestic interventions, controls, subsidies, taxes, threats, seizures, and other troublemaking amounted to nearly the exact opposite of what he had promised the voters during the campaign.
So what, we may hear Professor Ackerman asking offstage: didn’t the people endorse these actions by reelecting Roosevelt with an even greater margin of victory in 1936? Yes, of course, they did. But, by that time, the president and his party had turned the federal government into a vast, vote-buying apparatus that covered the entire country and penetrated every county, town, and village. As John T. Flynn described the situation:
Roosevelt’s billions, adroitly used, had broken down every political machine in America. The patronage they once lived on and the local money they once had to disburse to help the poor was trivial compared to the vast floods of money Roosevelt controlled. And no political boss could compete with him in any county in America in the distribution of money and jobs.
Nor was this garden-variety political corruption the worst of it. Far more significant in the long run was the loss of faith in the free market among the masses and the boost given to ideological support for economic fascism. Owing to the Great Depression and the New Deal, later generations would live in chronic fear of economic privation and rest their hopes for security in a fervent belief that if the economy turned down, the government could and would rescue them. The Employment Act of 1946 codified this public dependency. Rugged individualism, to the extent that it had ever really existed, died a cruel death at the hands of the New Deal – precisely the opposite of what Roosevelt had promised when he first campaigned for the presidency. Democracy in action?
Roosevelt was still in office when the next great travesty of democracy occurred, in 1940. War between the great powers had resumed in Europe, as everyone had expected it eventually would after the Versailles Treaty was signed in 1919. Just as the great majority of Americans had wished to keep away from the fighting in 1914, so a great majority again wanted nothing to do with the European bloodletting. Roosevelt, as the leader of the small minority that favored going to war – to save the British and (dare we conjecture?) to permit him to achieve the “greatness” that only wartime leadership brings – had to play his cards carefully. For two years, mendacity would be his major political device, as he sought to maneuver Germany and Japan into an “incident” so inflammatory that it would shock the public into supporting US entry into the war.
Roosevelt’s vaulting ambition fed his quest for reelection to an unprecedented third term. Given the massive public opposition to war – opposition, that is, to the very objective whose attainment he sought above all others – the president, who had already begun to involve the country in the war in discreet ways, lifted his dishonesty to a higher level as the election approached. In a campaign speech at Boston on October 30, 1940, he declared bluntly: “I have said this before, but I shall say it again and again: Your boys are not going to be sent into any foreign wars.” As David M. Kennedy notes, “Conspicuously, Roosevelt omitted the qualifying phrase that he had used on previous occasions: ‘except in case of attack.'” Relying on this seemingly frank promise, the electorate returned Roosevelt to office for another term.
In return, of course, they found themselves being pushed farther and farther toward open US belligerency, until finally the Japanese attack on Pearl Harbor gave the president what he, his chief subordinates, and his closest supporters had been seeking from the start: declared engagement in the greatest armed conflict of all time. Democracy in action?
By the time it ended, Americans had suffered more than a million casualties, including more than 400,000 servicemen’s deaths, and four years of economic fascism on the home front, with extensive controls and government takeovers that dwarfed those of any comparable episode in the United States before or since. Moreover, the entire world had been altered, as the Soviet Union, America’s wartime ally, now stood astride all of eastern Europe and much of central Europe, too, as far west as Czechoslovakia, so that when the violence ended in 1945, only a tense pseudo-peace took its place, and the world was condemned to live in fear of nuclear annihilation indefinitely.
For this dismal result, we may credit the democratic system that put Franklin D. Roosevelt and his party in power and allowed them to make the United States the decisive factor in the war’s outcome. Without America’s active involvement in the war, the British might have been forced to sue for peace, and the Germans and the Soviets might have bled one another to death – a grisly outcome, to be sure, but would it have been any worse than what actually happened? We cannot know, of course; history is not ours to rerun, like a controlled experiment with reset conditions. Yet, we can scarcely deny that the devastated world of 1945, with 50 million dead, tens of millions left sick, wounded, or homeless, and a murderous Communist dictator in control of half of Europe, was scarcely what most Americans sought to bring about when they cast their votes for Roosevelt in 1940.
Democracy has always had its critics. No one claims that it is a perfect system for choosing political leaders or for putting in place the policies and laws the public prefers. Obviously, when individual preferences differ, no one political outcome can please everybody, and the “tyranny of the majority” stands as a constant menace to the lives, liberties, and property of unpopular minorities. Yet, most people continue to insist that democracy, with all its faults, offers the best institutional arrangement for making rulers accountable to the people. So long as elections continue to be held, the possibility always remains of “throwing the rascals out.”
What has not been widely recognized, however, is the problem of faits accomplis. Once elected rulers have taken office, the democratic system provides little or no effective means for the people to bring them to heel short of the next election. The great problem is that, by that time, it may be impossible to reverse the outcomes the rulers have brought about. Wilson was not elected in 1916 to plunge the nation into the Great War. Roosevelt was not elected in 1932 to impose the New Deal on the country. Nor was he elected in 1940 to maneuver the United States into the greatest war of all time. Yet, in each case, the president did the opposite of what he had promised to do, and the people were left with no recourse. The world of 1919, the United States of 1936, and the world of 1945 – each was so massively, so irrevocably altered from the preceding status quo that any genuine restoration of the previous conditions was unimaginable. Like it or not, people were to a great extent simply stuck with what the deceitful politicians had done.
Worse, owing to “ideological learning,” many people who initially had not desired these changes did approve of them in the circumstances in which they later found themselves – circumstances that they had in no way chosen, not even indirectly, but into which they had been forcibly shoved by the ruling decision-makers. Contemplating this situation, one readily recalls Goethe’s dictum that “none are more hopelessly enslaved than those who falsely believe they are free.”
Still worse, an altered ideological context then sets the stage for the next round of democratic choice, unconstrained decisions by elected officials, and resulting faits accomplis – a round that may propel a society even further from the course it initially preferred. If people believe that democracy is a means by which ordinary people may ensure that they exercise some control over their own societal fate, they are fooling themselves. If the persons elected to office have a free hand to act as they please, then the sense that they are truly accountable to the electorate is an illusion. It comes closer to the truth to say that the people are completely at the mercy of the officials they have elected.
H.L. Mencken wrote,
Democracy may be a self-limiting disease, as civilization itself seems to be. There are thumping paradoxes in its philosophy, and some of them have a suicidal smack.
Whether it will prove suicidal for its adherents, only time will tell, but we might note that, so far, only the United States of America, whose leaders and people tout their country as the greatest of all democracies, has employed nuclear weapons in war. It is not inconceivable that Woodrow Wilson’s war to make the world safe for democracy, owing to the train of consequences it set in motion, may ultimately make the world safe for democracy, to be sure, but not safe for mankind.
Robert Higgs is senior fellow in political economy for the Independent Institute and editor of The Independent Review. He is the 2007 recipient of the Gary G. Schlarbaum Prize for Lifetime Achievement in the Cause of Liberty.
 Thomas R. Martin, with Neel Smith and Jennifer F. Stuart, “Democracy in the Politics of Aristotle,” in Demos: Classical Athenian Democracy: A Stoa Publication (July 26, 2003).
 Joseph A. Schumpeter, Capitalism, Socialism and Democracy, 3rd ed. (New York: Harper and Brothers, 1950), p. 250.
 Ibid., p. 253.
 Ibid., pp. 261, 262.
 Ibid., p. 263. For a recent study that grapples with this problem, see Robert Higgs and Anthony Kilduff, “Public Opinion: A Powerful Predictor of US Defense Spending,” in Robert Higgs, Depression, War, and Cold War: Studies in Political Economy (New York: Oxford University Press, 2006), pp. 195-207.
 Ibid., p. 264; emphasis added.
 Ibid., p. 272.
 Garet Garrett, Ex America: The 50th Anniversary of The People’s Pottage, Introduction by Bruce Ramsey (Caldwell, Idaho: Caxton Press, 2004), p. 70.
 Ibid., p. 72.
 Bruce Ackerman, We the People 2: Transformations (Cambridge, Mass.: Belknap Press of Harvard University Press, 1998).
 Robert Higgs, “On Ackerman’s Justification of Irregular Constitutional Change: Is Any Vice You Get Away With a Virtue?” Constitutional Political Economy 10 (November 1999): 375-83.
 For a visual representation of this phenomenon, nothing can surpass the Spartan regimen depicted in early scenes of the splendid film Babette’s Feast (1987).
 Robert Higgs, “The Complex Course of Ideological Change,” American Journal of Economics and Sociology 67 (October 2008): 547-65.
 Robert Higgs, “Great Presidents?” in Against Leviathan: Government Power and a Free Society (Oakland, Calif.: The Independent Institute, 2004), pp. 53-56.
 Walter Karp, The Politics of War: The Story of Two Wars Which Altered Forever the Political Life of the American Republic (1890-1920) (New York: Harper and Row, 1979), p. 169.
 Thomas Fleming, The Illusion of Victory: America in World War I (New York: Basic Books, 2003), p. 75.
 Karp, The Politics of War, p. 169.
 Ibid., p. 324.
 Among recent sources, see, for example, Jim Powell, Wilson’s War: How Woodrow Wilson’s Great Blunder Led to Hitler, Lenin, Stalin & World War II (New York: Crown Forum, 2005); and Patrick J. Buchanan, Churchill, Hitler, and the Unnecessary War: How Britain Lost Its Empire and the West Lost the World (New York: Crown, 2008).
 Robert Higgs, Crisis and Leviathan: Critical Episodes in the Growth of American Government (New York: Oxford University Press, 1987).
 Bernard M. Baruch, Baruch: The Public Years (New York: Holt, Rinehart and Winston, 1960), p. 74.
 Jesse Walker, “The New Franklin Roosevelts: Don’t Count on a Candidate’s Campaign Stances to Tell You How He’ll Behave in Office,” Reason Online, April 10, 2008, at http://www.reason.com/news/show/125921.html.
 John T. Flynn, The Roosevelt Myth (Garden City, N.Y.: Garden City Books, 1949), p. 65.
 Among the many sources relevant to this maneuvering, see the recent works by Robert B. Stinnett, Day of Deceit: The Truth about FDR and Pearl Harbor (New York: Free Press, 2000); Thomas Fleming, The New Dealers’ War: F.D.R. and the War within World War II (New York: Basic Books, 2001); and George Victor, The Pearl Harbor Myth: Rethinking the Unthinkable (Dulles, Va.: Potomac Books, 2007).
 David M. Kennedy, Freedom from Fear: The American People in Depression and War, 1929-1945 (New York: Oxford University Press, 1999), p. 463.
 H.L. Mencken, A Mencken Chrestomathy (New York: Knopf, 1949), p. 157.