Why George W. Bush Is Really Our King

No one could blame President Bush for wanting to get out of town after the end of October. He’d just experienced what non-partisan political observer Charles Cook dubbed “the worst week of the worst month of the worst year of the Bush presidency.” The president’s approval ratings sagged to an all-time low of less than 40 percent; he suffered the humiliation of having his Supreme Court nominee torpedoed by opposition from his own party; the number of American soldiers killed in Iraq passed 2,000; he was lambasted for yet another slow response to a hurricane disaster, in Florida; and an influential White House aide, I. Lewis “Scooter” Libby, resigned after being indicted for perjury and obstruction of justice.

Unfortunately for Bush, his early-November travel plans took him to Argentina for a two-day hemispheric trade summit of 34 nations, where despite his coaxing, no agreement was reached on resuming stalled negotiations on establishing the Free Trade Area of the Americas. The lesson from that weekend seemed to be that, as bad as things are at home, Bush is even less popular in Latin America. Strikes and mass demonstrations by anti-Bush protesters exploded outside the fortified gates of the hotel where the summit was held.

These days, Bush is bashed from Argentina to Australia, but it is rare to encounter a reasoned critique of the man and his administration presented as a means of enlightening the American public. Our colleague Ian Williams has the talent to do just that. Williams is a busy freelance writer, born in Liverpool but since 1989 based in New York. He serves as The Nation’s U.N. correspondent and has been a regular contributor to many of Britain’s major newspapers. His latest book, Rum: A Social and Sociable History of the Real Spirit of 1776, was published this summer.

Not long ago we criticized the Electoral College method of electing a president as an anti-democratic anachronism. In this issue Williams makes a more far-reaching argument: He sees the presidency itself—embodying the roles of both chief executive and head of state—as an unfortunate relic that the Founding Fathers would have done better to reconsider. Of course, such a critique has no hope of resulting in transformation of any sort. But taking the opportunity occasionally to see things through the eyes of a brilliant foreign correspondent can give us a fresh perspective on the state of our democracy—where we are today and how we got here.

The stately arrival of Prince Charles and his most recent spouse at the White House in early November, shortly after the unstately departure of Vice President Cheney’s aide Lewis Libby from the same place, and, one hopes, shortly before presidential adviser Karl Rove gets the bum’s rush as well, was a thought-provoking event. Americans tend to assume that they have the finest democracy in the world—just as they assume that they have the best health care. It often takes an outside perspective to show up the eminently falsifiable nature of these suppositions, but it is always an uphill struggle.

To celebrate the royal visit, I was invited onto the Fox News channel to tut-tut on TV about the anachronistic nature of England’s Windsor line. But alas, since Fox thinks that irony is what they used to make in Pittsburgh, my tongue-in-cheek defense of constitutional monarchy fell somewhat flat. I had forgotten that the untitled Rupert Murdoch, who owns Fox, is a republican as well as a Republican. But I notice that he did not exactly exclude his male heirs from the management of News Corp.

When the people at Fox asked me if the monarchy represented privilege, of course I said I could agree in principle, but I pointed out that in the constitutional monarchies of Scandinavia, the Low Countries and Britain, poor people have far more access to health care and education than in the current Georgian America. In fact, in every measurable way these societies are more egalitarian than the United States.

For all his eccentricities, Charles is a convinced environmentalist who supports the Kyoto Protocol, while George thinks global warming, like evolution (and indeed probably gravity as well), is just a theory, despite the hurricanes that batter hardest at the states that gave him the presidency.
With that in mind, I told Fox that the hereditary principle is indeed a dubious way to fill jobs, but that even if the prince were eccentric or barking mad, the world would be safe when he becomes Charles III, even if he only makes it because he’s his mother’s son. However, I cautioned, it made one hell of a difference to the world that George W., with more than a few psychological question marks of his own, had become George II just because he was the fruit of his father’s loins. After all, no rational person would believe that the spoiled legacy brat who deserted from the Air National Guard and sank business after business would ever have succeeded in politics without strong dynastic backing.

AN 18TH-CENTURY ANTIQUE—In fact, when the putative Charles III shook hands with George II of the Bush dynasty, he was meeting someone who has pretty much all the powers of Charles’s ancestor, the Hanoverian George III. An equestrian statue of King George was erected in 1770 by the colonists of New York, grateful for the repeal of the Stamp Act, and was toppled in ingratitude by the same people after a public reading of the newly written Declaration of Independence, just six years later.

Essentially unchanged since then, the American political system has escaped the reforms of the British and other democracies. While the powers of the European monarchs have become more and more diluted with each passing year until the kings and queens have all the significance of a team mascot for their nations, the presidential office has retained all those quasi-monarchical powers of centuries past.

As a Hanoverian monarch subject to election every four years, the American president appoints civil servants, ambassadors, and the whole Cabinet, on the same basis as the patronage system of eighteenth-century England. The Cabinet members he chooses need not have any independent political standing whatsoever. Indeed, as we saw with the heads of Homeland Security and FEMA, not much in the way of professional standing is required either.

Having such an intensely political personage as the head of state confuses issues. The American media and even the political classes show far more deference to the president of the United States than their British counterparts do to the queen of England and her numerous offspring. In fact, most people in the UK tend to ignore the monarchy except as a continuing royal reality show. I have heard Americans say, “I must support my president,” but never heard anyone in Britain say, “I must support my prime minister.”

When the U.S. separated from Britain, the institution of prime minister was in its infancy, and so it was not too surprising that the rebellious colonists overlooked the office in their Constitution, not least since they saw the prime minister of their day, Lord North, as a tool of the king.

Indeed, the title of prime minister itself was not formally adopted until 1905, even in Britain. However, as the office of prime minister has developed in Britain and other places, it has become clear that it is no bad thing for the chief executive to come from the ranks of legislators—and to be accountable to them. In such systems, the roles of head of state and chief executive are separate. But with its political system frozen in 1789, the United States missed out on this idea.

It is not only a question of much-needed political experience. We have to ask: how far would George W. Bush’s political career have advanced if he had had to face a Capitol Hill version of “Prime Minister’s Question Time” and actually explain and defend his policies on the hoof against unscripted questions? On the other hand, looking at the docility of so many U.S. legislators, one may wonder whether they could come up with any killer questions on the spur of the moment without a team of aides whispering in their ears.

IMPORTANCE OF OPPOSITION—The offenses for which Libby was indicted suggest that in one major respect, the American political system is not only not reforming, but is actually devolving. To score petty domestic political points against an individual who had crossed them, high-ranking officials in the White House were quite prepared to compromise secret agents and national security, putting possibly scores of lives at risk. For the Bush team, opposition is always disloyal, and the law is no protection for that opposition.

If a democracy is to function and survive, the major protagonists within it must, in the end, believe in the concept of a “loyal opposition.” It does not take too much examination of the world’s politics to see that in many countries this is a complete oxymoron, and of course, there were times in American history, from the Federalist period onwards, when it did not operate too smoothly as a concept. The current White House has clearly abandoned the quaint idea entirely.

This abandonment is only the latest manifestation of a much older habit. Many conservatives, for example, never accepted that Clinton was really president. The mere accident of election did not persuade them that someone with his views could legitimately hold the office. Similarly, when it came to George W. Bush’s assumption of office, the technical detail that he may not have actually won the election was for them no conceptual barrier at all to his taking the oath.

In their own idiosyncratic way, many Democratic legislators have also shown signs of abandoning the concept of a loyal opposition: they have emphasized the loyalty at the expense of the opposition. Being excluded from power does not make you an opposition; opposing the incumbents does. Harry Reid’s marshaling of a serious look at the road to the Iraq War was a heartening sign, as was the resistance to John Bolton’s nomination as U.N. ambassador, but such examples stand out because of their rarity.

THE PRIMARY PROBLEM—Their lack of feistiness is not the only problem. Democratic legislators must contend with one of the few innovations in the American political system since 1789: the electoral primaries. The original idea behind primaries was to take politics out of the smoke-filled rooms of the party bosses, where, as Tammany Hall’s Boss Tweed once said, “I don’t care who does the electin’, so long as I do the nominatin’.” Apart from anti-smoking laws, all that has happened since is that check writers have taken over for ward heelers.

The primaries are now responsible for much of the evil in modern American politics, from apathy and lackluster political platforms to the power of money. We now take it for granted, almost as constitutional, in fact, that the race is much more likely to go to the richest than the worthiest. To gain access to party funding, a candidate first has to win a primary, and to do so must raise money as an individual. This not only gives a head start to the Mike Bloombergs of this world, it also means that candidates begin their political lives in hock to business interests.

Europeans are never sure whether to be amused or horrified at the role campaign contributions play in buying legislation in the U.S. In most other countries this would be considered criminal corruption and outright bribery, but the American convention is to assume that as long as the bribes are spent on political expenses rather than going into the candidates’ pockets, all is well.

Primaries are flawed in principle as well as in effect, but Americans are so used to them that even the most radical tend to overlook just how bizarre and essentially undemocratic they are. In few other democracies are a party’s candidates chosen by non-party members. In a sense, it makes a mockery of the secret ballot for voters to declare their party allegiances on the electoral registers, and in many countries it would be regarded as a shocking intrusion to have citizens’ political opinions recorded publicly in this way.

While they are anomalous enough in the states where voters at least have to declare which party they support in order to participate, primaries reach the level of outright insanity in states with “open primaries,” where supporters of one party can actually choose another’s candidates. We saw the results of that recently when Cynthia McKinney was defeated in an open primary in Georgia by a combination of crossover voting from Republicans and out-of-state money. When she was able to present herself to the voters in a later general election, she won handsomely, demonstrating, presumably, how poorly the primaries represent the intentions of the electorate as a whole.

In other democratic countries, the candidates are picked by party members who have paid dues and declared support for the party’s principles. Of course, the association of party and principle seems a contradiction in terms to many disgruntled Americans, but maybe the primaries have had something to do with that as well.

Another direct consequence of this is that as far as the public is concerned, the Democrats will be leaderless until the primaries. There is no leader of the opposition, loyal or otherwise, in the American political system. In more developed parliamentary systems, the scores are settled right after an election. The losing party decides whether the leader of that party is worth another try, or whether to pick someone else quickly to lead the opposition back to power.

But in the U.S., the Democrats will be rudderless for most of the presidential term. Then, at its end, for a long and tedious year, the contending candidates will exhaust their wealth and the patience of potential supporters in trashing each other, so that the one with the most money and the least mire sticking to him emerges as the winning candidate, to be adopted at the content-free circus that passes for a party convention. If half the energy that goes into opposing each other in the primaries went into opposing the incumbent over his term of office, it would be a big step forward.

FACING THE FACTS—Americans often take some convincing that there is much wrong with their system, apart from the wrong people being elected. While the European monarchies were evolving, the American Republic became fossilized in its eighteenth-century form. The United States could benefit from a constitutional monarchy that no one cares very much about, and an established church that no one believes in; but sadly the Bush dynasty, beginning pre-Katrina, has shown many signs of developing into an unconstitutional de facto monarchy, with the White House controlling the legislators and the judges and the military every bit as firmly as George III ever did. And the U.S., for all the talk of separation of church and state, is increasingly intolerant in its religion. However, while you could live with an attenuated monarchy inherited and adapted, no rational person save Karl Rove would try to implement one from a standing start.

So, is there an easy way to bring the American political system into the twenty-first century? Sadly, probably not. Even the primaries, enshrined as they are in so many state legislatures, would take a long time to disentangle. However, the Plamegate affair does offer an unrivaled opportunity for the Democrats to stake out a position for the loyal opposition, and to establish to what, or to whom, loyalty is due. All too often, the Democrats have acted as if in their hearts they secretly believed that the Republicans were indeed the natural governing party of the United States in some metaphysical way.

Loyalty to the nation and its people now demands an exposure of the disloyalty of the governing party: its preparedness to lie and invent facts in order to procure a war that it has yet to explain adequately; its willingness to compromise national security to protect its lies; its confusion of loyalty to the Bush family and its cronies with loyalty to the country; all capped with a willingness to retaliate at once against any liberals who speak out.

In fact, it demands the application of European standards of political conduct, which, even if they are more often honored in the breach than the observance, would pay dividends for a revived American democracy that currently shows signs of ignoring decent standards altogether.