With the death of Osama bin Laden at the hands of American Special Forces on May 2, 2011, the 9/11 decade may be said to be psychologically over.1 Political decades, after all, do not have to fit neatly into the calendar. Just as the Sixties began with the assassination of President Kennedy on November 22, 1963 and ended, arguably, with Richard Nixon’s resignation from the presidency in August 1974, so that successful raid constitutes a suitable bookend for a period that began on September 11, 2001.
Bin Laden’s death does not end the threat of Islamist terrorism, and in an operational sense its importance is unclear. Even though, as we now know, bin Laden remained more responsible for al-Qaeda’s day-to-day operations than had been generally thought, he did not manage to produce significant results. But symbols do matter, and in this case—a case of a war unlike any that Americans have ever experienced, with much of it seemingly suspended in a psychological ether without manifest battlefields or other common accoutrements of war—they have mattered more than usual. So this is a suitable moment, I think, for a reflection not just on the death of a master terrorist but on America’s 9/11 decade as a whole.
As with the bombing of Pearl Harbor on December 7, 1941 and the assassination of President John F. Kennedy on November 22, 1963, all adult Americans know where they were on September 11, 2001. On that beautiful Tuesday morning I was five blocks from the White House, in the fifth-floor offices of National Affairs Inc. in Washington, DC, which housed both The National Interest and The Public Interest magazines, the two parts of Irving Kristol’s very modest publishing empire. The Washington Monument was in its usual place outside my office window, as were, of course, the streets below. Once it had become clear that an attack was in progress, national and local media assumed a slightly manic tone. Most private offices reacted by letting their staffs go, resulting in near-instantaneous gridlocked mayhem throughout the downtown part of the city. That rendered it very difficult for emergency and police staffs to evacuate the Congress and other possible near-term targets and otherwise do their critical jobs.
The headless-chicken reaction of the managing denizens of the nation’s capital to the 9/11 attacks disgusted me. Having lived for a while in Israel, I found it second nature to assume a stoical mien at times like these, lest one contribute to a terrorist enemy’s designs. Just as no one can make a person feel inferior without his or her consent, as Eleanor Roosevelt once observed, no one can terrorize you unless you cooperate. I was against cooperating, so I ordered my staff to stay put, do a day’s work and go home as usual. Of course we would use the phone to assure relatives and friends that we were safe, and we would monitor the news; if necessary, we would adjust to further events. We all stayed until past five p.m., emerging thereafter for the evening commute into a virtual ghost town.
I have often thought of those first few hours after the 9/11 attacks during the past decade, and I have come to realize that the basic error U.S. leaders made in responding to 9/11 was inadvertently to cooperate with an enemy too weak to achieve its ends in any other way. To me, 9/11 did not “change everything”, and would not do so unless we were foolishly complicit in making it so. It was natural to rehearse worst-case scenarios, like al-Qaeda with weapons of mass destruction. I remembered well the Hart-Rudman Commission reports, which had been released just that March and for which I had served as chief writer.2 But I thought that, whatever our private analysis might be, the public face of American leadership should radiate optimism and courage, not anger and fear—and I believed that optimism to be fully justified, despite some novel vulnerabilities, by the state of the world in the decade after the Cold War.
Of course, in what was a miasma of intelligence deficits about al-Qaeda and similar threats, we needed to prevent follow-on attacks to the extent possible. That, it seemed clear to me, as it did to the vast majority of Americans, meant urgently removing the obdurate Taliban regime in Afghanistan that had sheltered and abetted the 9/11 plotters. We also needed to neutralize in due course, in one way or another, those who might be planning more attacks, wherever they might be. The Bush Doctrine version 1.0, rolled out in the immediate wake of the attack, held regimes complicit with terrorism to be equally liable for retribution against it. This was entirely appropriate, in my view.
Nor did I object to President Bush terming the situation a war, for that was necessary to tell the American people what was at stake, and there were certain presidential legal authorities that it was wise at the time to make available. I wished he had coupled that declaration with specific ways in which a psychologically mobilized American people might sacrifice for the common welfare; instead, he asked us to go on vacation to show that everything was still normal. In precisely that vein I was sure that we should not take ourselves psychologically hostage, as the Carter Administration had allowed after Iranian fanatics seized the U.S. Embassy in Tehran in November 1979. We should not allow the attacks of 9/11 to define or monopolize U.S. foreign policy as a whole.
Alas, that’s exactly what we did allow to happen. Indeed, the Bush Administration appeared to insist on it. The only senior U.S. leader who seemed, almost imperceptibly beneath the roar of a defiant White House and Defense Department, to take the approach I thought most wise was Colin Powell, whose influence had already been somewhat marginalized in the Administration. When we spoke in his seventh-floor State Department office just after July 4, 2003, about my coming to work for him, he told me frankly that there was disagreement within the Administration, not about first-order principles but about second-order ones: He did not believe the terrorist threat was of an existential nature. It did not require an extraordinary cashiering of American strategic thinking, allies or institutions. Others thought differently, he averred, and personal relations had become cool in some cases, testy in a few others. (He told me that a lot of my friends—and he knew who they were—elsewhere in the Administration and on its margins might never forgive me for working for him, and I told him that I understood. He hired me.)
In reaching my conclusion about the scope and nature of the threat we faced, and how we should respond to it, I relied on the thinking that had produced my first essay as editor of The National Interest, written some months before the 9/11 attacks. In that essay I tried to integrate the leading economic, geopolitical and moral-philosophical currents of the day into a single picture of America’s position in the world. It seemed to me that “globalization”, the “unipolar moment” and “the end of History” each captured part of a complex truth, but that each by itself was too limited in purview. Taking them together, I hoped, would transform these two-dimensional observations into a three-dimensional heuristic of greater intellectual power. I saw opportunity looming larger than danger for 21st-century America, which is why I called the essay “The Present Opportunity”, a deliberate play on the mid-Cold War organization called the Committee on the Present Danger (CPD).3 Having been sired as a liberal Cold War hawk, having worked briefly for Scoop Jackson on strategic arms control issues, and having met Paul Nitze and supported the CPD’s work in the late 1970s, I meant the title as my way of saying that while hawkish policies were justified in the face of Soviet power and ideology—they worked, after all—it did not follow that the same policies made sense in the absence of that adversary.
I worried more that the overhanging and largely unexamined habits of the Cold War would encourage hubris and unilateralist overreach in U.S. foreign policy. As I had written elsewhere and earlier about the promiscuous deployment of U.S. combat forces after the 1991 Gulf War in the Balkans, Somalia, Haiti and elsewhere, we needed to distinguish more carefully between vital and marginal interests, and to keep our military forces and prestige away from the latter so they could be marshaled in the long-term service of the former.4 Otherwise, I argued, we needed to engage in maximum feasible preventive diplomacy and at the same time deal effectively with “problems so dangerous that we have no choice but to face them”, of which the “most grave is the threat of mass-casualty terrorism.” There was much resentment in some parts of the world at the onslaught of what were perceived as made-in-the-USA socio-economic pressures on traditional patriarchal orders, and the resentful were no longer powerless to reach out and slap someone, to parody a then-current telephone service advertising slogan. Taking a page from Hart-Rudman, I wrote that mass-casualty terrorism is not a problem off to the side of U.S. national security policy but one central to it, for the vulnerability of the American homeland, left unredressed, will severely reduce the flexibility, credibility and political base of any activist U.S. foreign policy.5
As it turned out, of course, we did not redress that vulnerability in time, and we suffered the consequences. After 9/11, the politics of the matter were such that the Bush Administration had no choice but to create a Department of Homeland Security. Had 9/11 not happened, the Cheney committee mentioned in note 2 almost certainly would have buried the idea in the deepest, darkest Old Executive Office Building dungeon it could find. As it was, the Administration created a new Executive Branch department exactly as the Hart-Rudman report advised us not to: too large, too layered and over-centralized, yet underfunded and without control over the bulk of the government’s homeland security budget. DHS remains today a ponderous and largely dysfunctional monstrosity whose ample teats appeal only to legions of thirsty but mostly ineffectual contractors. DHS is what happens when those who think “government is the problem” are forced to create a governmental organization.
The real tragedy of the post-9/11 U.S. reaction does not reside in the malformation of DHS, however. It resides in the fact that the blindness of U.S. intelligence to the terrorist threat, an avoidable calamity inherited by the Bush Administration from its eight-year-tenured predecessor, led most senior Administration officials to fear what might come next. Fear, as Elena Bonner once said, “gives bad advice.”6 As neuroscientists describe it, the human brain is wired to make instant judgments about the sensory data it receives so that the data can be filed in accordance with some model based on prior experience. When uncertainty prevails, the fear-oriented centers of the brain light up, propelling a rush to accept the first conclusion that can assuage the fear.7 The rush to closure was greatly magnified by the anthrax episode of October 2001, which alarmed Vice President Cheney and his staff in particular. Based on what turned out to be misleading intelligence about an Iraqi smallpox bioweapons program, Cheney wanted to re-inoculate the entire U.S. population against the disease. President Bush agreed only to inoculate the U.S. armed forces.8
Worried deeply, as patriots and public servants, about their ability to fulfill their oaths to keep America safe and secure, Administration principals rushed to closure by accepting the only ready-made theory to hand, widely ascribed to so-called neoconservatives, of why 9/11 had happened: A democracy deficit in the Arabo-Muslim world had forced frustrated citizens into the mosque, where they had been easy prey for religious charlatans and demagogues. The answer was to open up space for dissent, democratic debate and the social balm supposedly provided by market economics.9 Then these stultified societies could breathe, develop normally, and thus not produce demonic mass-murderers like Osama bin Laden.
Thus did fear boomerang, in the way that human emotions predictably do, into hubris. Fear posed the disturbing “what if” questions, and hubris provided the psychologically reassuring “do that” answers. The 9/11 attacks had the effect of propelling U.S. policy to do more at a time when its capacity to influence events had diminished, thanks to the end of Cold War bipolarity and the diffusion of lethal technologies to weak states and non-state actors. It propelled the United States to ramp up its metabolism and inflate its definition of vital interests rather than calmly discern distinctions among them. Unrivalled U.S. power, preeminently though not exclusively military power, would end the threat by transforming the political cultures of more than two dozen Arab- and Muslim-majority countries into liberal democracies.10 This solution in turn depended on the validity of democratic peace theory—the idea that democracies do not make war on other democracies—and on cherished Tocquevillian views about the pacific nature of egalitarian democratic societies.
Contrary to what many have since assumed, this theory of the sources of 9/11 existed in the President’s head well before the Iraq War began. Indeed, in February 2003 President Bush gave a major speech at the American Enterprise Institute in which all the basic themes of this view found expression. That, essentially, is when the Bush Doctrine version 3.0 was unveiled to the world. (Version 2.0 was characterized by the pre-emption plank famously inserted into the September 2002 National Security Strategy.) But what became known as “the forward strategy for freedom” did not find full expression from the bully pulpit until November 2003, with the President’s marquee speech at the National Endowment for Democracy (NED), and then in his Second Inaugural of January 2005, which Tom Wolfe aptly dubbed the globalization of the Monroe Doctrine.11 The worse things got in Iraq, the higher the rhetorical bar rose—a classic case of cognitive dissonance at work in what in blackjack is colloquially called “doubling down.”
The rush to closure over a fearful shock to U.S. security interests, and the hubristic response to it, was part of a longstanding pattern in American foreign policy history. The Bush Administration’s reaction to 9/11 was not the work of any neocon cabal. Self-avowed neoconservatives composed a group that was always smaller, more internally diverse and less influential than is often supposed. Rather, the neocons struck chords very familiar to American history and political culture, chords that national interest conservatives like Vice President Cheney and Defense Secretary Donald Rumsfeld could harmonize with. Had there been no neocons, the pattern would have asserted itself anyway, albeit in some other kindred idealist dialect.
The pattern of which I speak, well limned by the historian Walter McDougall, consists of four phases that tend to repeat in long cycles. First, there is a shock to the system, usually in the form of a surprise attack: the shot fired at Fort Sumter in April 1861, the sinking of the Maine in Havana Harbor in 1898, the sinking of the Lusitania in 1915 followed by the interception of the Zimmermann Telegram in 1917, Pearl Harbor in 1941, 9/11 in 2001. In the phase directly after the shock, the leader of the day—Lincoln, McKinley, Wilson, FDR, George W. Bush—vows to resurrect the status quo ante and punish the evildoers. That corresponds to Lincoln’s vow to save the Union, Wilson’s vow to defend the right of American free passage on the high seas, FDR’s pledge to restore American security and punish Japan’s perfidy, and of course George W. Bush’s vow to find and punish the perpetrators of the 9/11 attacks. But third, in the course of mobilizing the national effort to achieve the limited goals set after the initial shock, the transcendent God-talk begins and the effort becomes enmeshed in the sacred narrative of American exceptionalism. This leads to a distension of goals and expectations, and to what cognitive psychologists call a dominant strategy that is impervious to negative feedback and logical contradiction.12
That is how we get the Emancipation Proclamation and “The Battle Hymn of the Republic” only after several very bloody battles in the Civil War. It is how we get McKinley’s 11th-hour determination to make Christians out of the Filipinos (though they were already Catholics…), despite the strategic irresponsibility of annexing the Philippines. It’s how we get Wilson’s determination that a “war to end all wars” would make the world “safe for democracy”, and that the right to national self-determination should mandate the destruction of the Ottoman and Hapsburg Empires regardless of the geopolitical consequences. It’s how we get the belated conclusion during World War II that the coming Allied occupations of Italy, Germany and Japan should lead to their democratization, and that the dissolution of the colonial empires of America’s European allies was consistent with the war’s larger purpose, again regardless of geopolitical and other consequences.
And so, in the 9/11 decade, that is how we got a war against Iraq that destroyed the regional balancer to Iranian hegemonism and did not even stop to ask about the broader implications of a Shi’a government in Baghdad. One does not, apparently, descend to the smarminess of geopolitics when one is doing the Lord’s work. So too did we turn what should have remained a punitive military operation in Afghanistan into, first, an occupation and, then, a quixotic, distracted, chronically underfunded, diffusely managed and thus hopeless nation- and state-building campaign. And so did we conflate all our adversaries into one monolithic demon—typical of eschatological thinking (of which more in a moment)—as the Administration conflated secular, Ba’athi Iraq with apocalyptic Muslim fanatics. So did we make war against a country whose threat to America was not, as is now (far more than it was then) commonly claimed, zero, but which hardly justified, or excused, the haste and threadbare planning with which the war was launched and conducted.13
Then, in the fourth phase, overreach leads to setbacks (the Korean War, for example, and the Iraq insurgency), regrets (the Vietnam War, for example) and second thoughts (even Theodore Roosevelt quickly came to see the annexation of the Philippines as a mistake). Phase four ultimately results in at least temporary retrenchment, until the cycle starts again—each cycle, it being understood, more robust than the one before with the growth of American strength. This four-phase model fits the 9/11 decade to a tee. The 9/11 attack itself of course marks phase 1; the Bush Doctrine version 1.0 represents phase 2; the Second Inaugural signals the full efflorescence of phase 3; and the election of Barack Obama on an anti-Iraq War platform marks the consolidation of phase 4.
It matters in all this, however, whether the ideational vehicle that propels phase 3 into being even remotely reflects reality, at least insofar as the burst of American idealism can be sustained by it. In the case of the 9/11 decade, unfortunately, it did not. There were basically two problems with it. First, the forward strategy for freedom’s ascription of causality for Islamist terrorism was mistaken. Second, even had it not been mistaken, the timetable on which democracy promotion could plausibly do its work and the timetable on which mass-casualty terrorism had to be countered did not even begin to match up.
The reason for this is that, despite President Bush’s assertion that democracy promotion is “the work of generations” and that democracy is about more than elections, the Administration did not actually behave on that basis. It rushed into premature elections in Iraq, Lebanon and the Palestinian territories, with troublesome and still open-ended consequences for Iraq and disastrous ones for Lebanon and Gaza. Most of its principals seemed genuinely unaware that certain cultural attitudes long in the making underlay the democratic institutions of Britain, the United States and other liberal democracies, and thus really believed that creating democracies where there never were democracies before would not be all that difficult.
Indeed, just as a fish is the last to discover water, the American political class generally seems to think that democracy is mainly a technical exercise concerned with forming political parties, ensuring press freedoms, and voting, rather than the attitudinal embodiment of a specific historical experience that, for any practical purpose, falls far short of being universal. Its members suppose that with enough money and goodwill any political or social problem anywhere in the world can be solved, and we know it is solved when those fortunate enough to be the object of our beneficent attentions start acting, well, like us. Since money and goodwill alone manifestly cannot solve core domestic social problems in the United States, it is a wonder that anyone could believe they can do so in places like Afghanistan or Iraq, some 7,000 miles away. But believe it they did, and still do.
What is it about their own democracy that most Americans find so hard to understand, so that they dramatically underestimate the obstacles to producing it abroad? And what is the matter with the underlying thesis of the “forward strategy for freedom”, whose development and denouement shaped directly or otherwise most of the past ten years? Let us take these matters in turn.
Relatively early in the 9/11 decade I interrogated the sudden American interest in and demand for Arab democracy as “the impossible imperative?”14 My conclusion was that, no, of course Arab democracy is not impossible, just very difficult to kick-start or sustain. This meant that, again given the timelines implied by the level of difficulty, democracy promotion was neither necessary nor sufficient as a policy fix for the terrorism problem. My view, and not only my view, remains that the terrorism threat needs to be addressed on its own terms, and that democracy promotion, as a patient, opportunistic effort rather than a mad-dash emergency plan, needs to proceed on its own terms as well; the two should not be mashed together.15 My aim in this interrogation of Arab democracy, however, was more important than my conclusion, for it aspired to, in effect, actually get fish to discover water.
As already noted in passing, even before the Iraq War, before the President’s November 2003 NED “forward strategy for freedom” speech, before his Second Inaugural, and before we knew that the WMD rationale for regime change in Baghdad was wanting, it was clear that the Bush Administration already saw the creation of Arab democracy as a strategic imperative and as a logical political accompaniment to the policy of military pre-emption. The basic idea was that in addition to pre-empting missiles and madmen we would in due course pre-empt motives as well.
There were, in my view, three problems with this idea—as I said at the time, “the first two very serious and the third even more so.” The first was that the assumption that the democratization of the Arab world would give rise to a “peaceful swath” in the Middle East, as William Kristol put it in the Weekly Standard, was probably wrong; young, under-institutionalized democracies have proved quite bellicose historically.
The second was that the aspiration presupposed so vast a shift in policy as to beggar imagination: “If we do suddenly begin to act as though our long-time authoritarian allies are really enemies blocking the democratization of their countries”, I wrote, “we will, in effect, be choosing bad relations with ten mostly well-entrenched regimes, without any reasonable near-term prospect of replacing them with democratic governments.”
But it was the third problem that mattered most: Could we do it?
The contemporary American understanding of democracy and its connection to how societies function, if one can call it an understanding at all, differs markedly from that of the Founders and their tutors. Locke, Montesquieu and even Rousseau never believed that all social and economic virtue depended on the adoption of a particular form of government, a notion that Samuel Taylor Coleridge ruefully called the “talismanic influence” of government over “our virtues and our happiness.” They saw things the other way around: A particular form of government was the consequence of a people’s long and refined moral, social and historical experience. They understood that Americans had suffrage because American society was democratically minded, not the other way around. Democracy, as Jefferson, Madison, Adams and the great lights of the American Founding generation saw it, depended on certain dispositions long in the making, most of which, they intuitively understood, came out of Protestant religious culture. Three such dispositions are critical requisites for a democratic political culture.
The first is that the citizenry believe that the proximate source of political authority is intrinsic rather than extrinsic to society—“of the people, by the people, and for the people” as opposed to some variety of divine law. The second is that they accept the idea that at least a certain subset of citizens (propertied males in the late-18th century scheme of things) are equal before the law, and that law trumps persons. The third is that they have a concept of majority rule.
Without the first disposition, the ideal of pluralism, of a “loyal opposition”, of the utility of honest doubt and hence the value of open debate, cannot exist. Without the second, a polity can be neither free nor liberal, neither meritocratic nor accountable. Without the third, the idea of elections literally makes no sense. When elections are held under conditions in which these three dispositions are weak or absent, one essentially has a democratic form without democrats to fill it in, and one will predictably get outcomes in line with what Samuel Huntington once called “the democracy paradox.”
In most of the Muslim world and the Arab world in particular these dispositions are weak on account of historical factors peculiar to the region. In brief: A belief in extrinsic sources of authority has been ratified by Islamic principles; the legitimacy of social hierarchy makes the idea of impersonal, formal equality before the law difficult to accept; and the tribal/clan tradition of consensus decision-making in most of the countries of the region makes the concept of winner-take-all majority rule almost incomprehensible. Others add that the very idea of representation, necessary to move from village-scale to national democratic polities, is weak in Arab culture. Arab social relations are concrete and highly personal. Abstracting them so that one person can stand in for ten or a hundred or a thousand others in a legislative setting is not a concept that comes naturally to every culture.16
This does not mean that Arab democracy is an oxymoron, that to be democratic requires being culturally Western, and certainly it does not mean that there is anything “wrong” with Arabs either cognitively or morally. There are genuine theological predicates for democracy within Islam, and there have long been genuine Arab democrats as small (or very small) minorities in every Arab country. Cultures change, and these days, arguably, they can change faster than ever with global communications connectivity. The events of this past winter and spring in the region strongly suggest as much.
But that is very different from President Bush’s secularized pseudo-religious view that democracy (and free markets) is the natural default condition of all humanity. This view, that all people at all times value freedom over every other value, including a degree of basic civic order that we take for granted but others cannot, is based on zero historical or social science evidence. The related view, that merely removing artificial obstacles to liberal institutions can therefore bring stable democracy into being fairly quickly, as occurred after the Berlin Wall fell and the Soviet Union collapsed, is similarly without any support in history or social science. These views are, plainly put, matters of faith—which brings us directly to the sources and flaws of the “forward strategy for freedom.”
After 9/11, Americans (and others, of course) searched for analogies that might help them understand the motivations for the attacks. But in their rush to closure, few searched very far, and in any event most had but modest reservoirs of historical analogies in which to search. (America seems to be the only place in the world, after all, where the phrase “That’s history” means that’s irrelevant.) Americans tended almost exclusively to choose Cold War metaphors to explain 9/11. Liberal idealists took their characteristic meliorist approach: It was poverty and injustice that motivated 9/11 and American policies that determined the target. There were dozens of calls for a “Marshall Plan for the Middle East” and hundreds of pleas to concentrate more than ever on solving the Arab-Israeli conflict, as if that were somehow a magic bullet that could fix all problems.
Conservative idealists, a.k.a. neocons, took the aforementioned democracy-promotion approach, arguing that the gist of the motivation was not economic but political. The Administration’s rhetoric went even further in due course, however, suggesting that U.S. policy itself was largely responsible for the debased condition of Arab political cultures. When President Bush famously said in November 2003: “Sixty years of Western nations excusing and accommodating the lack of freedom in the Middle East did nothing to make us safe, because in the long run stability cannot be purchased at the expense of liberty”, he argued in essence that it was U.S. policy, not the political culture of the region incubated over millennia, that accounted for Arab autocracy. By making this argument, the Bush White House, in essence, adopted the wrongheaded left-wing side of an old debate over “friendly tyrants”, a very strange position for an avowedly conservative administration to take.17 The President also seemed to be saying, in a locution repeated by Secretary of State Rice in Cairo in June 2005 and many times thereafter, that U.S. Cold War policy in the region was unsuccessful on its own terms, that it did not in fact provide safety and stability.
This claim is nonsensical by any realistic measure. U.S. Cold War policy in the Middle East achieved exactly what it set out to within the broad framework of Containment: It kept the Soviets out, the oil flowing to the benefit of the liberal economic order over which the United States stood guard, and the region’s only democracy, Israel, safe. The record was not perfect, of course, and we certainly should have rethought old habits sooner than we did after the Berlin Wall fell, but it was good enough, as we say, for government work. Besides, it was never in the power of the U.S. government to bring about democracy in the Arab world during the Cold War. Yet the Bush Administration’s solution to the problem whose origins it misread was just that: deep-rooted reform of the Middle East’s sordid collection of autocracies and tyrannies (the major differences between the two were summarily overlooked)—and, absent movement toward reform from within, the policy strongly implied, pro-democracy regime change imposed from without by force of arms at times and places of Washington’s choosing.
Both of these Cold War analogies, liberal meliorist and conservative democratist, were mistaken. It is true that social injustice and political repression enable apocalyptical terrorism, but they are not its source. The core cause is the longstanding inability of most Muslim—and especially Arab—societies to adapt to the growing pressures of modernization, some pushing from the outside in the form of Westernization, some pulling from changes within that express themselves in demographic data on age cohort size, urbanization and literacy.18 These pressures tend to splinter traditional societies, notably those characterized by endogamous marriage patterns (those we generically and often too loosely call “tribal”), into three groups: assimilationists, who accept the authority of materially superior new ways; integralists, who try to adapt new ways to traditional values; and nativists, who almost invariably deploy religious symbols and authority against what they construe to be deliberate assaults on their corporate identity and individual dignity.19 The latter description fits al-Qaeda perfectly.
There is nothing particularly Arab or Islamic about this phenomenon. The chiliastic religious violence of nativists resisting pressures to change dots world history from the Jewish zealots of the 1st century CE to the anti-Mongol White Lotus movement of 14th-century China to the Peasants’ Revolt of 16th-century Germany to the Taiping Rebellion in 19th-century China, and that is not all. Even if American observers forgot or never knew anything about the Mau Mau uprising in 1950s Kenya, they certainly might have seen in the 19th-century Ghost Dances of American Indians, right here on North American soil, manifestations of the same basic social-psychological dynamics. But those with Cold War analogies on the brain, who hastily jumped to confusions, so to speak, failed to note any of this.
The result was almost breathtakingly paradoxical as well as tragic. The more the “forward strategy” bore down on the Middle East, with guns in Iraq and with projects and programs galore practically everywhere else they could gain access, the more effectively local nativists were able to translate Western energies, jujitsu-like, to gain leverage over their domestic adversaries. Rapid economic growth and rapid democratization, even had they been possible, would not have stabilized Arab societies and made them less likely to spark off political violence against the West; they would have made such violence more likely, as anyone who understands Schumpeter’s term “creative destruction” knows. We are fortunate, therefore, that the strategy did not persist and “succeed” for any longer than it did. And certainly, latter-day claims by unreconstructed neoconservatives that the Arab Spring should be credited to George W. Bush’s “forward strategy” would be merely risible if they were not also so deformed.
The West has been a prop in what is predominantly a civilizational argument among Muslims—so it was with 9/11, so with the famous Danish cartoons episode of 2006, and so with much else since. When the Bush Administration campaigned to spread democracy, it never occurred to most of its principals that what it saw as a secular endeavor would be interpreted in the Muslim world through a religious prism and used accordingly in civilizational disputes. When Abu Musab al-Zarqawi, the late leader of Al-Qaeda in Iraq, tried to persuade Iraqis not to vote because “democracy” was a front, in essence, for Christian evangelism, a slippery slope leading down to apostasy, he spoke a language that resonated in the ears of a great many (though happily not a majority of) Iraqis and other Muslim Arabs.
As it happened, the locals were essentially correct about this. We Americans were speaking a creedal tongue that we thought entirely separate from “religion”, a word that does not exist as such in Arabic. After all, we “separate church from state.” In truth, American political culture is not as secular as most Americans think it is: The contemporary American idea of democracy is an attenuated expression of aspects of Anglo-American Protestant Christian tradition.20 Our longing to spread it to the Muslims is the 21st-century version of what was, in the 19th-century, a much more honest and self-aware missionary movement.21 We might fool ourselves by pretending that our deepest beliefs can be compartmentalized into what is “political” and what is “religious”, but Middle Easterners, who possess no such compartments by dint of a history without Renaissance or Reformation, know better. Not that theology and ideology are identical; they differ in important ways. But as creedal systems they are bound to be seen as dramatically less distinct by cultures in which political theology, to use Mark Lilla’s apt terminology, has never been vanquished or, in most countries, even seriously challenged.22
The upshot is a demonic version of O. Henry’s “The Gift of the Magi”, where parallel but separate behaviors lead not to serendipity and bliss, but to violence and disaster. As I put it before, “by threatening and weakening the very Arab and Muslim state elites which we need to contain these [apocalyptical terrorist] movements, we make the prospect of the violence worse. By implying that we are politically and morally superior to them . . . we help nativists in their internal struggles with those who are objectively our natural allies.”23
That the locals in the Middle East were correct to see U.S. policy in creedal, really religious, terms can be subjected to a kind of litmus test, one residing in distinguishing between two ways of thinking about American exceptionalism. One version of American exceptionalism depends on what we might call historical fortuity: The country is singular and superior because a unique concatenation of religious civilization, a near virgin continent and an extraordinary generation of wise men made it so. God may have been behind all this for all anyone knows, but in fact nobody does know. The other version depends on divine favor and guidance: The American experience is a reflection of God’s grace and His plan for mankind. It is easy to tell from the rhetoric of any given President which version of American exceptionalism he avows. George W. Bush avowed the second version. It was no random slip that soon after 9/11 he employed the word “crusade” in his rhetoric, apparently having no idea of the word’s etymology.
Looking at U.S. behavior in the 9/11 decade as a manifestation of a secularized political theology actually explains far more than the standard parsing of the usual-suspect schools of thought—conservative and liberal realists versus conservative and liberal idealists, Jacksonians and Hamiltonians and all that. Consider for example that when, only days after 9/11, Susan Sontag and other members of the professional adversary culture in the United States dared to suggest—in the New Yorker in Sontag’s case—that, pace President Bush, the perpetrators of 9/11 were not cowards and that Americans were not innocent victims of terrorism but rather were suffering just revenge for selfish and abrasive American foreign policies, they were treated exactly as heretics were in the so-called age of religion.24 They were not to be engaged and debated, only shunned or excoriated. Had it still been in style, I have no doubt that we would have heard calls for them to be burned as witches.
Indeed, America at war after 9/11 became, in the late Michael Kelly’s words, “a secular evangelism, armed.”25 This resonated in perfect pitch with a whole raft of earlier observations—Reinhold Niebuhr’s delicately sarcastic description of America as “tutors of mankind in its pilgrimage to perfection”, G.K. Chesterton’s quip that America is “a nation with the soul of a church”, and most dramatically and sincerely of all, Herman Melville’s declaration in White-Jacket (1850), “We Americans are the peculiar, chosen people—the Israel of our time; we bear the ark of the liberties of the world.”
The use of the political theology prism also helps to explain the nature of neoconservatism, as that term was understood as a foreign policy orientation in the 9/11 decade. Neoconservatism overwhelmingly involves secular American Jews. There have been important non-Jewish neocons—Daniel Patrick Moynihan for a time, Jeane Kirkpatrick, James Woolsey—and a few Jewish neocons are well-educated and traditional practicing Jews. But for the most part neoconservatives are non-observant Jews who exemplify the broad modern tendency for religious energies to attach themselves to politics and, in this case, illustrate the switching out of the realism-inducing awareness of Jewish history for the heroic, idealistic sagas of modern Zionist and American history. At its extreme, neoconservatism is the secularized and displaced yoking of the Jewish messianic ideal to American tanks and fighter aircraft.26 As such, the best definition of it that I have been able to devise is this: Neoconservatives are realists for whom all utopian ideologies are anathema except their own. This of course aligns with the truth that believing theists invariably reject the highly improbable sacred narratives of all religions except, again, their own.
Foreign policy in America lives in the eschatological dimension of what is anyway a heavily moralistic domestic political environment. That is why, with and mostly without neocons, we seem to generate so many foreign policy “doctrines”—in the 9/11 case not just one Bush Doctrine but, by some counts, as many as five. That is why we are able to so deftly deposit real tragedy into a sacred narrative in which imputed meaning and supposedly cosmic purpose soak up the blood and ease the pain: think the Civil War, for example. America is not the only liberal democracy to attach moralistic baggage to its foreign policy. There was the British “white man’s burden” and the French mission civilisatrice, of course. But in our case the baggage contains our work clothes, not our toiletries and make-up kit.
Note, too, that whenever American exceptionalism is propelled into foreign policy thinking, its characteristic brand of Enlightenment universalism crowds out any serious appreciation of cultural differences. This usually leads to an almost willful insistence on not studying the history, not learning the language, and not actually listening to what intelligent people in Iraq and Afghanistan, with troves of precious metis (local knowledge), try to tell us. That, in turn, abets frustration and failure. It is at times like these that one wants to seek counsel not from pundits or scholars, but from poets. I am thinking of Robert Burns: “O would some Power the gift to give us, to see ourselves as others see us.”
Of course, the problems of the 9/11 decade have not flowed only from generative intellectual errors. There was plenty of old-fashioned poor prudential judgment on the part of individuals, a lack of attention to governance design issues well beyond the DHS mess, and bureaucratic screw-ups. All these forms of dysfunction came together in the Iraq War.
L. Paul Bremer’s Coalition Provisional Authority (CPA) decisions to disband the Iraqi military and deeply lustrate Ba’athism were each bad enough on their own, but in tandem they were calamitous. They brought together people who knew how to organize with those who knew how to blow things up.27 The President’s decision-making style, which failed repeatedly to sort out policy differences among the White House, the CPA, the Office of the Secretary of Defense and the uniformed military, was best described to me by Richard Perle as “maddeningly episodic.” Partly as a consequence, the U.S. government as a whole demonstrated mass ineptitude at interagency coordination and evinced little appreciation that new policies cannot easily (or at all in most cases) be implemented without new governmental design structures. At the time of the President’s November 2003 NED speech, for example, there were 16 separate democracy promotion programs in the U.S. government, with no one in charge and no functional budget to track the effort as a whole. Administration principals often failed to realize, too, that abstract policy decisions are not self-implementing; they cost money that has to be gotten from somewhere.28
The U.S. intelligence community, above all, has been responsible for much of the grief of the past decade. It failed to prevent the 9/11 attacks (and they were not inevitable). The CIA-gone-operational botched the battle of Tora Bora, failing to put Osama bin Laden and Ayman al-Zawahiri out of business as early as December 2001. The community botched utterly the WMD portfolio on Iraq and, perhaps even more damaging, failed to assess how Iraqi society would react to its political decapitation.
In retrospect, it is something of a miracle that the Iraq adventure did not turn out even worse than it has (though, of course, its being off the front page does not mean that the drama is over). Things could still go badly. Note finally on the Iraq portfolio that the contention that it was George W. Bush’s “surge”, implemented against the advice of America’s foreign policy sages, that turned the tide in Iraq is not in any simple sense accurate. The rising of the Sunni tribes in Al-Anbar had more to do with the amazingly stupid and counterproductive behavior of al-Qaeda than it did with the “surge.”29 We probably did not deserve this good fortune, and so we are forced to conclude, yet again, as the old saw has it, that “God protects fools, drunkards, and the United States of America.”
That said, the United States will not emerge without lingering debilities from the 9/11 decade. One mishandled war in Afghanistan and one nearly disastrous one in Iraq not only cost more than $1 trillion; they exacted steep opportunity costs in terms of America’s grand strategy. For nearly a decade the government has been pouring resources into the U.S. Army and urging major doctrinal changes in how those resources are used—whether for good or ill remains a matter for debate. But there can be no debate that the U.S. Navy and Air Force, the main pillars of America’s global strategy of forward presence, have been severely undernourished during this decade. The real costs of the 9/11 decade, therefore, may well lie in the future.
The American penchant for seeing the world, especially the world of foreign policy and national security, in transcendental terms, is not an historical constant. It tends to rise in phase 3 of the cycle, when the God-talk emerges out of post-shock mobilization. But there is a concurrent trend of more recent vintage that may have made things more acute during the 9/11 decade.
Over the past half century America has become increasingly decultured, socially thinned and hollowed out. As Robert Putnam famously put it in his Bowling Alone argument, we have suffered an erosion of social capital, or what some call social trust. This is not the place to delve into why and how this has occurred. Suffice it to say that the face-to-face glue that enables social interactions to generate and sustain certain attitudes about what is and isn’t virtuous behavior—the very heart of what makes a society prosperous and happy—has been in ever shorter supply. As David Brooks has summed it up, the Left has told us that the ultimate aim of public policy is maximizing individual moral freedom and the Right has told us that the ultimate aim is maximizing individual choice through the marketplace. Both messages have been corrosive of social capital. Meanwhile, technology-driven phenomena have given us isolated ring-road malls that have destroyed the vibrancy of downtowns, machines that have reinforced class isolation by replacing bank tellers, gas station attendants, receptionists and more, a banking system that has divorced homeowners from lenders who know local mores and needs, and online networking, where like may talk only to like and which now bids to replace community gatherings. But what are the implications of deculturation for politics?
The decline of social trust tends to abet both the polarization of politics and popular cynicism about government. It produces a political system in which the chain of connective institutions that link family to neighborhood to larger community to town or region and ultimately to the national level gets broken.30 The state thus seems both alien and intrusive at the same time as it tries to compensate for a social fabric now rent and tattered. Political parties, particularly those that represent class or ideological structures, fill the spaces once occupied by a diverse array of social interactions. They become in-group/out-group oriented as well-known psychological dynamics spread the distance between them, leading to a considerable exaggeration of how much they actually differ in practical terms. The result is that compromise and horse-trading become more difficult, and the insertion of “culture war” issues into this environment has served only to harden the edges of the us-vs.-them distinctions that define it. Identity groups disguised as political parties do not play well together.
The implications for foreign policy are obvious. Presidential judgments necessarily become politicized, and opponents invariably try to criminalize them. Every decision becomes part of the catechism to the loyal, an act of moral enormity to the opposition. That underlying polarization helps explain why the Bush haters insist that the fact that the country has not been attacked again in the manner of 9/11 proves that the President exaggerated the threat for political reasons, and why Bush supporters insist that the reason we’ve not been hit again is the success of the bold policies the President ordered to protect us. That is why the acrid debates over Guantánamo and Abu Ghraib, waterboarding and warrantless wiretaps, the Patriot Act and the reach of wartime Executive authority, took on the tones they did. These arguments did not remind one of the civilities of the Common Law tradition; they read more like transcripts from the Spanish Inquisition.
It is also why probably a majority of self-described liberals and Democrats today believe that George W. Bush and his aides, including Secretary of State Powell, knowingly lied about there being WMD stockpiles in Iraq. When you dispute this claim (whether on the basis of access to classified information or not), many will look you in the eye and assert that there is no difference between knowingly uttering a mistruth (lying) and inadvertently doing so (being mistaken). Moral obtuseness on such a scale can only be explained by the imperviousness of religious convictions to evidence and logic.
The great sociologist E. Digby Baltzell, the man who coined the term WASP back in the 1950s, once said to me (in 1969 or 1970, I think it was) that the greatest tragedy of 20th-century America is that the volcanic energies of religion had migrated into politics, to the detriment of both. No wiser comment has ever been made about the trajectory of American politics this last half century or so, and here lies, I think, the key insight for those trying to comprehend the American 9/11 decade at its core. At its core, the 9/11 decade has not been about what others have done to America; it has been about what we Americans have done to ourselves, here in our transcontinental-scale, open-air church we call a country.
1Some weeks ago the editor of the American Review in Sydney, Australia, Mr. Minh Bui Jones, asked me for a combined personal and analytical essay on the 9/11 decade. He thought of me as a Washington insider, and so I had to explain that if I was, I was so only marginally. He was not to be deflected, however. So, as he did once before, when he asked me for a synoptic analysis of the Obama Administration’s foreign policy, he pressed me to write on a subject that proved hard to wrestle onto paper. Doing so was a worthwhile learning experience for me; this essay, too long for the space he had to hand, is what I wrote—since amended only briefly because of Osama bin Laden’s death on May 2, 2011. A shorter version, completed before this event, appears in the May/July 2011 issue of American Review (Sydney).
2The Commission’s recommendations for a Homeland Security Department, published in March 2001, were received coolly by the new Bush Administration, which consigned them to a committee chaired by Vice President Dick Cheney that was scheduled to report on October 1, 2001.
3“The Present Opportunity”, The National Interest, No. 65 (Fall 2001).
4See my “NSC-68 Redux?” SAIS Review (Winter/Spring 1999).
5“The Present Opportunity”, p. 159. I had earlier warned about the specific problem of Afghanistan, in “Afghanistanding”, Orbis (Summer 1999).
6Bonner quoted in the New York Times, December 6, 1991.
7This research is nicely summarized in David Brooks, The Social Animal (Random House, 2011), galley pages 248-9.
8First discussed publicly in Jacob Weisberg, The Bush Tragedy (Random House, 2008), chapter 6.
9A guild within American and British intellectual life has long been devoted to the foundational liberal principle that democracy and market economics are ever and always mutually reinforcing. They often have been thus, but “always” is a very demanding standard that reality cannot attain. See the Spring 2011 special issue of The American Interest on “plutocracy and democracy.”
10As many people have pointed out, one has to specify what form of neoconservatism one is talking about, for the original from the middle 1960s, of which Irving Kristol was “godfather”, bears almost no relationship to what neoconservatism had come to signify by 2001.
11Wolfe, “The Doctrine that Never Died”, New York Times, January 30, 2005.
12On the concept of a dominant strategy, see Robert Jervis, Perception and Misperception in International Relations (Princeton University Press, 1976), pp. 109-10, 134-35.
13On the national security threat that U.S. officials genuinely believed to be posed by Ba’athi Iraq, see Douglas Feith, War and Decision (Harper, 2008).
14“The Impossible Imperative? Conjuring Arab Democracy”, The National Interest (Fall 2002).
15See the column I co-authored with Francis Fukuyama, “A Better Idea”, Wall Street Journal, March 27, 2006.
16See in general Lawrence Rosen, Varieties of Muslim Experience (University of Chicago Press, 2008), and on legal representation, see his Law as Culture (Princeton University Press, 2006).
17See my “The Wrong Stuff”, The American Interest (Autumn 2005).
18Garfinkle, “Comte’s Caveat: How We Misunderstand Terrorism”, Orbis (Summer 2008).
19Demonstrated in Anna Simons’s two-part essay, “Making Enemies”, in the Summer and Autumn 2006 issues of The American Interest.
20See the brilliant essay by James Kurth, “The Protestant Deformation”, originally published in Orbis (Spring 1998), updated and extended in The American Interest (Winter 2005).
21Note my “Die bewaffneten Missionare”, Die Zeit, January 30, 2003.
22Lilla, The Stillborn God: Religion, Politics, and the Modern West (Knopf, 2007).
23“Comte’s Caveat”, p. 413.
24America’s literati were long practiced in making such arguments. Consider this extraordinary locution from Tom Robbins’ 2000 book Fierce Invalids Home from Hot Climates: “’It’s only natural,’ said Switters. ‘American foreign policy invites opposition. It invites terrorism. . . . Terrorism is the only imaginable logical response to America’s foreign policy, just as street crime is the only imaginable logical response to America’s drug policy.’”
25Kelly, “JFK v. Teddy: The Battle of the Kennedys”, Jewish World Review, October 9, 2002.
26See my “Bye-Bye Bush: What History Will Make of 43”, Foreign Affairs (March/April 2008), and Jewcentricity: Why Jews Are Praised, Blamed, and Used to Explain Just About Everything (John Wiley & Sons, 2009), pp. 124-25.
27I tracked the war as it was developing in three essays in National Review, dated November 25, 2002, April 21, 2003 and July 28, 2003, each drafted roughly three to four weeks before its newsstand date.
28Detailed brilliantly in Dov Zakheim, A Vulcan’s Tale: How the Bush Administration Mismanaged the Reconstruction of Afghanistan (Brookings Institution Press, 2011). Zakheim served as Comptroller of the Pentagon during the Bush Administration.
29See David Kilcullen, “Reading Al-Anbar”, The American Interest (September/October 2010).
30Brooks, The Social Animal, pp. 320-21.