Since the end of the Cold War, and especially in the wake of the September 11, 2001, al Qaeda terrorist attacks on the United States, the U.S. government has reexamined the utility of both nuclear deterrence and nonproliferation. The discovery after the 1991 Gulf War that Iraq, a signatory to the Nuclear Non-Proliferation Treaty (NPT), had secretly embarked on a huge nuclear weapons program prompted the United States to embrace counterproliferation: a set of initiatives designed to prevent hostile states from acquiring nuclear weapons short of war and, in the event of crisis or war, to destroy such weapons and their supporting infrastructure.
The 9/11 attacks a decade later prompted the proclamation of a new use-of-force doctrine calling for preventive military action against so-called “rogue states” seeking to acquire nuclear weapons. The doctrine reflected a loss of confidence in traditional nuclear deterrence: rogue states, it was believed, were irrational and might attack the United States directly or transfer weapons of mass destruction (WMD) to terrorist organizations. The global war on terrorism, highlighted by the preventive war against Iraq, thus became as much a war of counterproliferation as a war on terrorism.
The wisdom and necessity of preventive war as a substitute for nuclear deterrence are, however, highly questionable. The evidence strongly suggests that credible nuclear deterrence remains effective against rogue-state use of WMD, if not against attacks by fanatical terrorist organizations: unlike terrorist groups, rogue states have critical assets that can be held hostage to the threat of devastating retaliation, and no rogue state has ever used WMD against an enemy capable of such retaliation. Moreover, preventive war is not only contrary to the traditions of American statecraft that have served U.S. security interests so well but also anathema to many longstanding friends and allies.