I have been collecting assorted quotes and pulled out some pertinent ones about nuclear weapons. The challenge, as George W. Bush memorably put it, is that a president wouldn’t even have time to get off the “crapper” before having to make a launch decision, a decision that could be based on partial, contradictory, or even false information. Ronald Reagan, on assuming the presidency, was reportedly shocked to learn that he might have as little as six minutes to decide whether to launch. I highly recommend the movie A House of Dynamite for a sense of how these decisions are made and how little time is available to make them.
The sociobiologist E. O. Wilson described the central problem of humanity this way: “We have Paleolithic emotions, medieval institutions, and godlike technology.” The main challenge of the 80 years since the Trinity atomic test has been that we do not possess the cognitive, spiritual, and emotional capacities needed to manage nuclear weapons without risking catastrophic failure.
Technical flaws are inevitable in any complex system. Several years ago I read and reviewed Charles Perrow’s excellent book Normal Accidents. Perrow explains how, in complex and tightly coupled systems, failure becomes inevitable precisely because the layers of safety devices designed to prevent it add complexity and new ways to fail. [2] Just a couple of examples:
November 9, 1979 – NORAD False Alarm
A training tape simulating a massive Soviet strike was mistakenly loaded onto an operational NORAD computer. The system flagged a large-scale attack, prompting SAC, ICBM crews, nuclear bombers, and the National Emergency Airborne Command Post to go on high alert. Six minutes later, satellite data showed no real launch, and the alert was cancelled. The error was traced to a technician’s mistake. As senior State Department adviser Marshall Shulman later noted, such false alerts are not uncommon.
October 28, 1962 – Cuban Missile Crisis False Alarm
Radar operators in Moorestown, NJ, reported an imminent nuclear strike aimed at Tampa, FL, just before 9 a.m. The warning turned out to be a false positive: a test tape simulating a Cuban missile launch was being run at the same moment a satellite unexpectedly passed overhead. Overlapping radars that could have verified the event were offline, and the operators had not received the usual satellite-pass notification because the facility responsible for it had been reassigned to other duties. The alarm was quickly dismissed.
In 2007, John Rubel decided to write a short memoir (1) detailing his experiences with the grand plan for nuclear war. As Deputy Director of Defense Research & Engineering, Rubel was invited to a very secret meeting of the top military brass at which SIOP-62, the plan that determined how an attack would be met, i.e., the nuclear response, was presented and discussed. The plan was expected to cause hundreds of millions of deaths, a substantial share of the world’s population at the time (1960; I was 13 then). The briefing also analyzed the possibility of accidental firing of the Minuteman missiles and the options available to the president (none, really); as noted above, he or she would have only minutes to decide what to do. It amounted to genocide on a massive scale. “As Rubel prepared for his own death, he summoned the courage to express a long-repressed truth. That he felt remorse for having participated in such a ‘heart of darkness’ plan. For saying nothing for so many decades after the fact. What he was part of, Rubel wrote, was a plan for ‘mass extermination.’” [2]
SIOP-62 and the design of the Minuteman missile system had a commonality: “Both deliberately removed effective operational control from the President or any other civilian or even military commander in the event of a nuclear confrontation. And the Minuteman launch system design, a ‘detail’ not generally considered within the purview or even competence of high-level policy makers, invited the possibility of unauthorized or accidental mass launch of tens or even hundreds of nuclear-tipped missiles with little or no warning.” [3] Numerous military leaders argued for a pre-emptive strike rather than waiting for an attack, and they actively lobbied for it, mutual destruction be damned.
Rubel’s Doomsday Delayed demonstrates how strategic doctrine became hardwired into technological systems during the formative years of the Minuteman ICBM and SIOP-62. The Minuteman missile was praised as a survivable second-strike deterrent: hardened silos, solid-fuel propulsion, and near-instant launch capability. Yet these features also supported launch-on-warning logic and narrowed civilian decision space. The original launch architecture required only a limited “vote” among underground control centers to fire an entire squadron of missiles, emphasizing speed over deliberation. Early electromechanical vulnerabilities further exposed the risks of unintended launch.
SIOP-62 compounded this rigidity by offering essentially all-or-nothing strike options. The scale of projected casualties—hundreds of millions—was matched by the narrowness of political flexibility. Rubel’s central insight is that strategic systems can “determine policy by their very design.” [1]
The 1998 New England Journal of Medicine Special Report [7], issued not long after the fall of the Soviet Union, challenged the widespread assumption that the end of the Cold War eliminated nuclear danger. The authors concluded that U.S. and Russian nuclear forces remained on high alert and that launch-on-warning procedures persisted unchanged. They warned that aging Russian technical systems, deteriorating early-warning satellites, and declining morale among nuclear personnel had increased the risk of accidental or unauthorized launch.
The NEJM analysis emphasized that both countries maintained thousands of warheads capable of being launched within approximately fifteen minutes. Launch procedures allowed only a few minutes for detection, top-level decision-making, and dissemination of authorization. Such compressed timelines magnify the danger of false alarms, technical malfunction, or misinterpretation.
The NEJM authors also noted that even after the 1994 U.S.–Russian agreement to “detarget” missiles, no additional time had been added to the launch process; retargeting is simply part of the routine launch sequence. [5] Thus symbolic de-escalation did not materially reduce operational risk.
The Union of Concerned Scientists’ fact sheet documents numerous “broken arrows” and false warnings, including radar misreadings, computer errors, and hardware malfunctions. These incidents reveal that technical failure is not anomalous but recurring. As the U.S.–Soviet Accident Measures Agreement of September 1971 put it: “Despite the most elaborate precautions, it is conceivable that technical malfunction or human failure, a misinterpreted incident or unauthorized action, could trigger a nuclear disaster or nuclear war.” [4] The fact sheet notes that although multiple redundant safety mechanisms exist to prevent a weapon from detonating accidentally, there have been failures, and only luck has prevented an accidental explosion. Many accidents have occurred over the United States, but publicity is suppressed, so we rarely hear of them; we can only assume that other nuclear states have had a similar number of accidents despite their precautions.
The 1983 Petrov incident remains one of the most consequential examples. Soviet satellites detected what appeared to be incoming U.S. missiles. All systems pointed to an imminent attack. Lieutenant Colonel Stanislav Petrov judged the alert to be a malfunction rather than an attack. Subsequent investigation showed that sunlight reflecting off clouds had fooled the satellite sensors. His refusal to escalate the warning likely prevented a retaliatory launch.
The NEJM article reinforces this concern, identifying false-warning–triggered launch as one of the most plausible accidental-war scenarios. It specifically referenced the 1995 Norwegian rocket incident as an example of how ambiguous data nearly initiated Russian launch procedures under standard protocols.
On January 25, 1995, a Norwegian scientific rocket was mistaken by Russian radar for a possible submarine-launched ballistic missile. President Boris Yeltsin activated the nuclear briefcase for the first time in history. Only minutes remained before a response deadline under launch-on-warning doctrine when Russian analysts concluded the object posed no threat.
These incidents demonstrate how automated systems, when coupled with high-alert postures, compress decision time to a matter of minutes and elevate ambiguous data into existential threats.
The NEJM article moved beyond strategic analysis to model the public health consequences of an accidental intermediate-scale Russian submarine launch. The authors analyzed a scenario involving a single Delta-IV submarine carrying 16 missiles with multiple 100-kiloton warheads. They estimated that if 48 warheads detonated over eight major U.S. urban areas, immediate firestorm deaths could total approximately 6.8 million people.
The physical effects would include super-heated firestorms with near-100 percent lethality within several kilometers of each detonation, widespread fallout zones delivering lethal radiation doses within hours, collapse of sanitation and medical infrastructure, and likely epidemics of infectious disease. The authors concluded that secondary deaths from radiation and infrastructure collapse could exceed initial fatalities. (See Annie Jacobsen’s book for a detailed, minute-by-minute account of how such an attack would unfold.)
Health care systems would be completely overwhelmed. Most major medical centers in affected cities would be destroyed. The United States’ limited burn-care capacity—only about 1,700 beds nationwide—would be grossly insufficient. The NEJM authors emphasized that no effective medical response could meaningfully mitigate such destruction. Prevention, therefore, becomes the only viable public health strategy.
The NEJM report concluded that ballistic missile defense offers no reliable short-term solution and that de-alerting nuclear forces is both more feasible and more effective. The authors urged a verified bilateral agreement between the United States and Russia to remove missiles from high-level alert and eliminate rapid-launch capability. Similar recommendations have been advanced by the National Academy of Sciences, senior military leaders, and nuclear policy experts.
Rubel likewise credited civilian intervention during the Kennedy administration with correcting Minuteman vulnerabilities and expanding presidential control. [1] The UCS report similarly advocates reducing hair-trigger alert status to mitigate accidental launch risk.
These proposals share a common goal: extending decision time and restoring deliberative space to nuclear command systems. The logic is straightforward—if accidental or mistaken launch becomes physically impossible within minutes, the risk of catastrophe declines dramatically.
Across Cold War and post–Cold War contexts, a consistent pattern emerges. Nuclear systems are designed for speed. Warning systems are imperfect. Human error and technical malfunction are recurrent. Decision time is measured in minutes.
Rubel shows how doctrine became embedded in system design. The UCS documents repeated near-disasters. The Petrov and Norwegian rocket incidents illustrate how individual judgment narrowly prevented escalation. The NEJM assessment demonstrates that even a “limited” accidental launch would produce millions of immediate deaths and incalculable secondary casualties.
Nuclear catastrophe has been avoided not because systems are fail-safe, but because failure has thus far stopped short of irreversibility. As long as launch-on-warning postures and high-alert arsenals persist, the annual probability of accidental war—however small—remains nonzero. Over time, such probabilities accumulate. [9]
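To make concrete what it means for such probabilities to accumulate, consider a simple compounding calculation. If we assume, purely for illustration, a fixed and independent annual probability p of accidental nuclear war (the figures below are made up and are not taken from Avenhaus et al., the NEJM report, or any other source cited here), the chance of at least one such event over n years is 1 - (1 - p)^n. A minimal sketch:

```python
def cumulative_risk(annual_p: float, years: int) -> float:
    """Probability of at least one event in `years` years, assuming an
    independent probability `annual_p` of the event in any given year."""
    return 1 - (1 - annual_p) ** years

# Purely hypothetical annual probabilities -- not estimates from Avenhaus
# et al., the NEJM report, or any other source cited in this post.
for annual_p in (0.001, 0.01):
    for years in (10, 50, 80):   # 80 years is roughly the span since Trinity
        risk = cumulative_risk(annual_p, years)
        print(f"annual p = {annual_p:.1%}, horizon = {years:>2} years, "
              f"cumulative risk = {risk:.1%}")
```

Even at a hypothetical 1 percent per year, the cumulative probability over the 80 years since Trinity works out to roughly 55 percent; at 0.1 percent per year it is still about 8 percent. That is the sense in which small annual risks, left in place long enough, become large ones.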
The most dangerous feature of nuclear arsenals is not their destructive yield but their speed. Extending decision time, removing weapons from hair-trigger alert, and ultimately eliminating nuclear arsenals altogether remain not merely strategic preferences but urgent public health imperatives.
Endnotes
John H. Rubel, Doomsday Delayed: USAF Strategic Weapons Doctrine and SIOP-62, 1959–1962 (Lanham, MD: Hamilton Books, 2008). Free download from the Smithsonian: https://www.si.edu/media/NASM/NASM-DoomsdayDelayed.pdf
Charles Perrow, Normal Accidents: Living with High-Risk Technologies (Princeton, NJ: Princeton University Press, 1999).
Charles Perrow, The Next Catastrophe: Reducing Our Vulnerabilities to Natural, Industrial, and Terrorist Disasters (Princeton, NJ: Princeton University Press, 2011). This follow-up to his 1999 book discusses ways to reduce the tight coupling of such systems and thereby the inevitability of accidents.
Annie Jacobsen, Nuclear War: A Scenario (Dutton, 2024), chap. 1.
Rubel, Doomsday Delayed, preface.
Union of Concerned Scientists, “Close Calls with Nuclear Weapons” (Fact Sheet): https://www.ucs.org/sites/default/files/attach/2015/04/Close%2520Calls%2520with%2520Nuclear%2520Weapons.pdf
“1983 Soviet nuclear false alarm incident,” Wikipedia.
TOI World Desk, “The first world leader to activate the ‘nuclear briefcase’: How a Norwegian research rocket nearly triggered nuclear war,” Times of India, Feb. 19, 2026.
Lachlan Forrow et al., “Accidental Nuclear War — A Post–Cold War Assessment,” New England Journal of Medicine 338, no. 18 (April 30, 1998): 1326–1331.
9. R. Avenhaus, J. Fichtner, S. J. Brams, and D. M. Kilgour, “The Probability of Nuclear War,” Journal of Peace Research 26, no. 1 (1989): 91–99. https://doi.org/10.1177/0022343389026001009
I really hesitate to even list this as a source. However, it is an interesting attempt to statistically analyze and estimate the likelihood of nuclear war. If you like pages of formulas and probability calculations, you'll love this article. According to Theodore Sorensen, during the Cuban missile crisis Kennedy estimated the chances of a major war between the United States and the Soviet Union at somewhere between 30 and 50 percent; how he arrived at that figure, and how valid it was, appears to come down to nothing more than intuition. The authors apply statistical methods to estimate the likelihood of war under various assumptions (first strike, accident, erosion of trust, avoiding defeat, etc.). It's an interesting exercise, but I'm not sure it gets us anywhere. One element they did not account for was a demented president in the White House. You can get the article (it's short) here. Note that it was written just before the fall of the USSR.
Other Sources
Ellsberg, D. (2017). The doomsday machine: Confessions of a nuclear war planner. Bloomsbury Publishing USA. Memoir by a former nuclear war planner who had access to highly classified plans; he describes Cold War launch authorities, hair-trigger postures, and the multiple occasions on which accident or miscalculation nearly led to catastrophe.
Rhodes, R. (2007). Arsenals of folly. Vintage. Covers late–Cold War nuclear politics, arms racing, and crises around the Reagan–Gorbachev era, with attention to moments when misunderstanding and brinkmanship raised the risk of war.
Schlosser, E. (2014). Command and control. Michael Joseph. Focuses on the 1980 Titan II missile explosion in Damascus, Arkansas, then widens out to a history of nuclear accidents, broken arrows, and close calls inside the U.S. arsenal, arguing that human error and system complexity make “perfect” safety impossible.