Nuclear Weapons & the Always-Never Dilemma

By Joseph T. Stuart, PhD
Assistant Professor of History, University of Mary

On September 18, 1980, Senior Airman David F. Powell dropped the nine-pound socket that he was using with a wrench to unscrew a Titan II missile pressure cap. Although Powell was only 21 years old, he was already an experienced Titan repairman. But it had been a long day and he was tired. As the socket bounced on the work platform, he tried to catch it. Powell missed and the socket fell through the gap between the platform and the missile. It plummeted 70 feet, ricocheted off the thrust mount and pierced the skin of the missile. Rocket fuel began to spray into the silo, which was located near Damascus, Arkansas.

When the fuel—highly flammable Aerozine 50—is mixed with an oxidizer, it explodes with enough power to send the U.S. military’s largest intercontinental ballistic missile as far as 6,000 miles. The Titan II missile carried a W-53 thermonuclear warhead with a yield of nine megatons—about three times the force of all the explosives dropped during World War II, including atomic bombs.

Powell’s dropped socket created an emergency that eventually caused a tremendous explosion, killing one worker and wounding many others. Tons of concrete were blown into the air and the missile’s warhead popped off like a cork, soaring 1,000 feet overhead.

Command and Control: Nuclear Weapons, the Damascus Accident, and the Illusion of Safety, by Eric Schlosser (Penguin Press)

This is how Eric Schlosser opens his book Command and Control: Nuclear Weapons, the Damascus Accident, and the Illusion of Safety, which was a finalist for the 2014 Pulitzer Prize in history. Schlosser is an investigative journalist with academic degrees in British imperial and American history from Princeton and Oxford Universities. In 2001, he gained national attention with the publication of Fast Food Nation: The Dark Side of the All-American Meal.

In Command and Control, Schlosser argues that even the most unlikely of events involving nuclear weapons can happen, despite technological sophistication and the command-and-control system governing them, because humans are fallible. As a result (and as a precaution), we need humility in our use of centralized “expert systems.”

“I wrote the book to tell a story, to honor the heroes of the Cold War and to inform the public,” said Schlosser during a presentation in Bismarck in September 2016, sponsored by the North Dakota Humanities Council, “so that we as a nation can decide to either get rid of nuclear weapons or invest in the necessary precautions to take care of them to minimize the risk of accidental detonation.”

The book illustrates the dangers of technologically sophisticated command-and-control structures with implications for all power structures, not just the U.S. military. Civil governments, businesses and nonprofit organizations also rely on structures connecting the executive authority of a centralized network to the institution’s body.

Contingency

The story begins with the dropped socket, which is an example of contingency, a key historical concept meaning that history is shaped by unforeseen variables of human choice and events outside human control. Little things can make a huge difference, such as a wrong turn by Archduke Franz Ferdinand’s driver that led to his assassination, which in turn set Europe on course toward the Great War. There are many contingent factors involved in any event, but if even one changes, the outcome often changes too. The past does not inevitably lead to the present. History is not governed by fate, but is instead a drama of surprises in which individual human judgment, heroism and fallibility play key roles.

There are many examples of contingency in relation to nuclear accidents in Schlosser’s book. In 1964, at a Minuteman missile site near Ellsworth Air Force Base (AFB) in South Dakota, a worker fixing the security system forgot to bring a fuse puller and used a screwdriver instead. The screwdriver caused an electrical short that fired a rocket atop the missile, knocking the W-56 warhead loose; it bounced against the sides of the silo and landed at the bottom. Fortunately, it did not explode.

In 1965, a high-pressure hydraulic line ruptured, possibly due to a welding mistake, in a missile silo near Searcy, Arkansas, leading to a fire that killed 53 repairmen. The missile survived. It was refurbished and relocated to the silo near Damascus, Arkansas. And it was the same missile on which Powell would drop his socket 15 years later.

In 1968, a B-52 bomber carrying four hydrogen bombs flew a routine mission near Thule Air Base in Greenland. The co-pilot had put foam rubber cushions under the instructor navigator’s seat to make it more comfortable, not realizing this blocked a vent. The cushions caught fire, and the crew could not put the blaze out. They ejected, and the plane crashed, causing a huge conventional explosion that spread plutonium dust over many acres. Fortunately, again, the nuclear weapons did not explode.

On September 15, 1980—three days before the dropped socket incident—someone at Grand Forks AFB forgot to screw a nut onto a fuel strainer in the engine of a B-52 loaded with eight short-range nuclear missiles and four hydrogen bombs. During a drill that day, a fire started as the plane sat on the runway, where strong wind gusts turned it into a giant flamethrower. The crew jumped clear and ran. The fire burned for almost three hours as phone calls to Strategic Air Command (SAC) and Boeing proved fruitless. No one in command knew what to do. Then the Grand Forks fire chief telephoned Tim Griffis, a civilian fire inspector on base with a wife and small children. Griffis knew the inside of B-52s intimately and had helped put out fires on them before. When he got to the plane, paint was peeling as fire hoses tried to keep the bomb bay cool. Griffis ran to the B-52 and jumped inside. He switched on the emergency battery, which activated the fire-suppression handle and cut the fuel supply; the fire went out immediately. Everyone cheered. Griffis received a Civilian Medal of Valor.

Had Griffis not known what to do, or had he acted less quickly, the bombs would certainly have exploded in a conventional detonation and blanketed the city of Grand Forks in plutonium dust. It was also theoretically possible that a nuclear explosion would have been triggered. Had the plane been facing the other direction, the wind would have blown the flames across the fuselage, killing the crew as they tried to escape and quickly setting off a blast. Socket, screwdriver, hydraulic line, foam cushion, missing nut, wind direction—all are examples of unexpected contingencies in the management of America’s nuclear weapons, and all of them contributed to emergencies that the command-and-control network could not defuse. Only in Grand Forks did a man on the ground prevent an explosion.

Launch complex 571-7 is all that remains of the 54 Titan II missile sites that were on alert across the United States from 1963 to 1987. The 103-foot-long Titan II missile could deliver a 9-megaton nuclear warhead to a target 6,000 miles away in 30 minutes.


Photograph taken by Steve Jurvetson.

Command & Control

All military operations rely on “command and control”—an expert system by which a commander exercises authority toward the accomplishment of a mission. The paradox at the heart of nuclear command and control is the “always-never dilemma [that] still plagues us,” said Schlosser in a 2014 interview for the Bulletin of the Atomic Scientists. “The Pentagon wants to deploy nuclear weapons that are always available for immediate use but will never be stolen, used without proper authorization or detonated by accident. The administrative and technical means necessary to ensure the ‘always’ part of the equation often conflict with those necessary to ensure the ‘never’ part.”

SAC was established in 1946 to govern a defense system involving hundreds and then thousands of nuclear warheads. These warheads were meant always to be ready for delivery by bomber or intercontinental ballistic missile (such as the Titan II), but never to explode accidentally or turn up missing. This paradox lies at the heart of nuclear weapons design and, together with the mission of deterrence, made SAC one of history’s most powerful military organizations.

To fulfill this mission, SAC required the best command and control possible. Decisions had to be made quickly, and a breakdown in command could lead to a mistaken nuclear launch or to annihilation. This required advanced technology to make communication fast and computation powerful—needs that helped drive the development of early electronic computers. Thus, Schlosser notes, the computerization of war was an important step toward the computerization of society. The need to automate early warnings of incoming Soviet bombers and to communicate with antiaircraft missiles and fighter-interceptors during a nuclear war created the first computer network, an ancestor of the Internet. By the early 1960s, the Air Force’s demand for self-contained inertial guidance systems inside missiles led to the miniaturization of computers and integrated circuits, which became the building blocks of the modern electronics industry.

The Temptation of Power

Command and control technology had to be protected from possible Soviet nuclear attack. In 1957, SAC moved into new headquarters near Omaha, Nebraska. A command bunker three floors underground could house 800 people for several weeks when necessary. There was a wall, Schlosser described, about 20 feet high and almost 50 yards long, covered in charts, graphs and a map of the world showing the flight paths of SAC bombers. Eventually, this information was projected onto movie screens, giving the underground command center a “hushed, theatrical feel, with rows of airmen sitting at computer terminals beneath the world map and high-ranking officers observing it from a second-story, glass-enclosed balcony.” This was the feel of power: a central headquarters intent on exerting influence over events and people from a distance.

The “fascinating-terrifying” sensation of atomic power was also present when Brig. Gen. Thomas Farrell held the plutonium core of the first atomic bomb to be tested in New Mexico in 1945. It was the size of a softball with the weight of a bowling ball. “So I took this heavy ball in my hand and I felt it growing warm,” Farrell recalled. “I got a sense of its hidden power.” After the explosion, he wrote, “Thirty seconds after [an intense light], the explosion came first, the air blast pressing hard against the people and things, to be followed almost immediately by the strong, sustained, awesome roar which warned of doomsday and made us feel that we puny things were blasphemous to dare tamper with the forces heretofore reserved to The Almighty.”

These experiences of power create two temptations. The first is akin to our deep attraction to magic, which J.R.R. Tolkien defined in On Fairy-Stories as “not an art but a technique; its desire is power in this world, domination of things and wills.” This fascination with the pursuit of power can become an end in itself, as evidenced in Schlosser’s account of how the U.S. built thousands more nuclear weapons than it needed for national defense. We built them because we could and, at some point, stopped asking why.

The second is the temptation to think that one can master situations from afar through hidden forces that can be monopolized, as Sauron did in Tolkien’s The Lord of the Rings. Another example is the natural magic of the Renaissance, when “scientific magicians” strove to understand how material bodies act upon one another through hidden forces such as magnetism. “Knowledge itself is power,” wrote Francis Bacon, who pursued alchemy to unveil the secrets of the universe and place them at the disposal of the centralizing English state.

Today, it is electronic and digital technologies that create unique temptations toward the centralization of power from afar. In 2004, Lt. Comdr. George Franz wrote a paper for the Naval War College, titled “Decentralized Command and Control of High-Tech Forces,” arguing that improvements in information and communications technology entice commanders to exercise ever more centralized command and control. For example, the helicopter command posts of the Vietnam War gave commanders the illusion of having perfect knowledge of the situation on the ground. This technology led them to directly influence fighting on the battlefield rather than letting subordinates do their jobs. President Lyndon Johnson and Defense Secretary Robert McNamara personally selected targets during their Tuesday lunch meetings from almost 9,000 miles away.

Similarly, Gen. Lloyd Leavitt, Jr., Vice Commander-in-Chief of SAC in 1980, monopolized decision-making authority on the night of the Damascus incident from SAC headquarters 500 miles away, even though he had no experience with Titan II missiles. His decisions led to a catastrophe.

Lt. Comdr. Franz pointed out that in the name of efficiency, such centralization can lead to many inefficiencies. Automating processes and centralizing judgment detach leaders from reality, making effective decision-making difficult. This is an ancient truth, which the British experienced prior to the American Revolution as they tried to govern unruly colonists from the center of the Empire in London, 3,000 miles away. In his great speech on conciliation with the American colonies in the House of Commons in 1775, the statesman Edmund Burke said, “I have in general no very exalted opinion of the virtue of paper government; nor of any politics in which the plan is to be wholly separated from the execution.” The naïve belief in control from afar collapsed only a month after Burke’s speech as the battles of Lexington and Concord ignited the American Revolution.

They constantly try to escape
From the darkness outside and within
By dreaming of systems so perfect that no one will need to be good.
But the man that is will shadow
The man that pretends to be.

T.S. Eliot, chorus from The Rock

Too much trust in centralized expert systems can be very dangerous in a nuclear context. On November 9, 1979, the computers at the Colorado headquarters of the North American Air Defense Command (NORAD, created in 1958) indicated that the U.S. was under attack by Soviet missiles and that warheads would begin to hit U.S. cities in five or six minutes. Bomber crews ran to their planes and fighter interceptors took off. As the minutes passed, it became clear that the U.S. was not actually under attack. A technician had put the wrong tape into NORAD’s computers. The tape was part of a war-game training exercise that simulated a Soviet attack.

Centralized command-and-control systems tempt us to distrust human judgment and trust machines. Soon after the socket fell from Powell’s hands, the deputy commander of the Damascus silo grabbed the fire checklist as his console lit up, indicating a problem. A technician started going through checklists, too. But the problem was: Which checklist applies? What is the actual nature of the problem at hand? Here the human component comes clearly into view. Machines can solve many problems, but only humans can determine what kind of problem they face. That sort of judgment, Matthew Crawford remarks in Shop Class as Soulcraft, demands a concern for the truth that depends on personal knowledge rooted in experience.

Rather than depend on experienced personnel at the scene, SAC’s command in Omaha tried to control the situation from headquarters. Gen. Leavitt took over the approval process for each step of the checklist being developed in response to the leaking fuel. Col. Ben Scallorn, who had worked on Titan II missiles for many years, objected to Gen. Leavitt’s plan for venting the Stage-1 fuel tank. “Scallorn,” Leavitt replied, “just be quiet and stop telling people what to do. We’re trying to figure this thing out.” As Schlosser noted, “It was an awkward moment. Nobody liked to hear one of SAC’s leading Titan II experts being told to shut up.”

Meanwhile, Jeff Kennedy, perhaps the best missile mechanic at nearby Little Rock AFB, proposed opening the silo door, which would vent the vapor, lower the temperature and relieve the pressure on the Stage-1 tank. The idea was relayed up the chain of command, and Kennedy was told to wait and do nothing without Gen. Leavitt’s approval. Kennedy was disgusted and stymied, unable to act quickly and decisively as Tim Griffis had at Grand Forks AFB a few days earlier.

Nearly six hours after Kennedy made his proposal, the explosion came, hurling Kennedy 150 feet through the air; he landed against the silo’s fence amid burning debris, alive but severely injured. Tragically, an airman was killed. The silo was completely destroyed and, thankfully once again, the warhead’s safety features prevented a nuclear detonation.

The real heroes in Damascus weren’t distant authorities but the men on the ground trying to save the missile, rescue the wounded after the explosion and find the missing warhead, which they did a few hours later in a nearby ditch.

Losing Control

It is the human element that holds complex command-and-control systems together, especially in emergencies. This was illustrated dramatically in 2009 when Capt. Chesley Sullenberger, a former fighter pilot, landed his passenger plane safely in the Hudson River across from midtown Manhattan after a flock of Canada geese disabled both engines. All 155 people on board survived the controlled ditching, which the Guild of Air Pilots and Air Navigators called a “unique aviation achievement.” The pilot’s judgment, rooted in considerable experience, saved the day.

Centralized authority worked well in Operation Neptune Spear, the 2011 raid that killed Osama bin Laden, as Schlosser points out. Army, Navy, Air Force and CIA commanders communicated secretly in real time. But outcomes are radically different in an unforeseen emergency, such as Damascus, or when we are the target of a covert operation. The 9/11 Commission Report chronicled the confusion and miscommunication at the very highest levels of government during the terrorist attacks. Schlosser notes: “A command-and-control system designed to operate during a surprise attack that could involve thousands of nuclear weapons—and would require urgent presidential decisions within minutes—proved incapable of handling an attack by four hijacked airplanes.”

A culture of denial and secrecy made the situation in Damascus worse. The Air Force kept insisting everything was under control, but a white cloud rising from the silo told local sheriff Gus Anglin that the opposite was true. SAC refused to confer with state and local officials, leaving everyone wondering who was in charge. Anglin ordered an evacuation of everyone within a mile of the missile silo. His initiative served to check the decisions made, and not made, at the top of SAC’s bureaucracy.

The ability to gather and process massive amounts of data gives those at headquarters a sense of information superiority. This hidden or “occult” knowledge creates the impression of mastering a local situation and tempts leaders to exert tighter authority from afar. But Lt. Comdr. Franz pointed out that the sheer volume of raw data might actually reduce understanding and delay decision-making, as with the Damascus accident. Tighter control from the top means subordinates have fewer opportunities to make decisions and gain experience, which often demoralizes them. “The commander who reaches down to exercise command and control at subordinate levels,” Franz writes, “will lose the support of his men and women.”

The Ronald Reagan Minuteman Missile State Historic Site near Cooperstown, North Dakota, is one of only three once top-secret Minuteman sites in the nation open to the public. The site was part of the 321st Missile Wing, a cluster of 165 intercontinental ballistic missile launch sites dispersed over a 6,500-square-mile area in eastern North Dakota and supported by Grand Forks AFB. The 321st was decommissioned in the late 1990s. Today, 150 Minuteman III intercontinental ballistic missiles remain fully operational and ready to launch in North Dakota, dispersed around and supported by Minot AFB. Mark Sundlov, now a director at the North Dakota Heritage Center, served for several years as a Missile Combat Crew commander and helped open the Cooperstown site in 2009. Sundlov made many helpful comments on this essay. The missile site is open to the public daily from Memorial Day to Labor Day. For more information, visit http://history.nd.gov/historicsites/minutemanmissile/index.html

Mission-Tactics & Subsidiarity

Perhaps the ancient Catholic principle of subsidiarity provides guidance. Subsidiarity means that as a matter of justice, lower levels of government or any organization should have the freedom to pursue their own ends. However, subsidiarity also implies that higher levels have the authority to lead, in order to provide an overall framework within which lower levels can act safely and effectively. In a military context, this means that decentralized command and control, which is termed “mission-tactics” or “mission command,” allows subordinate commanders to freely respond to situations developing on the ground, as long as they understand the commander’s overall intent.

For effective mission-tactics, top leadership must provide a clear strategic objective. When that vision is lacking, mission-tactics don’t work, as in Afghanistan. In The Killing of Osama bin Laden, Seymour Hersh quoted a special operations consultant saying, “It’s all about tactics and nobody, Republican or Democrat, has advanced a strategic vision. The special ops guys are simply carrying out orders, like a dog eager to get off the leash and run into the woods—and not thinking about where it is going. We’ve had an abject failure of military and political leadership.” The same was true in Vietnam where the U.S. military achieved almost 100-percent tactical success, while the entire intervention was doomed by strategic failure.

Humility & Humanity

Schlosser’s book reminds readers of the need for humility and humanity. In contrast to SAC’s technological sophistication, communications on the ground during that night in Damascus were rudimentary. Mechanics used hand signals to communicate as they approached the silo just before the explosion. Then they spoke by radio to the team chief who relayed information to the colonel standing next to him near the silo’s gate. In turn, the colonel spoke by phone to another officer at the command post in Little Rock, who then talked with SAC headquarters in Omaha. “[H]opefully … nothing would be garbled or misunderstood,” Schlosser comments wryly. Technological sophistication could not save the day. In the end, humble humans and their judgment had to decide and act.

Dangerous systems certainly require standardized procedures and judicious central control during normal operation. But in an emergency, “those closest to the system, the operators, have to be able to take independent and sometimes quite creative action,” Schlosser concludes, quoting sociologist Charles B. Perrow. Subsidiarity in mission-tactics, which strikes a balance between centralized and decentralized decision-making, is needed to protect institutions from the over-centralization so tempting in the information age. Humility and humanity—and the humanities in education to provide proper leadership formation—protect us from hyper-rationalism that cuts us off from the real world of contingency, unpredictability and mystery.