Could a nuclear explosion set Earth’s atmosphere on fire?

Feb 16, 2024

A pair of nuclear astrophysicists revisit this question, exploring how the risk was assessed back when nuclear physics was still in its infancy.

“Are you saying there’s a chance that when we push that button, we destroy the world?” General Leslie Groves asks J. Robert Oppenheimer in the film Oppenheimer. Groves is referring to the “non-zero” chance that the atomic bomb about to be detonated as part of the Trinity Test would cause an endless chain reaction, setting Earth’s atmosphere ablaze.

Two nuclear astrophysicists, Michael Wiescher of the University of Notre Dame and Karlheinz Langanke of GSI Helmholtzzentrum für Schwerionenforschung in Darmstadt, Germany, were intrigued by this question.

In a recent paper, they explored how Oppenheimer and his colleagues assessed the risk of this outcome back when nuclear physics was in its infancy, as well as how their experience and knowledge gained from the Manhattan Project then flowed into other fields, leading to unexpected benefits in nuclear astrophysics and radiocarbon dating.  

Inventing the atomic bomb

The United States' effort to research and design the atomic bomb, known as the Manhattan Project, was spearheaded by Oppenheimer, a theoretical physicist, at Los Alamos Laboratory in New Mexico during World War II. The project was a massive undertaking, involving many of the biggest names in physics, and took three years to complete.

“The Manhattan Project was an enormous investment of about $2 billion [USD], which translates into about $60 billion of today’s dollars,” Wiescher stated. “The project employed more than 125,000 scientists, technicians, administrators, and military personnel. The biggest cost item was not the physics but the facilities to generate the bomb fuel.”

Atomic bombs rely on nuclear fission, the splitting of the nucleus of a radioactive isotope (an unstable form of an element) into two or more smaller nuclei when it is struck by subatomic particles called neutrons. The fuel for the bomb dropped on Hiroshima by the United States in 1945 was uranium-235, while plutonium-239 fueled the bomb dropped three days later on Nagasaki.
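The "chain" in chain reaction is simple geometric growth: each fission frees neutrons that can trigger further fissions. A minimal sketch, using an illustrative multiplication factor rather than any figure from the paper:

```python
# Toy sketch of a fission chain reaction (illustrative numbers only):
# if each fission triggers k further fissions on average, the count
# grows geometrically from one generation to the next.
k = 2.0        # illustrative multiplication factor; real values depend on design
fissions = 1.0
for generation in range(80):   # each generation lasts on the order of 10 ns
    fissions *= k
print(f"Fissions after 80 generations: {fissions:.2e}")  # ~1.2e+24
```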

“The fission releases large amounts of energy, comparable to about 400,000 tons of TNT for the biggest bombs,” Wiescher explained.
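For a sense of scale, here is a back-of-the-envelope sketch using standard textbook constants (none of these numbers come from Wiescher and Langanke's paper) of how little uranium has to fission to release that much energy:

```python
# How much uranium-235 must fission to release 400,000 tons of TNT?
# (Standard textbook constants; illustrative only.)
MEV_TO_J = 1.602e-13        # joules per MeV
E_PER_FISSION_MEV = 200.0   # typical energy released per U-235 fission
TNT_TON_J = 4.184e9         # joules per ton of TNT
AVOGADRO = 6.022e23         # atoms per mole
U235_MOLAR_MASS_G = 235.0   # grams per mole

yield_j = 400_000 * TNT_TON_J                          # ~1.7e15 J
n_fissions = yield_j / (E_PER_FISSION_MEV * MEV_TO_J)  # ~5e25 fissions
mass_kg = n_fissions / AVOGADRO * U235_MOLAR_MASS_G / 1000.0
print(f"U-235 fissioned: {mass_kg:.0f} kg")            # ~20 kg
```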

Oppenheimer had one major scientific concern during the bombs’ development at Los Alamos: theoretically, rapid local heating of the air in the explosion could initiate fusion reactions (the opposite of fission) in the atmosphere if radiative cooling of the air did not outpace the heating. The fusion of hydrogen nuclei is what sustains stars.
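The logic of the concern can be caricatured in a toy model, sketched below with invented constants (this is purely illustrative and does not reproduce the actual Los Alamos calculation): a runaway requires the fusion heating rate to exceed the radiative cooling rate at the same temperature, but fusion is exponentially suppressed by the Coulomb barrier while radiative losses climb roughly as the fourth power of temperature.

```python
import math

def heating_rate(T):
    """Schematic fusion heating with Gamow-like suppression, exp(-b / T^(1/3))."""
    b = 50.0                       # invented barrier constant (arbitrary units)
    return math.exp(-b / T ** (1.0 / 3.0))

def cooling_rate(T):
    """Schematic radiative cooling, proportional to T^4."""
    a = 1e-9                       # invented constant (arbitrary units)
    return a * T ** 4

for T in (1.0, 10.0, 100.0, 1000.0):   # arbitrary temperature units
    print(T, "runaway" if heating_rate(T) > cooling_rate(T) else "no runaway")
```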

Among the isotopes in air that could potentially undergo fusion reactions, Oppenheimer was worried about nitrogen-14, the most abundant isotope in the atmosphere. “The fusion of two nitrogen-14 nuclei and a hydrogen nucleus seemed most likely at the time,” Wiescher said.

Ignition of the atmosphere would be most likely within the first few milliseconds of the bomb’s explosion, when the fireball was opaque and the temperature was still rising. To gauge the probability of this, Oppenheimer consulted Arthur H. Compton, a renowned physicist and 1927 Nobel Prize winner.

Compton assured Oppenheimer that such an extreme event was unlikely. His opinion was echoed by Hans Bethe, then head of the Theoretical Division at Los Alamos: the temperature and pressure conditions in the atmosphere would simply never be high enough. As predicted, the plutonium-239 bomb did not ignite the atmosphere during the Trinity test in the New Mexico desert, a few weeks before the bombs were unleashed on Japan.

According to a report written by Los Alamos researchers in 1946, “It is impossible to reach such temperatures unless fission bombs or thermonuclear bombs are used which greatly exceed the bombs now under consideration.”

The hydrogen bomb

More powerful nuclear weapons were invented after the atomic bomb. Thermonuclear bombs, also known as hydrogen bombs, rely on the fusion of tritium and deuterium, hydrogen’s heavier isotopes. The fusion reaction requires high temperatures generated by a fission bomb as a trigger. 
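For a rough sense of why fusion fuel is so potent, the sketch below uses the textbook 17.6 MeV released per deuterium-tritium reaction (an illustrative calculation, not a figure from the paper):

```python
# Energy released per kilogram of deuterium-tritium fuel, in TNT equivalent.
# (Standard textbook constants; illustrative only.)
MEV_TO_J = 1.602e-13          # joules per MeV
E_DT_MEV = 17.6               # energy per D + T -> He-4 + n reaction
AVOGADRO = 6.022e23           # particles per mole
FUEL_MOLAR_MASS_G = 5.0       # one deuteron (A=2) plus one triton (A=3)
TNT_TON_J = 4.184e9           # joules per ton of TNT

reactions_per_kg = 1000.0 / FUEL_MOLAR_MASS_G * AVOGADRO
tnt_tons = reactions_per_kg * E_DT_MEV * MEV_TO_J / TNT_TON_J
print(f"~{tnt_tons:,.0f} tons of TNT per kg of D-T fuel")  # ~81,000
```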

“A fusion bomb releases much more energy than a fission bomb, about 20 million tons of TNT,” Wiescher stated. “The biggest fusion bomb ever exploded was the Soviet Tsar Bomba, detonated in 1961 with a yield of 52 million tons of TNT, which single-handedly doubled the radiation level in our atmosphere.”

Although the bomb, which was detonated only once, as a test and show of power by the Soviet Union, was reportedly 1,500 times more powerful than the Hiroshima and Nagasaki bombs combined, it still did not ignite the atmosphere.
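A quick arithmetic check with the commonly cited yield estimates (not figures from the paper) shows that claim is in the right ballpark:

```python
# Consistency check of the "~1,500 times" figure using commonly cited yields.
hiroshima_kt = 15        # kilotons of TNT (common estimate)
nagasaki_kt = 21
tsar_bomba_kt = 52_000   # ~52 megatons

ratio = tsar_bomba_kt / (hiroshima_kt + nagasaki_kt)
print(f"Tsar Bomba / (Hiroshima + Nagasaki): ~{ratio:.0f}x")  # ~1444x
```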

“The limits were luckily never tested, but in general, I would say, the density of the atmosphere is too low,” Wiescher responded when asked whether a bomb powerful enough to ignite Earth’s atmosphere could ever be built.

“If one were to substantially increase the atmospheric density to Venus values (about 100 times denser than Earth’s), one would still not reach the density of water, and the underwater test program did not ignite the oceans, as some people had predicted,” he elaborated.

Unexpected benefits

Although Oppenheimer and his colleagues correctly assumed the implausibility of atmospheric ignition, they failed to consider one aspect. “They overlooked the neutron release of the bomb, which caused an enormous neutron flux triggering the ‘nitrogen-14 + neutron’ reaction, producing long-lived radioactive carbon-14, which is now safely embedded in the biosphere,” Wiescher explained.

This short-lived spike in atmospheric carbon-14 has been called the “silver lining of the mushroom cloud” owing to its usefulness to archaeologists.

Living humans, animals, and plants continuously absorb carbon-14, and that intake stops at death. Because the isotope then decays at a predictable rate, the age of organic remains up to about 60,000 years old can be determined by measuring how much carbon-14 is left in them.
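The dating math itself is a one-liner. Here is a minimal sketch, assuming the standard 5,730-year half-life (the code and numbers are illustrative, not from the paper):

```python
import math

C14_HALF_LIFE_YEARS = 5730.0   # commonly used half-life of carbon-14

def radiocarbon_age(fraction_remaining: float) -> float:
    """Age in years of a sample retaining this fraction of its original C-14,
    from the decay law N/N0 = (1/2)**(t / half-life)."""
    return C14_HALF_LIFE_YEARS * math.log(fraction_remaining) / math.log(0.5)

print(radiocarbon_age(0.5))     # one half-life: 5730 years
print(radiocarbon_age(0.25))    # two half-lives: 11460 years
print(radiocarbon_age(0.001))   # ~57,000 years, near the method's practical limit
```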

Aside from the boon to carbon-14 dating, Wiescher says that the fear of atmospheric ignition helped advance nuclear astrophysics. “The fear of atmospheric fusion led to the measurement and better understanding of fusion reactions in stars that determine the stellar evolution in the final stages of a star,” he said. “The observation of superheavy elements in bomb debris led to the prediction of neutron-induced reaction sequences such as the s- and r-process, which are responsible for the production of all heavy elements in our universe.”

Although stars are natural fusion reactors, conditions on Earth are far milder. “In stars, fusion reactions ignite at temperatures of hundreds of millions of degrees and densities of about 10,000 to 100,000 times that of the atmosphere, which are much higher than anything on Earth,” he elaborated.

“So, I don’t think there is much danger, but during the early test program, people just did not know how reliable their models and assumptions were. In hindsight, it turns out they were pretty good.”

Reference: Michael Wiescher and Karlheinz Langanke, Nuclear astrophysicists at war, Natural Sciences (2024). DOI: 10.1002/ntls.20230023

Feature image credit: Antek on Unsplash
