Explosive testing
Nuclear weapons are in large part designed by trial and error. The trial often involves the test explosion of a prototype.
In a nuclear explosion, a large number of discrete events, with various probabilities, aggregate into short-lived, chaotic energy flows inside the device casing. Complex mathematical models are required to approximate the processes, and in the 1950s there were no computers powerful enough to run them properly. Even today's computers and their codes are not fully adequate.
It was easy enough to design reliable weapons for the stockpile. If the prototype worked, it could be weaponized and mass produced.
It was much more difficult to understand how it worked or why it failed. Designers gathered as much data as possible during the explosion, before the device destroyed itself, and used the data to calibrate their models, often by inserting fudge factors into equations to make the simulations match experimental results. They also analyzed the weapon debris in fallout to determine how much of the potential nuclear reaction had actually taken place.
Light pipes
An important tool for test analysis was the diagnostic light pipe. A probe inside a test device could transmit information by heating a plate of metal to incandescence, an event that could be recorded at the far end of a long, very straight pipe.
The picture below shows the Shrimp device, detonated on March 1, 1954 at Bikini, as the Castle Bravo test. Its 15-megaton explosion was the largest ever by the United States. The silhouette of a man is shown for scale. The device is supported from below, at the ends. The pipes going into the shot cab ceiling, which appear to be supports, are diagnostic light pipes. The eight pipes at the right end (1) sent information about the detonation of the primary. Two in the middle (2) marked the time when x-radiation from the primary reached the radiation channel around the secondary. The last two pipes (3) noted the time radiation reached the far end of the radiation channel, the difference between (2) and (3) being the radiation transit time for the channel.
From the shot cab, the pipes turned horizontal and traveled 7500 ft (2.3 km), along a causeway built on the Bikini reef, to a remote-controlled data collection bunker on Namu Island.
While x-rays would normally travel at the speed of light through a low-density material like the plastic foam channel filler between (2) and (3), the intensity of radiation from the exploding primary created a relatively opaque radiation front in the channel filler which acted like a slow-moving logjam, retarding the passage of radiant energy. Behind this moving front was a fully ionized, low-Z (low atomic number) plasma heated to 20,000 °C, soaking up energy like a black body and eventually driving the implosion of the secondary.
The radiation transit time, on the order of half a microsecond, is the time it takes the entire radiation channel to reach thermal equilibrium as the radiation front moves down its length. The implosion of the secondary is based on the temperature difference between the hot channel and the cool interior of the secondary. The timing of this implosion is important because the interior of the secondary is subject to neutron preheat.
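A rough sense of how much the radiation front slows things down comes from comparing that half-microsecond figure with unimpeded light travel over the same distance. The channel length used below is an assumed, illustrative number, not a published dimension of the Shrimp device:

```latex
% Illustrative comparison only; L = 3 m is an assumed channel length.
\begin{aligned}
t_{\text{free}} &= \frac{L}{c} \approx \frac{3\ \mathrm{m}}{3\times 10^{8}\ \mathrm{m/s}} = 10\ \mathrm{ns} \\
t_{\text{transit}} &\approx 0.5\ \mu\mathrm{s} = 500\ \mathrm{ns} \\
t_{\text{transit}} / t_{\text{free}} &\approx 50
\end{aligned}
```

On that assumption, the opaque front slows the effective flow of radiant energy down the channel by a factor of order fifty compared with free propagation.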
While the radiation channel is heating and starting the implosion, neutrons from the primary catch up with the x-rays, penetrate into the secondary and start breeding tritium with the third reaction noted in the first section above. This Li-6 + n reaction is exothermic, producing 5 MeV per event. The spark plug is not yet compressed and thus is not critical, so there won't be significant fission or fusion. But if enough neutrons arrive before implosion of the secondary is complete, the crucial temperature difference will be degraded. This is the reported cause of failure for Livermore's first thermonuclear design, the Morgenstern device, tested as Castle Koon, April 7, 1954.
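For reference, the tritium-breeding reaction in question is neutron capture on lithium-6; the roughly 5 MeV quoted above is the kinetic energy shared by the two product nuclei:

```latex
{}^{6}\mathrm{Li} + n \;\longrightarrow\; {}^{4}\mathrm{He} + {}^{3}\mathrm{H} + \text{about } 5\ \mathrm{MeV}
```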
These timing issues are measured by light-pipe data. The mathematical simulations which they calibrate are called radiation flow hydrodynamics codes, or channel codes. They are used to predict the effect of future design modifications.
It is not clear from the public record how successful the Shrimp light pipes were. The data bunker was far enough back to remain outside the mile-wide crater, but the 15-megaton blast, two and a half times greater than expected, breached the bunker by blowing its 20-ton door off the hinges and across the inside of the bunker. (The nearest people were twenty miles (32 km) farther away, in a bunker that survived intact.)
Fallout analysis
The most interesting data from Castle Bravo came from radio-chemical analysis of weapon debris in fallout. Because of a shortage of enriched lithium-6, 60% of the lithium in the Shrimp secondary was ordinary lithium-7, which doesn't breed tritium as easily as lithium-6 does. But it does breed lithium-6 as the product of an (n, 2n) reaction (one neutron in, two neutrons out), a known fact, but with unknown probability. The probability turned out to be high.
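Written out, the chain described here runs in two steps: the (n, 2n) conversion of lithium-7, which requires a fast neutron, followed by the usual tritium-breeding capture on the resulting lithium-6:

```latex
\begin{aligned}
{}^{7}\mathrm{Li} + n &\longrightarrow {}^{6}\mathrm{Li} + 2n &&\text{(the (n, 2n) step)} \\
{}^{6}\mathrm{Li} + n &\longrightarrow {}^{4}\mathrm{He} + {}^{3}\mathrm{H} &&\text{(tritium breeding)}
\end{aligned}
```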
Fallout analysis revealed to designers that, with the (n, 2n) reaction, the Shrimp secondary effectively had two and a half times as much lithium-6 as expected. The tritium, the fusion yield, the neutrons, and the fission yield were all increased accordingly.
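The factor of two and a half is consistent with the composition and yield figures quoted above, under the simplifying assumption (made here purely for illustration) that yield scales roughly with the effective lithium-6 inventory:

```latex
\frac{\text{effective Li-6 fraction}}{\text{expected Li-6 fraction}}
\approx \frac{0.40 + 0.60}{0.40} = 2.5,
\qquad 2.5 \times 6\ \mathrm{Mt} \approx 15\ \mathrm{Mt}
```

The 6-megaton figure is simply the pre-shot expectation implied by the statement above that the 15-megaton blast was two and a half times greater than expected.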
As noted above, Bravo's fallout analysis also told the outside world, for the first time, that thermonuclear bombs are more fission devices than fusion devices. A Japanese fishing boat, the Lucky Dragon, sailed home with enough fallout on its decks to allow scientists in Japan and elsewhere to determine, and announce, that most of the fallout had come from the fission of U-238 by fusion-produced 14 MeV neutrons.
Underground testing
The global alarm over radioactive fallout, which began with the Castle Bravo event, eventually drove nuclear testing underground. The last U.S. above-ground test took place at Johnston Island on November 4, 1962. During the next three decades, until September 23, 1992, the U.S. conducted an average of 2.4 underground nuclear explosions per month, all but a few at the Nevada Test Site (NTS) northwest of Las Vegas.
The Yucca Flat section of the NTS is covered with subsidence craters resulting from the collapse of terrain over radioactive underground caverns created by nuclear explosions (see photo).
After the 1974 Threshold Test Ban Treaty (TTBT), which limited underground explosions to 150 kilotons or less, warheads like the half-megaton W88 had to be tested at less than full yield. Since the primary must be detonated at full yield in order to generate data about the implosion of the secondary, the reduction in yield had to come from the secondary. Replacing much of the lithium-6 deuteride fusion fuel with lithium-7 hydride limited the deuterium available for fusion, and thus the overall yield, without changing the dynamics of the implosion. The functioning of the device could be evaluated using light pipes, other sensing devices, and analysis of trapped weapon debris. The full yield of the stockpiled weapon could be calculated by extrapolation.
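A minimal sketch of what such an extrapolation might look like, assuming, purely for illustration (the actual procedure is not public), that the secondary's contribution scales roughly linearly with the fraction f of the full fusion-fuel loading retained in the test device, and that the primary's contribution Y_p is known:

```latex
% Hypothetical scaling; f, Y_p and Y_test are illustrative symbols, not published quantities.
Y_{\text{full}} \;\approx\; Y_p + \frac{Y_{\text{test}} - Y_p}{f}
```

With hypothetical numbers, a test yield just under the 150-kiloton limit and a fuel fraction of around a quarter land in the neighborhood of the W88's half-megaton stockpile yield; the real calculation, calibrated against light-pipe and debris data, is of course far more detailed.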
References:
- DoE Fact Sheet: Reliable Replacement Warhead Program.
- Sybil Francis, Warhead Politics: Livermore and the Competitive System of Nuclear Warhead Design, UCRL-LR-124754, June 1995, Ph.D. dissertation, Massachusetts Institute of Technology, available from National Technical Information Service. This 233-page thesis was written by a weapons-lab outsider for public distribution. The author had access to all the classified information at Livermore that was relevant to her research on warhead design; consequently, she was required to use non-descriptive code words for certain innovations.
- Walter Goad, Declaration for the Wen Ho Lee case, May 17, 2000. Goad began thermonuclear weapon design work at Los Alamos in 1950. In his Declaration, he mentions "basic scientific problems of computability which cannot be solved by more computing power alone. These are typified by the problem of long range predictions of weather and climate, and extend to predictions of nuclear weapons behavior. This accounts for the fact that, after the enormous investment of effort over many years, weapons codes can still not be relied on for significantly new designs."
- Chuck Hansen, The Swords of Armageddon, Volume IV, pp. 211-212, 284.
- The public literature mentions three different force mechanisms for this implosion: radiation pressure, plasma pressure, and explosive ablation of the outer surface of the secondary pusher. All three forces are present, and the relative contribution of each is one of the things the computer simulations try to explain. See Teller-Ulam design.
- Dr. John C. Clark, as told to Robert Cahn, "We Were Trapped by Radioactive Fallout," The Saturday Evening Post, July 20, 1957, pp. 17-19, 69-71.
- Richard Rhodes, Dark Sun: The Making of the Hydrogen Bomb, Simon and Schuster, 1995, p. 541.
- Chuck Hansen, The Swords of Armageddon, Volume VII, pp. 396-397.
- Sybil Francis, Warhead Politics, pp. 141, 160.