Book Deconstruction: Atomic Accidents

by Gattsuru

It takes an unusual author to write a book about nuclear disasters and then begin with two entirely non-nuclear incidents. It takes a better one to pull it off, and James Mahaffey manages it in Atomic Accidents. While subtitled "from the Ozark Mountains to Fukushima" (ugh), it begins instead with Sayano-Shushenskaya and a handful of steam locomotive accidents (and not-accidents), described in nearly the same detail that CSI would apply to an episode's final reveal.

That ends up being for a reason, and not simply as a teaser, or as a nuclear power cheerleader trying to downplay the Other Famous Russian Disaster. The anecdotes and history introduce different challenges that plague modern power plants just as heavily as, or more heavily than, the radioactive materials themselves. Hydraulic dams aren't the only demonstrations of the power involved in macroscale engineering, any more than steam locomotives are the sole variant of high-pressure steam, but they provide entertaining (literally: some locomotive accidents were planned shows that had unexpected casualties) and thematically appropriate introductions. Mahaffey -- likely correctly -- recognizes how tempted readers might be to focus on the atomic side even for accidents where radiation ended up being an afterthought.

Of course, the first genuine chapter is named "We Discover Fire", and the Promethean metaphor isn't undeserved. The special risks of radiation become apparent quickly. After Edison's early fluorescence experiments gave his lead assistant an eventually lethal dose of X-rays:

Edison was shaken, and he dropped all work on the fluorescent lamp. "I am afraid of radium and polonium too," he commented, "and I don't want to monkey with them."

His rival, the archetype of the mad scientist himself, famous for toying with hair-raising AC voltages and invisible rays, had given up the skeletal ghost earlier:

In his final article of 1896, published on December 1, he [Tesla] advised staying away from x-rays, "... so it may not happen to somebody else. There are real dangers of Röntgen radiation."

But where genius recognized the terrible risk (or, at least, applied it to themselves, as the Curies did, or to a volunteering employee, as Edison did), before finding controlled methods usable for medicinal or scientific value, the scam artists set up shop and went wild. Here we see the Radium supplements, the dangerously ignorant or willfully blind upper-class victims, the snake oil salesmen, and the harassed or exploited workers. Both libertarians and leftists are likely to find the political or social analysis here a little weak, if only because US Radium was so cartoonishly vile, but there's enough to give a fairly deep explanation of how, and at whose expense, the various dangers of radioactivity were discovered and rediscovered.

Next is the inevitable fireball, either the literal blooms of the bombs themselves or the criticality accidents that came before or after them. Where others like Nelson's The Age of Radiance gave a good people-focused breakdown of how Heisenberg got onto a Nazi nuclear bomb project and how it fell from grace at a political level, Atomic Accidents emphasizes the technical aspects and explains how Heisenberg's first 'bomb' was an overpressurized reactor vessel and some bad experimental practice. Where most authors bog down in the question of whether it was correct to drop the bombs, Mahaffey brings forth the last-minute tweaks to get them to work and the after-the-last-minute attempt from Japan to stop the second bomb. It's not a terribly good strategic, political, or social analysis of the Manhattan Project (and anti-nuclear advocates are likely to be frustrated that it doesn't really give much play to the victims of Hiroshima or Nagasaki), or even the best technical breakdown of atomic weapons strategy, but it does a great job of explaining and illustrating the weight of the matter: how physical effects go from scarcely measurable to instantly lethal to city-leveling by sheer numbers, or how fissile materials go from boringly safe to critical mass not just by enrichment or isotope, but also by container, surrounding material, experimenters, and local characteristics.

((It's not entirely on the things side of people versus things: there's a bizarre Unsong-esque moment where a madman predicts the 1940s establishment of Black Oak Ridge in 1902.))

The dangers established at great cost, the remainder of the book explores the variety of new and surprising ways engineers, airmen, soldiers, scientists, plumbers, piles of ammonium nitrate, pump bearings, and (in one case detected before anything caught fire) a small brown rat could run headlong into those problems again. The breakdown is sometimes meandering -- rather than separating incidents into chapters chronologically or geographically, Atomic Accidents separates by project, zipping across years and continents in a single page where there's enough technical similarity -- but always eminently readable.

Mahaffey's greatest strength is his ability to bring a surfeit of otherwise unavailable detail to even the least-known of his examples. The Wood River Junction criticality accident, for example, normally receives only a footnote as the only US civilian death from acute radiation poisoning, if it's mentioned at all. Atomic Accidents gives details I'd never heard hinted at before, across several pages, giving a full breakdown focused not on the Great Men surrounding the topic, nor on the driest recitation of bare facts, but on a narrative description of events as they were experienced by the people in the building. It's not the only such case, or even an exception. In no few cases, he gives compelling arguments that the official story is incomplete or misleading, and provides more complete analysis and citations for the true underlying causes. This sometimes borders on bragging, such as his own experiences installing safety equipment or debunking cold fusion, but those tidbits are sprinkled in appropriately enough, and often against his own interest, that I found them good spice to the work.

One shocking point is the repetition. Some incidents are unique, as you might expect from the cutting edge of a novel field, often operating under extreme time constraints. Just as the Curies could not reasonably prepare themselves against the dangers of an element they first isolated, it's hard to criticize Daghlian too much for bad experimental protocols, even with Frisch's warnings. Others reflected sheer weight of probability fighting to unravel the best-laid plans of men, where one-in-a-million odds encountered hundreds of thousands of hours.

Yet more came about as a result of secrecy or military classification, and while some of those involved parties on opposite sides of a war, not all did. Oppenheimer(!) at one point had to send Feynman(!) to Oak Ridge to find out how long his facility could hold enriched uranium under the normal security classification rules, and Feynman realized it was a miracle the building staff hadn't killed themselves yet, because they were following best practices for types of explosives that uranium wasn't. Uranium isn't the only material to become more dangerous when mixed with water -- the book alone includes a number of anecdotes about metallic sodium and graphite -- but it was genuinely novel for water on the other side of (or inside) a wall feet away to actually start fires.

((Though there are a few situations where wiser heads at least fought against the perils of oversecrecy, sometimes by roundabout ways: it seems like nearly every Western nuclear power got very close to finding the Wigner Effect the very hard way before getting a nice talking-to, even when doing so wasn't strictly 'legal' or 'not treason'. It didn't seem to ever prevent the inevitable graphite fire, but the thought counts. This is the first I'd heard about Rickover smuggling materials into Canada for testing, and it probably saved a good few of his men's lives.))

But most aren't. One nuclear waste reprocessing fatality occurred due to a stirring tank already scheduled to be replaced, precisely because its design had been declared a risk (by Feynman!) a decade before. The Atomic Man had known the exact danger presented by aged and settled resin for thirty years, but his boss knew 'better'. Mahaffey meets the obvious rejoinder with a dry wit and expert timing, leaving a near-dozen pages between BORAX-I testing excursions that would exceed what anyone expected to see in the real world because "nobody would be crazy enough to jerk out the central control by hand" and someone jerking out the central control by hand.

Even many 'firsts' came from old, well-established classes of problems: the first (two, simultaneous) nuclear reactor scrams occurred when a reactor technician jammed a control button down and went to answer a call (not the only time controls like that would show up), while the scientists standing inside the reactor at the time had left their dosimeter badges elsewhere like high school students warned about safety glasses. Two of the most politically consequential Broken Arrow incidents are attributed to human factors issues that were well-known among pilots by the start of WWII, to a point that would be comedic had there not been a fatality.

On the other hand, the scale dizzies in a different direction than one might expect. Mahaffey is no nuclear cheerleader, and he does not downplay Windscale, Three Mile Island, or Andreev Bay, nor shy away from Chernobyl, Kyshtym, or Fukushima. But when a man very nearly imitated Slim Pickens by accident over a residential area, it sent two people to a family doctor with minor injuries. The aftermath of the Castle Bravo test was an international incident and killed several more than any official count ever recorded, yet despite being an above-ground test with a badly underestimated yield, okayed by a man with a backstory like something out of Marvel, it pales in comparison to any one of countless industrial accidents from that era. Five men stood directly under the burst of a Genie nuclear weapon, as part of a series of tests meant to encourage soldiers to do the same, and the greater risk was almost certainly the cigarettes they smoked while setting up the camera. And that's still not the most shocking thing Atomic Accidents lists, and lists as having less human risk than a long bus trip. Some of this is plain luck, but most of the luck is that no one was squished by a falling aircraft.

Most experts say that the outsized fear of nuclear risks comes as a result of their invisibility, but these scales aren't larger than other similarly invisible dangers. Many mainstream commentators, especially those contemporaneous with the dawn of the atomic age, describe nuclear risks like walking a tightrope between two moving aircraft. This feels less like a tiger waiting for an opportunity to strike, and more like a mule men spent years or decades stomping on before being surprised it kicked them.

Yet the modern age doesn't see BORAX-LXII, with its lead technician and designer still sending plumes of water into the desert air for yucks; we've been shutting down plants, often for rather spurious reasons. Nuclear waste reprocessing died in the United States in '77, strangled by the only man to both clean up a reactor spill and become President. Fukushima was not only a death knell for nuclear power in Japan, but stalled or rolled back nuclear power projects across much of the northern hemisphere, even in China. There is some small talk of modular or pebble-bed reactors, as well as sodium-cooled ones, but they're about as serious as the Polywell: wonderful if you could, but you probably can't. Where the last chapter of Atomic Accidents discusses alternative designs to what Mahaffey calls the "Rickover Trap" of overscaled pressure vessels, hard manufacturing constraints, and rigid design limits, it hasn't aged well: almost all of the options he considers the most serious contenders have been withdrawn, closed, or otherwise rendered nonfunctional, largely due to lack of interest, active opposition, or simply coming from Babcock & Wilcox. There is no taste for nuclear power; indeed, there are a few countries where support might map the bottom of the Lizardman's Constant.

((There's a certain morbid irony that this distaste for nuclear power has also stalled cleanup of radiation in some ways, but politically it's not too much of a surprise even if an accident, and the book either doesn't notice or doesn't dwell on it.))

On the gripping hand, it isn't hard to see why this distrust exists: it may be easy to blame tall tales of the China Syndrome (and, though the author doesn't mention it, Nuclear Winter), but there's been no surfeit of honesty from the pro-nuke side. Hanford is the typical Western go-to example, but instead Atomic Accidents unloads more broadly on the US Atomic Energy Commission and its equals in other countries, which across several chapters manage to directly or indirectly hide trivial accidents, downplay the scale but not the scope of larger ones, and generally just be idiots under the auspices of protecting their domain, making sure that the only times any civilians heard of a nuclear disaster, it would simultaneously have the smallest doses and the greatest fatalities, regardless of the actual truth. Plant operators insist on giving rosily optimistic scenarios, papering over numbers, eliding the distinction between what must happen and what could happen, or simply not bothering to talk at all.

"In the early days of nuclear power, it was an unwritten rule in the AEC that the public was not to be burdened with radiation release figures or the mention of minor contamination. It was true that the general population had no training in nuclear physics and radiation effects, and if given numbers with error bars and a map of an airborne radiation plume, imaginations could take control in nonproductive ways. Nobody wanted to cause a panic or unwarranted anguish or to undermine the public's fragile confidence in government-sponsored research. The results of such a policy are worse than what it is trying to forestall, as the government is commonly accused of purposefully withholding information, and misinformation rushes in to fill the vacuum. Conspiracy theories thrive. This fundamental problem of nuclear work has yet to be turned around."

I'd been generally frustrated by the prevalence of (what could charitably be called) linear no-threshold thinking among anti-nuclear activists, who would often be appalled by opposition to smoke detectors with higher raw radiation output, but in the face of these circumstances it's hard not to feel a bit of sympathetic paranoia. Where more overtly anti-nuclear writings could state their beliefs, this gave me a view of why those ideas might have taken root that I'm not sure the adherents themselves could have given. To borrow from McArdle: once you convince people you're willing to say anything to persuade them, you lose the ability to persuade them of anything else.

Not that this gives any routes toward fixing it, any reason to believe it is fixable, or any sign that anyone's interested in trying. It's not even a footnote, but the overlap between Three Mile Island's and Fukushima's public relations departments isn't encouraging. (Though Mahaffey seems to believe this book is one small step.)

There are three caveats I have to list, though.

I.

I can't call it comprehensive, as some others have. Along with minor excursions that sometimes straddle the line between accident, daily practice, and unauthorized experiment, the author also skips over the human radiation experiments, and cases like Crofut or Hahn, whether for political reasons or simply as out of scope. No take, one way or the other, on Silkwood. There's less focus outside of North America, the UK, Russia, and Japan, with little mention of Brazil, Switzerland, or Serbia. India's and France's respective attempts at sodium-cooled fast breeders get brief mention, but nothing of Narora or St. Laurent; Germany's Wuergassen (and a synchrotron?) reasonably gets attention, but Greifswald does not. While the politically minded are likely to find that the Radium Girl history gives the actual victims short shrift, the focus on the Curies, Edison and Dally, or Tesla at the expense of other X-ray experimenters may bug Europeans.

But if there's a more complete yet human-readable source, I'm not aware of it. And to be fair, most of the overlooked accidents (especially, as Mahaffey notes, the sodium-cooled reactor ones) start as out-of-scope and quickly become uninteresting.

II.

There are also some odd parts to the book.

Many of these are nitpicks. Sources I can find attribute Dally's death, and Edison's terrorized flight from X-rays, to attempts to develop fluoroscopy, not a fluorescent lamp, explaining Dally's constant hand-waving in the X-ray path. Edison did patent a fluorescent lamp using X-rays (rather than ultraviolet light, as modern lamps do), but the timeframe isn't right, and while the underlying principles overlap, some readers may come away with the wrong interpretation. Radithor's infamous "The Radium Water Worked Fine Until His Jaw Came Off" is attributed to the New York Times in the book, while the majority of other sources say it was in the Wall Street Journal (though Atomic Accidents is not the first to do so: Kristi Lew's Radium did it in 2009). The first chronic cases of radiation poisoning are footnoted as having likely occurred in copper mines of the Scneesburg region of Bohemia; most other sources call the region Schneeberg and, while it had some copper yield, hold its mines primarily as silver (historically) and cobalt (more recently) mines. The Ithaca M6 is described as chambered in .22 Long Rifle (a chambering present in the civilian Springfield variants, among others) rather than the .22 Hornet every source I can find lists for the military version. Each is minor, and may reflect limitations in my sources, simple changes in transliteration or writing, or incomplete records. It's possible my other sources are the result of citogenesis, and Mahaffey's in the right.

Some are stranger. The explanation and calculations for an atomic explosion in the chapter focused on the Little Boy and Fat Man bombings use an example calculation for a one-megaton device, while those two bombs combined only marked 50 kilotons; the footnotes mention that this 1 MT device is a theoretical example taken from a Glasstone work -- and, more seriously, I think misreads that paper's math -- but also describe 16 kT devices as "the effects scale down slightly", which I'm not sure even counts as technically correct. A footnote about the 1986 Browns Ferry fire notes it as "the only case of a nuclear-plant cooling tower being destroyed by fire", which might be strictly true but is a little misleading given that another fire destroyed 80% of a different tower at the same plant a decade later (as distinct from the minor arson in '87). A later bit repeats Blair's famous allegations regarding the ICBM eight-zeros passcode without referencing the Air Force's denials, and that's perhaps reasonable (that's what they would say); when it does so in specific reference to Permissive Action Links, which the Air Force claims were never used for Minuteman ICBMs and had no class with eight digits, this seems at least worth mentioning. The worst mistake I caught was the book saying there were no, rather than five, crew fatalities in the 1950 British Columbia B-36 crash, and that 15, rather than 16, crew died in the Lebanon, Ohio B-50 crash.
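For readers who want to check the scaling claim themselves: Glasstone's standard rule of thumb is that blast effects at a given overpressure occur at distances proportional to the cube root of yield, so a 16 kT device reaches a given overpressure at roughly a quarter of the distance a 1 MT device does -- a factor of four, not a "slight" scaling. A minimal sketch (the function name is my own, and the single cube-root exponent is a simplification; Glasstone's full treatment varies by effect):

```python
def scaled_distance(reference_distance_km: float,
                    reference_yield_kt: float,
                    target_yield_kt: float) -> float:
    """Distance at which a target-yield burst produces the same blast
    overpressure that a reference-yield burst produces at the reference
    distance, under simple cube-root (W^(1/3)) scaling."""
    return reference_distance_km * (target_yield_kt / reference_yield_kt) ** (1 / 3)

# Ratio of 16 kT blast distances to 1 MT blast distances:
ratio = scaled_distance(1.0, 1000.0, 16.0)
print(f"16 kT distances are about {ratio:.2f}x those of a 1 MT burst")
```

The ratio comes out near 0.25, which is why "the effects scale down slightly" reads as an understatement at best.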

Few, if any, of these mistakes undermine the author's work or intent. Indeed, I'm still not sure that my sources are 'better' than his: even for the B-50 crash, there seems to have been genuine confusion about how many people were aboard. It's not merely possible but plausible that these are accidents, or reflect information made more widely available after publication, or were picked up from other authors (again, Lew's Radium made that mistake earlier), or were just copyright traps.

But they do leave serious questions about Atomic Accidents' strongest area: its deep and humanized descriptions of events that normally receive description or analysis only in the most clinical form. Mahaffey consistently brings forth new and extremely detailed aspects of both high- and low-profile incidents, many of which involved chasing highly limited sources (sometimes in bizarre places: he says one early American atomic science paper was found only in Russian archives). In other cases, he recognizes that he's extrapolating from multiple conflicting tales, limited or manipulated public records of secret information, or, in a few cases, complete assumptions regarding the mental states of dead men. This puts his work on a whole different level from conventional criticality accident databases, but necessarily leaves readers dependent on an evaluation which can vary from well-grounded to extrapolative to explicitly fictional in a single page.

III.

It also leaves questions about the big theme of the book. While the last chapter is a fairly underwhelming summary of possible future trends in novel reactor design, a greater underlying motif throughout the work is the triumph of leaden system design as the sole and only route to actual safety. Where Feynman realized that even the most robotic and workaday tasks in a nuclear facility, up to and including unloading a lorry, needed workers who understood the unique threats involved when working with fissile material, Mahaffey finds that the complicated tasks needed to be redesigned into such simplicity that zombies could do them. For every Frisch, reacting quickly to complex and ambiguous stimuli, you'd see a dozen people causing those problems. "It is painful to notice, but some of the worst nuclear accidents were caused by reactor operator errors in which an automatic safety system was overridden by a thinking human being."

By Mahaffey's reckoning, it is not enough -- indeed, it may be actively counterproductive! -- to have the best and brightest minds available with a complete understanding of a system. If there is a bowl, or god forbid a spherical container, in a nuclear materials processing center, even if up a flight of stairs, someone will eventually turn it into a supercritical reactor. If there is a button labelled "do not press during a power outage", someone will press it during a power outage because they knew better. If there are precise instructions to assemble a sub-critical experiment, someone will optimize it to their own agonizing death. If there are filters that must be replaced, they will be forgotten by genuinely brilliant men laser-focused on cutting-edge physics until they erupt into a ball of flame. In a couple of cases, men ignored safety manuals they themselves wrote, or killed themselves rerunning old experiments shoddily despite decades of experience. One man, nicknamed the Chief Armorer of the United States, sat at the deathbed of an acute radiation sickness victim and then promptly optimized out considerations for gravity to his own (fatal) peril.

This is by no means a universal theme -- most notably, it's not heavily pressed for the Chernobyl incident -- but I'm not alone in seeing it. For those in the Site Reliability Engineering field or certain security-focused software spheres, this is a compelling story, and Mahaffey tells it well. In no small number of cases, he tells of heroic efforts to make disasters where none existed before, or to make new problems worse than the old, solved ones.

But it's hard to come away without feeling he put a bit of a thumb on the scale.

The most overt example comes as a footnote for Fukushima, on March 11th:

"The operators at Unit 1 were desperate to restart the isolation condensers or to initiate the HPCI, but there was no DC current from the backup batteries. They stormed what was left of the plant parking lot, where the 6,413 workers had left their cars, and started pulling batteries out of the cars that remained on site, and, where available, they took jumper cables out of trunks. They stacked batteries on the floor in the control room, connecting ten 12-volt batteries in series using the jumper cables to make 125 volts DC for the valve-control motors. There was not enough current to keep the system alive for more than a few minutes, but for an instant the control panel lighted up and they could see that the isolation condenser valves MO-2A and MO-3A indicated closed. At 6:18 pm they were able to apply power to the motors and open the valves. A lookout reported steam coming from the condenser pool, indicating success. Seven minutes later, an operator reached for the switch and closed the valves, just as the automobile battery-pack died. No reason for this action has been determined. The isolation condenser valves could not be opened."

This is not the single most Herculean and Sisyphean attempt at preventing a core breach in the book: another incident, at Chalk River, Mahaffey describes as using jumpers to override a robot's hardware interlocks in a desperate attempt to manage a burning fuel canister (which promptly dumped its fuel into a loading bay with no rad shielding, instead of the reactor it had previously been safety-locked over). It's just very nearly the leader in scale and degree, with thousands of bright people, dozens directly involved, showing deep understanding of their areas of expertise, under extreme conditions and hard time pressures, and goofing it up entirely at the very last minute.

But IEEE Spectrum and PBS say that the chain of batteries was used to power instrumentation, not motors, at Unit 1. According to the IAEA report, assuming I've read it correctly, this approach was used to activate pressure-release valves at Unit 3, hours after Unit 1's explosion on the evening of the 12th, and the isolation condenser valves were powered by AC, not the DC provided by car batteries. TEPCO's report says personal vehicle batteries were not scavenged until the 13th and were used in Units 2 and 3 (and also says that the valve was eventually reopened at 21:30).

The end result still points to a familiar scenario, but it's one driven by faulty communication, bad assumptions about undefined behaviors, insufficient redundancy, limited resources, and the normal problems of logistics. One could imagine a world less dependent on automated systems (and especially 'invisible' automated systems whose status cannot be verified externally), one which had correspondingly evolved better methods for reporting that status, and have it describe at least the official story better than Mahaffey's intended solution.

Of course, that's the imagination. In addition to Mahaffey's greater expertise and breadth of available resources on the general topic, this particular accident is one where both national and international reporters have cause to gloss their reports. For a well-supported example, his analysis of the SL-1 accident favors the murder-suicide or death-by-misadventure hypothesis, and gives a far more compelling answer than the official report (a fun read, but one whose surest takeaway is that it wasn't a chemical explosive).

But he also felt it worthwhile to include a reference about a Russian scientist thinking about Matryoshka dolls, admitting that he includes it not because it happened (or that it could even be known whether it happened), but because it fit the tone.

Without that certainty the book falters.

It's still interesting to hear about all these varied failure modes. But they're simply too eclectic for object-level solutions to come across as anything but a thousand one-off band-aids, useful in hindsight but individually closing off only tiny fractions of disaster possibility space. We already have a goal of making reactors safe. Making reactors -- or anything else -- genius-proof, more than idiot-proof, hits the sweet spot between possible, novel, and counter-intuitive, something that genuinely can't be done from within conventional models. And that might still be true!

The other ramification is that this is not, despite first appearances, Gell-Mann amnesia. Mahaffey isn't simultaneously a Japanese reactor operator and a hundred-year-old American uranium miner, but he comes to atomics not as an outsider scrabbling for the pretense of newsworthiness, nor a popular science writer assembling bits and bobs. The man has literally lived this topic most of his entire adult life; he has been part of a team installing safety equipment in a nuclear reactor, and he played a role in showing errors in the original Cold Fusion experiments. His acknowledgements page thanks a handful of PhDs for checking his work, including people who were in the buildings where many incidents (liquid sodium reactors, everybody!) occurred. He's not merely been focused on this topic longer than it would have been possible for me to apply for the necessary security clearances, but longer than most people reading this have been alive. Nor, to be very clear, am I claiming these discrepancies come across as Bellesilesean malfeasance: there are too many places where he could have put a heavier thumb toward his philosophy without my being able to verify it, and yet did not, including many first-hand experiences where the version he does provide is simply too complicated. Where he does spice up anecdotes, he is clear about what is authorial intervention and what was hard reality. Indeed, I'll give credit that few authors earn: where there's cause to explore the borders of the popular story, Mahaffey gives useful footnotes starting with questions like "Or were they?", explaining limits to his confidence in places most writers would have just left an ibid.

Likewise, Pegasus Books is a smaller independent publisher, but not one known for its lackadaisical editing.

There's a reason I emphasize these as mistakes, just as possibly on my part as on Mahaffey's. I bring them up not to call him out, nor as a mark against a wonderfully written book. There are few non-fiction works that lack at least a few oddities of this sort. But even earnest, honest, expert analysis by a recognized mind in his field might, at least possibly, provide a more complete understanding when read with hard-nosed disbelief and a drive to verify.

True absolute skepticism is, if not impossible, at least impractical. I literally can't check every fact in this book, with some only observable through a time machine. Others require a specialized classification I couldn't get if I dedicated my career to the field. Yet others are simply beyond my abilities to understand the underlying math or physics, even had I the tools and base information. At the same time, it's unusual here how much I can at least attempt to confirm: compared to the typical field focused on the past, nuclear history has a lot of dedicated adherents, sometimes on opposite sides, and far more objectively measurable impacts in a more recent and better-documented period. Even attempting to research Usenet histories would run into as many or more dead ends. Trying to retreat to evidence that you can examine alone quickly turns into mere radical skepticism, more of philosophical than epistemic interest.

But ultimately, you cannot actually outsource the requirement to evaluate truth. You can only decide what you're evaluating.

