THE DEMISE OF NUCLEAR ENERGY? Lessons for Democratic Control of Technology
Joseph G. Morone and Edward J. Woodhouse
New Haven: Yale University Press, April 1989
Rating: 5.0 High
ISBN-13 978-0-300-04449-2 | ISBN-10 0-300-04449-6 | 172pp. | HC/GSI | $21.00
The authors' purpose in researching and writing this book is not to probe the variegated question of why nuclear power failed in the United States; as they note, many excellent analyses already in print cover that subject. Rather, their goal is to understand why it did not avoid failure. More specifically, they investigate why, in a democracy, a multibillion-dollar investment in a promising new technology was not steered toward configurations that were safer and less costly, and hence more acceptable to the public.
"Perhaps the most important thing to know about nuclear technology is that there are dozens of ways to design nuclear reactors, and a growing number of nuclear technologists believe it is possible to construct reactors invulnerable to catastrophic accidents."

"If nuclear power is to be discontinued, this should be based on a clear-eyed assessment of our experience through the 1980s. If nuclear power is to win a second chance, surely we ought to know what to avoid and how to do it better. The purpose of this book is to contribute to that scrutiny." – Pages 25-26
Two roads diverged at the start of reactor development in the U.S. One led to types and designs costing the least to develop, with safety features engineered in through redundancy and conservative tolerances. The other led to painstaking research, with the goal of maximizing safety through the laws of physics and the inherent physical characteristics of the reactors.
The old AEC took the road most traveled by — and that has made all the difference.
The authors ask if an inherently safe reactor was possible. They provide this answer:
"The answer appears to be yes. As early as 1956, General Atomic set out to build a reactor that was "so safe it could be given to a bunch of high school children to play with, without any fear that they would get hurt." As envisioned by Edward Teller, the reactor would have "inherent safety," or safety "guaranteed by the laws of nature and not merely by the details of its engineering . . . [If] all its control rods [were] instantaneously removed, it would settle down to a steady level of operation without melting any of its fuel." The reactor, known as TRIGA (Training/Research/Isotopes/General Atomic), was successfully developed. More than sixty TRIGA reactors were sold, primarily to hospitals and universities that needed low-power reactors for research, producing radioisotopes, and training students." – Pages 106-107
As they note, there are dozens of types of reactors. Gas-cooled reactors, like the one that operated commercially for years at Fort St. Vrain (see sidebar), can operate at higher temperatures and so are more efficient than the light-water type. They are also easier to decontaminate. Then there is the Integral Fast Reactor, a type of breeder reactor that promised to use uranium more efficiently, produce far less long-lived radioactive waste, be inherently immune to meltdowns, and (with "pyroprocessing") resist diversion of plutonium. Its development began in 1951 with EBR-I, where many of its promises were proven valid. It still promises to solve those vexing problems of nuclear power, but we have never completed its development.
The Navy and the AEC began their development of reactors using very different approaches. The AEC recognized that accidents were well-nigh inevitable with this new technology; hence, it sited all nuclear power plants in remote areas chosen for stable geology and little chance of extreme weather (e.g. tornadoes) and required the additional measure of a containment shell around the core, to minimize release of radioactivity in the event of equipment failure. The Navy had no option to use these measures; its nuclear submarines perforce cruised the world's oceans, were home-ported near populated areas, and could not tolerate the weight of containment shells. Accordingly, Navy reactor designs focused on very conservative engineering tolerances and multiple redundant systems. In other words, the Navy approach was to minimize the chance of catastrophic failure; the AEC chose to minimize the cost of catastrophic failure.
The AEC soon adopted the Navy's approach as well, requiring conservative tolerances and redundancy in all critical systems from sensors to emergency cooling systems to multiple sources of power to keep the other critical systems running if main power was lost. The authors compare their situation to that of a novice trapeze artist who, while learning to perform, enjoys a variety of protective devices including a safety rope to prevent falls and a net to prevent injury if a fall does occur.1
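The arithmetic behind the redundancy strategy can be sketched in a few lines. Assuming, for illustration only, that redundant trains fail independently with some per-train demand-failure probability (the numbers below are invented, not actual reactor data):

```python
# Illustrative redundancy arithmetic. Assumes statistically independent
# trains and a hypothetical per-train failure probability -- these are
# not figures from the book or from any actual safety study.

def all_fail(p_single: float, n_trains: int) -> float:
    """Probability that every one of n redundant trains fails on demand,
    given independent failures with probability p_single each."""
    return p_single ** n_trains

p = 0.01  # assumed 1% chance a single train fails when called upon
print(all_fail(p, 1))  # one train: 0.01
print(all_fail(p, 2))  # two trains: odds shrink to roughly 1e-4
print(all_fail(p, 3))  # three trains: roughly 1e-6
```

The catch, which the authors' later discussion of public distrust makes vivid, is the independence assumption: common-cause failures (a shared power supply, a shared maintenance error) can defeat all trains at once, so the true risk never shrinks as fast as the multiplication suggests.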
Located near Platteville, Fort St. Vrain was Colorado's only commercial nuclear reactor and one of the few that used mixed thorium/uranium fuel. It was a helium-cooled, graphite-moderated reactor of the type generally known as the HTGR (high-temperature gas-cooled reactor). As such, it operated at roughly 700°C and achieved 39-40% thermal efficiency, significantly better than light-water reactors (or coal plants). It was rated at 330 MWe and operated commercially from 1979 until its shutdown in 1989.
Fort St. Vrain was a successor to the experimental HTGR at Peach Bottom in Pennsylvania, which ran from 1966 to 1974. Some of the problems at FSV stemmed from design choices that differed from Peach Bottom and other HTGRs; one of them led to frequent water infiltration and corrosion. Contractor errors also caused trouble at FSV; one incident is reminiscent of the fire at Browns Ferry. Technically successful but a commercial disappointment to its operator, it was decommissioned by 1995. The site now houses multiple conventional generators fired by natural gas. At Peach Bottom, the HTGR was replaced by twin boiling-water reactors made by GE. In the UK, however, seven power stations, each operating two second-generation gas-cooled reactors (AGRs), provide commercial power to this day.
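The sidebar's efficiency claim follows directly from thermodynamics: the Carnot bound, eta = 1 - T_cold/T_hot (temperatures in kelvin), rises with operating temperature. A quick sketch with assumed, illustrative temperatures (roughly 700°C for an HTGR-class core, roughly 300°C for a light-water reactor; these are not plant specifications):

```python
# Carnot upper bound on thermal efficiency: eta = 1 - T_cold / T_hot.
# The hot-side temperatures below are illustrative assumptions.

def carnot_limit(t_hot_c: float, t_cold_c: float = 27.0) -> float:
    """Ideal (Carnot) efficiency for hot/cold temperatures given in Celsius."""
    t_hot_k = t_hot_c + 273.15
    t_cold_k = t_cold_c + 273.15
    return 1.0 - t_cold_k / t_hot_k

print(round(carnot_limit(700.0), 2))  # HTGR-class temperature -> 0.69
print(round(carnot_limit(300.0), 2))  # light-water-class temperature -> 0.48
```

Real plants recover only a fraction of the Carnot limit, but the ordering is what matters: the hotter gas-cooled cycle explains Fort St. Vrain's 39-40% thermal efficiency versus the roughly 33% typical of light-water plants.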
Things changed in the late 1960s. Based on their experience with fossil-fuel-fired steam plants, General Electric and Westinghouse began boosting the rated capacities of their reactors to achieve economies of scale, letting them lower prices.2 This led to a surge of orders from utilities. By 1970, American utilities had ordered some one hundred reactors. Construction of these reactors was accompanied by continuing news accounts of faked pipe weld x-rays and other shoddy construction practices, furthering public distrust of the nuclear industry.
It soon became apparent that containment of the maximum credible accident in such a large reactor (producing 1,000 MWe or more) could not be assured by any device it was economical to build. The result was that the AEC changed its safety strategy from containing accidents to preventing them. What this meant in practice was preventing a core meltdown. The AEC attempted to do this by mandating hugely beefed-up cooling systems. Typically, there would be a low-pressure cooling system and a high-pressure cooling system, each of which had to be redundant. That is, redundant pumps, redundant valves, redundant pipes, redundant controls. Then there had to be redundant elevated tankage and piping to flood the core if the pumped cooling systems all failed. All of this had to be scaled to match the enormous heat capacity of the large reactor core. (These pumps and pipes and valves are not small components.) And every redundant cooling system had to have multiple power sources. Finally, the containment domes had to be larger and stronger. You can begin to see why reactor costs headed for the stratosphere.
Protests by anti-nuclear groups mounted through the 1970s, but the public by and large held to an uneasy truce with the industry. Then, in 1979, came the meltdown at Three Mile Island. Although little radioactivity escaped the plant3 and no deaths or injuries resulted, this was a disaster for public acceptance of nuclear power in America. Existing orders were cancelled, new orders never came. Utilities were forced to recognize that nuclear power had become unaffordable. The authors note:
"This downward spiral of events was as irrevocable as it was rapid. One difficulty fueled the next, and the rate at which this happened seemed to accelerate. Yet as varied and complex as these difficulties became, one set of decisions stands out as the catalyst, the central cause of the demise of American nuclear power: the rapid scale-up in the number and size of reactors and the resulting shift in safety strategy." – Page 88 |
To extend the trapeze-artist analogy beyond the authors' use of it, it's as if the trapeze artists had all become obese. Not only would absurdly costly protections be needed for them, but the public, anticipating a disaster, would not pay to see them perform. Knowing these things, no circus would hire them. Opinion polls bear out the nuclear industry's fall from grace. The authors note that public trust of the AEC faded markedly over the period 1965-1972,4 just when manufacturers were scaling up their reactors and the AEC was shifting its rules to match. They point out that the AEC had trapped itself, because it could not prove a maximum credible accident would never happen. Despite all its heroic efforts at prevention, and its multiple safety analyses based on extensive PRA (probabilistic risk assessment) studies, it failed to reassure the public.
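A PRA estimate is, at bottom, a product of an initiating-event frequency and conditional system-failure probabilities along an event tree. A minimal sketch, with all numbers invented for illustration (they come from no actual reactor study), shows why such an analysis can never yield zero:

```python
# Minimal event-tree-style PRA sketch. Every number below is a
# hypothetical assumption for illustration, not data from any study.

initiating_freq = 0.1        # assumed initiating events per reactor-year
p_cooling_fails = 1e-3       # assumed emergency cooling fails on demand
p_backup_power_fails = 1e-2  # assumed all backup power sources fail

# Estimated core-damage frequency: the accident sequence requires the
# initiator AND the cooling failure AND the power failure.
core_damage_freq = initiating_freq * p_cooling_fails * p_backup_power_fails
print(core_damage_freq)  # small, but never provably zero
```

However small the product, it is a probability, not a guarantee, which is precisely the trap the authors describe: the AEC could shrink the number but could never prove the maximum credible accident impossible.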
The human perception of risk differs from the AEC's analytical approach in several important ways (paraphrased from page 94).
In addition, the public evaluates a risky undertaking in relation to its perceived benefits to each individual, and to the magnitude of its potential for catastrophe. Nuclear power provided a small perceived benefit (pollution-free power) and presented the possibility (remote but real) of unprecedented catastrophe.5
The authors conclude their analysis with two chapters in which they first revisit their original question, summarizing their findings, and finally assess the possibility that the public might be persuaded to try nuclear power once again. They think this is unlikely, since there is little prospect of overturning the dominance of light-water reactors in commercial plants, and since most Western nations show signs of abandoning nuclear power.6 They note prophetically that even those holding out might change their policy in the face of another accident like Chernobyl.7 But they also observe that there are points in favor of nuclear: the environmental harms of coal; the impending depletion of easy oil; the ever-clearer threat of global climate change. They recommend a series of institutional changes that must be in place to make such a nuclear renaissance happen. The first step is to turn the original precept of development on its head: Instead of rapidly building bunches of bigger and bigger reactors and then adding complicated safety features, patiently research the inherent safety of several reactor types and test these in carefully monitored prototype reactors over several years. Only proceed with commercial deployment once the degree of inherent safety of the various types is well understood.
"Admittedly, the odds of revising the inherited approach do not look very favorable, and the feasibility of reshaping the technology into an acceptable form is not fully knowable until we try it. But just as nuclear technology was initially shaped by political and economic forces a generation ago, so it can be reshaped if a political majority becomes willing to undertake the task." – Page 155 |
As can be seen from the length of the list of books they cite, the authors have been extraordinarily thorough in researching this question. While clear-eyed over the causes of the massive failure of nuclear reactor development in the United States, they hold a rational hope that a fresh start can be made and that, this time, a successful development can result. Their analysis is cogent. The book is also admirably free of errors. I give it my top recommendation and rate it a keeper.