So why are we so collectively scared of nuclear power?
Bradford Plumer interviewed some cultural historians in an attempt to answer that question. Our fears of nuclear power have a long history and predate World War Two and Hiroshima. Movies about the dangers of radiation were being made as early as the 1930s.
In 1928 a lawsuit brought by the “radium girls” against the United States Radium factory in New Jersey was settled. Beginning in 1917, United States Radium had employed thousands of people to paint radium pigments onto watch dials to make the dials glow in the dark. The workers were encouraged to sharpen the points of their brushes with their lips and tongues. The health results were predictably catastrophic: anemia, bone fractures, and necrosis of the jaw. The lawsuit proved to be a landmark in labor relations and worker safety. The implications for our fear of radiation seem obvious.
Radiation is invisible. It can’t be detected without special equipment, so there is no way for someone to see the risk beforehand. A refinery or a coal plant next to a neighborhood is something that can be easily seen and easily incorporated into the phenomenology of risk. Individuals can, in theory, choose how close they want to live to an industrial plant. The actual risk of disaster may still be hidden because of industrial cover-up or lax regulation, but that perceived choice seems to assuage many fears, at least until the next explosion.
Plumer quotes the historian Spencer Weart about the effect of nuclear scientists on the safety perceptions of the public.
But Weart notes that nuclear scientists themselves also hyped the danger of reactors. “Every worry you hear about nuclear reactors exploding, about meltdowns, every exaggerated scenario originated with some nuclear scientist or engineer,” he says. In the postwar era, many of these scientists fretted that the nuclear industry would adopt overly lax standards. Edward Teller, “the father of the hydrogen bomb,” was a big proponent of civilian nuclear power (in 1979, he suffered a heart attack and blamed it on Jane Fonda’s anti-nuclear activism). But he spent a lot of time agitating for safety rules, playing a role in persuading U.S. reactor operators to install containment vessels (something Chernobyl, tragically, lacked). Such warnings, ironically, both made the industry safer and stoked fears of catastrophe.
There is a common argument in discussions of expert/non-expert communication about the need for transparency. Just how much information should we share with the general public about the risks of nuclear power or any other technological system? Almost the entire content of the academic journal Public Understanding of Science deals with this problem. Most of the time the answer is to increase transparency and information sharing.
But the example of nuclear power shows how this increased honesty about risk may make the general public more afraid. If you accept that transparency is a primary goal, then the only way out of this conundrum is to be transparent about everything. In principle that feels like the correct way to go, but it really just pushes the problem up to the level of information management. How does the general public assess the overwhelming amount of risk information being presented? It was hard enough to do that task when focused on nuclear power alone. Must we now become lay experts on everything?