What I Think About When I Think About Fukushima


Learning lessons the hard way from the Fukushima disaster, the Japanese government implemented new safety requirements for nuclear plants on July 8, 2013. Risk assessment consultant Woody Epstein shared his thoughts on the Fukushima accident and the use of nuclear energy at a symposium in Hiroshima in March 2013. Here we present excerpts from his speech.

I would like to begin a conversation with you about risk and society. I hope that my words will help form a basis for honest relationships with the public and policy makers.

Because of the accident at Fukushima Daiichi, we are here today speaking about nuclear energy. The accident of 3/11 has focused the attention of the world on whether it is possible to use nuclear energy in an acceptably safe way.

I am neither pro– nor anti–nuclear power. I am, however, pro-honesty and anti-dishonesty. I am a mathematician who has worked for 30 years with technologists who analyze risk. Two goals for a risk analyst are to provide reasoned evidence to decision makers and to give clear, down-to-earth explanations to the public.

The Meaning of “Risk”

Risk (and therefore safety) is the answer to three questions:

(1) What can go wrong?
(2) How likely is it?
(3) What are the consequences?
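In probabilistic risk assessment, these three questions are often written down as a set of scenario triplets: what goes wrong, how likely it is, and what the consequence would be. The sketch below is a minimal, hypothetical illustration in Python; the scenarios and numbers are invented for this example and are not taken from the speech or any plant study.

```python
# A minimal sketch (hypothetical numbers): the three questions of risk
# expressed as triplets of scenario, likelihood, and consequence.

scenarios = [
    # (what can go wrong,                  how likely per year,  consequence)
    ("pump fails, backup works",           1e-2,  "no damage"),
    ("pump and backup both fail",          1e-4,  "core damage, no release"),
    ("core damage, containment breached",  1e-5,  "large radiation release"),
]

for event, frequency, consequence in scenarios:
    print(f"{event}: about once every {1 / frequency:,.0f} years -> {consequence}")
```

Listing the triplets side by side makes the trade-off visible: the scenarios become rarer as their consequences become more severe.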

Nuclear power generation has, and always will have, risks. Nothing is 100% safe. Kurokawa Kiyoshi, the chairman of the Diet’s independent commission that investigated the Fukushima accident, said: “Accidents happen, machines break, and humans make errors.”

Over the years, most regulators have endorsed a safety goal stating that, for each reactor, the likelihood of a core damage accident must be no greater than once every 10,000 years of operation, and the likelihood of a large release of radiation no greater than once every 100,000 years.

As of March 10, 2011, there were 438 commercial nuclear power generating units in the world. If each unit were operating 70% of the time, and each unit were exactly as safe as the goals I have just described, then a core damage accident could be expected, somewhere in the world, about three times every 100 years. You should expect two or three core damage accidents during your lifetime.

Now, a core damage accident does not necessarily mean a radiation release; it does not mean a Fukushima or a Chernobyl. If we apply the same reasoning to large releases of radiation, then we should expect an accident such as Fukushima about once every 330 years.
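The arithmetic behind these figures is easy to check. The sketch below uses only the numbers quoted above (438 units, 70% operating time, and the two safety-goal frequencies) and reproduces the “three per century” and “once every 330 years” estimates:

```python
# Checking the speech's back-of-envelope arithmetic.
units = 438               # commercial reactors worldwide, March 10, 2011
availability = 0.70       # assume each unit operates 70% of the time
reactor_years_per_year = units * availability   # about 307

core_damage_freq = 1 / 10_000     # safety goal: per reactor-year of operation
large_release_freq = 1 / 100_000  # safety goal: per reactor-year of operation

cd_per_century = core_damage_freq * reactor_years_per_year * 100
years_per_release = 1 / (large_release_freq * reactor_years_per_year)

print(f"Core damage accidents per century: {cd_per_century:.1f}")  # about 3.1
print(f"Years between large releases: {years_per_release:.0f}")    # about 326
```

These are fleet-wide expected frequencies under the stated assumptions, not predictions for any particular plant, and they scale directly with the number of operating reactors.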

So here is my question to you: Are you willing to accept nuclear power if the likelihood of a large radiation release, somewhere in the world, is about once every 330 years? Realize that this likelihood will grow as more nuclear power plants are built. Realize also that we live in a global village: an accident in China will affect lives in Okinawa.

I am talking only about the likelihood of an accident, not the consequence that you will die or become sick from one. You are at much greater risk (in both likelihood and consequence) from being in a car crash, drinking too much alcohol, eating fatty foods, or smoking cigarettes.

The Emotion Factor

Nuclear power has benefits: a smaller carbon footprint, reduced reliance on oil, a better balance of trade, less pollution from burning fossil fuels. Yet somehow these benefits carry little weight in our thinking, because we are talking about nuclear power. We are dealing with the human perception of risk.

Technological risks to the public will not go away if we abandon nuclear power. Look at all of the gas, oil, and LNG tanks on the coasts of Japan. We have done studies on the impacts of large earthquakes, tsunamis, and typhoons on the chemical, oil, and gas industries. Believe me, there are dangerous accident scenarios with environmental, social, and economic consequences that could approach those of Fukushima.

Policymakers tend to borrow ideas from economics to create models for risk perception. One key assumption is that individuals and society behave in a rational manner. If we could provide people with more or better information, everyone would make more logical, rational, and informed decisions regarding risk.

But people do not behave rationally. Rationality is only one part of decision-making; emotions are just as important. Those who ignore the role of emotions in decision-making have missed the point of the human condition.

Conversation with the Public

How can technologists more effectively work with the public and government? We need to be better listeners. We need to understand the issues important to all of the people involved. We need to be better at giving easy-to-understand explanations.

Part of the challenge is that we, in the technical community, have disengaged from public discourse to a large extent. A hundred years ago, the latest scientific theories and discoveries were discussed in leading newspapers in a way that an educated person could understand. There was a dialog with the public, and science was a subject of popular discussion.

What happened? We in the scientific community have become isolated and arrogant; we have lost the art of conversation with the public.

Soon after the Fukushima accident, I was at a town meeting in Tōkaimura. During the meeting, a very worried mother asked one of the sensei leading the meeting about the amount of radiation that could harm her children. And the learned sensei told her to look it up on the Internet.

The Art of Being Understood

How are we to understand the framework within which the evidence that technologists develop is interpreted and applied?

Step one is learning to listen. When you begin to listen, you begin to build a bridge.

The second step is for everyone to ask each other hard questions. Asking questions is the road to knowledge. Remember that the definition of risk above seeks answers to three questions. Technologists must bring the spirit of questioning to the decision-makers we support and, in the case of technologies with the potential to harm the public, to the people as well.

Step three is to realize that science is never exact or final. Science changes over the years, sometimes with unanticipated discoveries. We must help you understand the uncertainty, how to “expect the unexpected,” and how to create resilient institutions that can flexibly respond to an accident when—not if—it happens.

Safety Is a Perception

After Fukushima, many of my Japanese friends asked me, “When is safe safe enough?” I answered them, “Safe is safe enough when you say it is safe enough. How do you want to live? What risks are you willing to accept to live the way you want to live?”

There are no magic answers. There are only difficult choices, some of which have objective measures, and some of which have to be made with the heart. What is important is that we all take responsibility for our decisions, especially if the decisions go wrong as they did at Fukushima Daiichi. When you’re wrong, be the first to admit it, even if you’re the last one to know.

Risk is a perception, a feeling, and we as technologists must be honest with the public and the policy makers by saying truthfully what we know and what we do not. All of us must be in this together, in a continual conversation, trying to do the right thing.

(Excerpted from a speech given on March 25, 2013, at the third Global Environmental Leaders International Symposium at Hiroshima University. The full text is available at Woody Epstein’s site.)
