NYT, April 12, 2011 - The condition of the damaged Fukushima Daiichi reactors in Japan is “static,” but with improvised cooling efforts they are “not stable,” the chairman of the Nuclear Regulatory Commission told a Senate committee on Tuesday.
Reading between the lines, the NRC chair is saying, in effect, that events unfolding at the Japanese nuclear plants remain as they have been since the crisis began—out of control.
If you were to glean just one single idea from reading my blog, my hope would be that you would come to really, really understand the difference between “in control” and “out of control”. If you get this, you will understand all you need to know about 3-sigma and the nature of human knowing.
For the sake of simplicity, it would not be incorrect to say that “predictability” and “control” mean the same thing. A system is any process or group of processes that behave in a predictable manner within some knowable limits. If a “system” is not predictable—i.e. not in control—it is not a system. It is simply unknowable chaos.
Now here’s where 3-sigma comes in. As we all should know, we cannot predict anything perfectly because there is variation in every pattern—snowflakes, grains of sand, people’s appearance and personality, galaxies and stars and the seasons—but for practical purposes we can predict within useful limits the general nature and behavior of these and other things that we find interesting and relevant to our lives.
But how can we know when there is a predictable pattern—a system—or no pattern at all? Walter Shewhart used the statistical value of 3-sigma, or 3 standard deviations from a measure of central tendency—the mean or median of a set of measurement data—as a dividing line between predictable processes and utter chaos. A system that produces consistent outcomes within 3-sigma limits, even if those outcomes vary widely, can be said to be “stable”. Is 3-sigma exactly the right value? There is no “correct” value, but it turns out that 3-sigma works pretty well. The really important thing, though, is the idea itself—the difference between processes that signal evidence of control and those that give off no such signals and are therefore, for all intents and purposes, out of control.
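To make the idea concrete, here is a minimal sketch of Shewhart's 3-sigma rule in Python. It is a simplification for illustration—real control charts typically estimate sigma from subgroup ranges rather than the plain standard deviation, and the data below are hypothetical—but it shows the core test: establish limits from a period when the process was believed stable, then flag any reading that falls outside them.

```python
# Minimal sketch of Shewhart's 3-sigma rule (illustrative only; real
# control charts estimate sigma from subgroup ranges, not the plain
# standard deviation used here).
from statistics import mean, stdev

def control_limits(baseline):
    """Lower and upper limits: mean +/- 3 standard deviations of a
    baseline period during which the process was believed stable."""
    m, s = mean(baseline), stdev(baseline)
    return m - 3 * s, m + 3 * s

def out_of_control(readings, baseline):
    """Indices of readings outside the baseline's 3-sigma limits --
    signals that the process may no longer be predictable."""
    lo, hi = control_limits(baseline)
    return [i for i, x in enumerate(readings) if x < lo or x > hi]

# Hypothetical data: a stable baseline varying around 10, then new
# readings that include one point far outside the established pattern.
baseline = [10.1, 9.8, 10.3, 9.9, 10.0, 10.2, 9.7, 10.1, 10.0, 9.9]
readings = [10.2, 9.6, 10.4, 12.5]
print(out_of_control(readings, baseline))  # only the 12.5 reading is flagged
```

A point inside the limits is no cause for alarm even if it differs from its neighbors—that is just the ordinary variation of a stable system. A point outside them is a signal that the old pattern no longer predicts the process.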
So the NRC chairman is saying that the Japanese nuclear reactors are not stable. There is no pattern we can reference to predict what will happen next, but sooner or later things will happen next, and we can only guess whether those things will harm us. We just have no way of knowing, and KNOWING is the whole game, because if we have no way of knowing it is impossible for us to know what to do next. We can do nothing more than roll the dice and cross our fingers.
I know of course that we do know some things about nuclear power plants. We know that if everything goes as planned, we can control the nuclear reactions, contain the radiation, and generate steam to turn turbines and make electricity. We also know that if everything does not go as planned, the nuclear reactions will become “uncontrolled” and do things that we do not fully understand. What will the melting core really do? How will the radiation released into the groundwater, air, and oceans affect life on earth next week? Next year? A hundred or a thousand years hence? Roll the dice.
In the final analysis, all knowing is a crap shoot—a bet that the patterns we observe and predictions we make will turn out as planned. We bet that cars will stop at the stop light every time we cross the street—true enough most of the time. We bet that the jobs we work will not kill us. We bet that the ladder we stand on to fix the roof will not fail. We bet that our friends and lovers will act in our best interest. We bet that the sun will rise tomorrow morning—so far so good.
But not all bets are created equal. We can bet a quarter on the next roll of the dice—the tomatoes we planted will produce fruit next month—or we can bet the farm, as when we bet that unforeseen events will not send a nuclear power plant out of control, spreading plutonium—with a half-life of roughly 24,000 years and often described as one of the deadliest toxins KNOWN to man—throughout the oceans and atmosphere of our planet.
Over the course of our history as the one and only betting species—Homo sapiens predictus—we have continued to up the stakes of the game. In a 1998 article, “A Special Moment in History” in The Atlantic, Bill McKibben paints a very clear picture of the interconnectedness of the systems we routinely bet on—of how no single bet we make is independent of all the others—and how the stakes have risen and continue to rise. In part two of his article he makes a very tangible point: insurance companies—gamblers one and all—are in the business of knowing how systems vary and when prediction is impossible—of knowing when a system is out of control. The article stands up as well today as it did when it was written, and I recommend reading it.