Redesigning Resilient Infrastructure Research is a lecture I gave at Ohio State University about what we need to learn to ensure that storms, crashes, and accidents do not become catastrophes.

The Fallacy of the UNpossible

How to surprise yourself with disaster

Thomas P Seager, PhD
6 min read · Dec 16, 2018


Dan Eisenberg, PhD, Lucien Hollins, and Marcus Snell contributed to this article, which was originally published by The Urban Resilience Research Network.

In the popular US cartoon television series The Simpsons, there is a brief scene in which one of the child characters is placed on academic probation.

Ralphie from The Simpsons exclaims “Me fail English? That’s unpossible!”

As viewers, it strikes us as ridiculous to witness a character so unaware of his own cognitive limitations. We sense the paradox: his lack of awareness ensures that he will never become aware.

On a cartoon series, the stupidity of a cherubic elementary school student is comedic. In infrastructure management, such a lack of self awareness can be tragic.

Overconfidence in the technology of safety features built into the RMS Titanic had the perverse consequence of putting the vessel at greater risk.

Sometimes these failures can be attributed to distortions of perception called cognitive bias.

Common human biases, fallacies, and systematic errors are now so well documented that Wikipedia has catalogued an extensive list. For example, the Normalcy Bias describes the common misconception that conflates unprecedented with impossible. We often hear the argument that because something has never happened before, it could never happen. The Normalcy Bias captures this distortion of perception well, but the reasoning is so dangerous, and appears so often in infrastructure systems, that it deserves a more memorable name.

When unprecedented is equated with impossible, we shall call it The Fallacy of the UNpossible.

The Fallacy of the UNpossible describes the phenomenon we observe when otherwise reasonable people confuse the rare or highly unlikely with the impossible. While some things we can imagine really are impossible, such as traveling faster than the speed of light or violating the second law of thermodynamics, unprecedented floods, earthquakes, stock market crashes, and even election outcomes are not subject to such physical constraints.

For example, consider the flooding of California’s Oroville Dam, still the tallest dam in the United States. Officials recently ordered a mandatory evacuation of communities downstream, for fear the 770-foot edifice would collapse due to the simultaneous failure of the main and emergency spillways — an unlikely failure scenario environmental groups predicted back in 2005. In a public hearing held as part of a relicensing approval process, these groups argued that the emergency spillway was unsafe. According to them, should the spillway ever be activated, the flow of water over the top would erode the supporting embankment and eventually lead to collapse.

But the dam, built in the late 1960s to provide flood control and water diversion from northern to southern California, had never before reached water levels that necessitated use of the emergency spillway. Over three decades of safe operation at the dam convinced officials that the “facilities, including the spillway, are safe during any conceivable flood event.”

Herein lies the fallacy. Dam officials were convinced their facilities were safe because they could not conceive of any scenario in which they weren’t. In the face of criticism of the spillway design, they held to the view that it was safe despite previous surprises, including a 1997 evacuation in response to the failure of some downstream levees.

In 2011, six years after the relicensing hearing, water levels climbed within 11 inches of the Oroville emergency spillway. Such a near-miss might drive officials to rethink the security of the spillway, or perhaps conduct tests of spillway performance. However, in cases such as this, officials sometimes mistake near-misses for proof of the graceful extensibility of their system, rather than treating them as a warning that failure is more likely than previously thought.

The lessons of the Oroville Dam and recognition of the Fallacy of the UNpossible might inform other potential catastrophe scenarios.

One may be found in extreme heat. Temperatures in Phoenix, Arizona recently neared 49 degrees Celsius. While such extreme readings were previously unheard of for a major US population center, a combination of climate change and urbanization might push future temperatures even higher — especially in the desert southwestern US, where Phoenix is a major transportation, economic, and cultural hub.

At the component level, the physical response of infrastructure to such extremes is predictable. For example, record high temperatures cause record high demand for electric power (for air conditioning), at exactly the time when power plants and transmission lines are operating at lower efficiencies (due to the higher ambient temperatures). However, at the larger complex systems scale, the consequences become unpredictable, and potentially catastrophic.

Should a cascading power failure occur during an extreme heat event, individuals would no doubt seek refuge in the mountains or towns surrounding Phoenix. Yet gasoline stations, dependent upon the city’s electrical grid, would be unable to refuel private cars. For those with fuel, traffic signal and rail crossing outages may result in gridlock, overwhelming taxed emergency response crews. Water pumps would fail and, when elevated storage reserves became depleted, there would be no pressure in the distribution system for supplying water for drinking, firefighting, evaporative cooling, or irrigation. Sewage pump stations and treatment plants without backup power sources would overflow, and those with backup generation would be difficult to resupply. Airborne evacuation would be complicated by the fact that helicopters and jets are not rated for flight in temperatures approaching 50 degrees C.

A 2011 power outage in neighboring Mesa, Arizona that left over 100,000 people without power for 11 hours gave citizens and officials a brief glimpse of what such a catastrophe might look like. However, the fact that the potential catastrophe was contained may result in overconfidence, rather than additional precaution.

When subject to the Fallacy of the UNpossible, people sometimes confuse the worst historical case with the worst possible case, and thus fail to prepare for the unprecedented. This type of thinking places emphasis on ensuring that the UNpossible event could never take place — i.e., on reducing the probability of the already unlikely. But recognition of the fallacy demands anticipation of the event rather than forecasting it — as in, “What do we do if… ?” rather than “How can we be sure that… ?” This reframing avoids the Fallacy by applying resources to minimizing the consequences of the event, rather than the probability.
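The arithmetic behind this reframing can be sketched with a toy calculation (all numbers below are hypothetical, chosen only for illustration): if the true probability of an unprecedented event is larger than planners assume, spending to shrink the assumed probability barely moves the expected loss, because the mitigation only addresses the failure modes that were foreseen. Spending to shrink the consequence pays off regardless of how wrong the estimate was.

```python
# Toy model (hypothetical numbers): compare two mitigation strategies for a
# rare event whose probability planners have underestimated.

assumed_p = 1e-4             # annual probability planners believe
true_p = 1e-2                # the actual, unprecedented annual probability
consequence = 1_000_000_000  # damage in dollars if the event occurs

# Strategy A: eliminate 90% of the *assumed* risk (e.g., reinforce against
# the foreseen failure mode). The unforeseen portion of true_p remains.
loss_a = (true_p - 0.9 * assumed_p) * consequence

# Strategy B: cut the *consequence* in half (e.g., evacuation plans,
# backup power, stockpiles). This works whatever the true probability is.
loss_b = true_p * consequence * 0.5

print(f"expected annual loss, probability-focused: ${loss_a:,.0f}")
print(f"expected annual loss, consequence-focused: ${loss_b:,.0f}")
```

In this sketch the consequence-focused strategy halves the expected loss, while the probability-focused strategy barely dents it, precisely because the planners' probability estimate was wrong in the first place.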

As climate, technology, and social systems evolve, we have every reason to believe that the future will be, in some characteristic and important ways, a significant departure from the past. Thus, extrapolation from historical datasets is likely to be less reliable than ever. Wherever we encounter designers, operators, policy-makers, or others arguing that historical experience has given them confidence in bounding the future, we must recognize the Fallacy of the UNpossible. Here, we use the ridiculous prefix “UN-” rather than “im-” to remind us that tragi-comedy only exists in cartoon worlds. Our real experiences will not submit to the fantastic failures of our imagination.

Additional reading

You might want to join our LinkedIn group because you have an interest in contrarian, controversial, and useful knowledge https://www.linkedin.com/groups/13613731/
