Book Review: The Improbability Principle
By Matthew F. Baretich, P.E., Ph.D.
The other day I heard about a book called The Improbability Principle: Why Coincidences, Miracles, and Rare Events Happen Every Day, by David J. Hand (Scientific American / Farrar, Straus and Giroux, 2014). Over the years I have learned a fair amount about statistics and have used statistical methods in my management research and engineering practice. I wouldn’t say I have always had fun with statistics, but I certainly had fun reading this book.
A few sections of the book would be hard slogging for those without basic statistical knowledge, but the author provides helpful examples and warns readers when they might want to skip ahead a few paragraphs. Some of the tough material has been moved to appendixes for the intrepid.
The basic principle is that if there are enough opportunities for something improbable to occur, it will probably happen sooner or later. Nothing earth-shaking about that. It’s extremely unlikely that I will win the lottery—even if I buy a ticket, which I won’t—but it’s almost certain that someone will win. It’s what we commonly call the Law of Large Numbers.
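The lottery intuition above boils down to simple arithmetic: an event that is nearly impossible for any one person becomes nearly certain given enough opportunities. A minimal sketch, using assumed odds for illustration (the figures are not from the book):

```python
# Sketch of the Law of Large Numbers idea: with enough independent
# chances, an improbable event becomes nearly certain.
# The odds used here are illustrative assumptions, not data from the book.

def prob_at_least_one_win(p_single, n_trials):
    """Probability that at least one of n independent trials succeeds."""
    return 1 - (1 - p_single) ** n_trials

p = 1 / 14_000_000  # assumed odds of a single ticket winning a 6/49-style draw

# One ticket: vanishingly small chance.
print(prob_at_least_one_win(p, 1))

# Fifty million tickets sold: someone winning is close to a sure thing.
print(prob_at_least_one_win(p, 50_000_000))
```

For one ticket the result is effectively zero, while across fifty million tickets the probability that somebody wins climbs above 95 percent, which is exactly the "it's almost certain that someone will win" point.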
But, of course, it’s not quite that simple. There’s more than one way to define probability, including the frequentist or aleatory interpretation (aleatory, I learned, refers to the rolling of dice) and the subjective or epistemological interpretation (what we believe about the likelihood of a future event). Our commonsense ways of thinking and talking about probability turn out to be murkier than they seem when we look closely. We need to be clear about definitions and meaning.
There’s a companion to the Law of Large Numbers—the Law of Truly Large Numbers—by which nearly impossible things become nearly certain. There are the Law of Selection and the Law of the Probability Lever, which show how things we might imagine as unlikely can be, in fact, relatively commonplace. And there are principles of psychology, like the Law of Near Enough, which leads us to vastly overextend our sense of what counts as a meaningful coincidence.
As with other branches of mathematics, and as with the scientific method itself, one of the great benefits of statistics is to protect us from our biases. The physicist Richard Feynman said, “The first principle is that you must not fool yourself – and you are the easiest person to fool.” Accuracy and precision count. Sound principles and careful reasoning count. Knowledge counts. As the philosopher and writer George Santayana put it, “Wisdom comes by disillusionment.” One way to interpret that pithy quote is that ridding ourselves of mistaken beliefs makes us wiser.
What’s the take-away? As professional engineers and managers working to improve patient care, we need to make sound judgments and decisions. In many cases, perhaps the majority of them, the data we have to work with are incomplete. Sometimes we have no choice but to make quick decisions based on experience and judgment. In his book Thinking, Fast and Slow (Farrar, Straus and Giroux, 2011), Daniel Kahneman refers to that process and its neurological underpinnings as Thinking Fast. It’s a good way to handle many of the dilemmas we confront on a daily basis.
But there are other times during which we probably should be Thinking Slow, using available data and then reasoning our way to sound decisions. It seems to me that too often, under the pressure of overwhelming demands and unforgiving time constraints, we think fast when we ought to think slow. And I believe that, more often than we tend to imagine, we actually have access to enough data to support reasoned and thoughtful decision-making. Moreover, there are many thorny issues that are so important yet so fraught with uncertainty that it’s worth taking the time to collect more data and to increase our stockpile of information. In that direction lies better judgment.
© 2016 Baretich Engineering, Inc. This article was first published as a blog post at baretich.com.