When I studied physics in college, the professor would often start a discussion by saying, “Let’s assume for a moment there is no gravity and no friction.” What a cool world that would be! You could just push off, as if in outer space, and end up at your destination across town with no effort.
This is a total fantasy, of course — but it makes concepts in elementary mechanics much easier to understand. The “pure” forces at work are much easier to grasp clearly without the rigors of reality intervening.
Later, when I was in management school, my dear professor Art Swersey would frame problems in operations research in much the same way. “You start by making a call to God…” was Art’s irreverent way of positing the assumption that you could have perfect and comprehensive information, instantaneously and at no cost, about all the factors affecting the case.
That’s obviously true only in some idealized version of reality, where the “God assumption” allows you to define the information you will require without considering the costs incurred in acquiring it. In the real world, information is never perfect, is never comprehensive, is never instantaneous — and often incurs a substantial set of costs, both monetary and other.
Understanding information costs, and how they relate to the benefits provided, is the basis of knowledge economics — and a key element of my work with clients. “Value” is, in its simplest expression, the ratio of benefits to costs; V = B/C.
Sometimes this value ratio is incalculably high — when human lives are at stake, for example.
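As a minimal illustration of the ratio (the dollar figures below are hypothetical, chosen only to make the arithmetic concrete):

```python
def value_ratio(benefits: float, costs: float) -> float:
    """Return V = B/C, the simple value ratio described above."""
    if costs == 0:
        # The "incalculably high" case: benefits with no cost.
        raise ValueError("costs must be nonzero; value is unbounded")
    return benefits / costs

# Hypothetical example: information costing $2,000 to acquire
# that averts $50,000 in losses.
print(value_ratio(50_000, 2_000))  # -> 25.0
```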
Try this experiment — which I recommend as a thought experiment, rather than an actual one. Take a chain necklace and tug at both ends. The chain breaks, and you now have two partial chains. If you look carefully, it’s not the chain itself that has broken — it’s only one link that breaks, and the rest of the two shorter “chain-lets” are intact.
Proverbially speaking, the chain is only as strong as its weakest link. When that breaks, the chain itself is deemed broken — i.e., useless for the purpose for which it was intended. The health of the entire chain is judged — not by its strongest element, or even its average component — but by its weakest element.
Information is much the same. The chain that turns data into knowledge, and knowledge into decisions and actions, is a “serial” one: every element in the sequence must be working. If even one element is not, that element is the weakest link — and the entire chain fails to serve its intended value purpose.
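The serial-chain logic can be sketched numerically. In a serial system, the whole works only if every link works, so overall reliability is the product of the individual link reliabilities — which is why even modestly weak links degrade the chain sharply. (The five-link, 95%-per-link figures below are assumptions for illustration only.)

```python
import math

def chain_reliability(link_reliabilities):
    """Reliability of a serial chain: the product of its links'
    reliabilities, since every link must hold for the chain to hold."""
    return math.prod(link_reliabilities)

# Hypothetical: five links, each 95% reliable on its own.
links = [0.95] * 5
print(round(chain_reliability(links), 3))  # -> 0.774
```

Five individually strong links still yield a chain that fails roughly one time in four — the arithmetic behind the proverb.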
The “Valentine’s Day Massacre 2.0” in Parkland, Florida, is a tragic case illustration of this. The New York Times headline from February 17, 2018 says it all: “Warned About Suspect, F.B.I. Didn’t Act.” Neighbors and other observers had witnessed the eventual killer’s aberrant behavior and statements over an extended period leading up to the horrific event. At least two of them had called an F.B.I. hotline to report that he was both heavily armed and clearly deranged.
Thus, the foundation data was in the system. And the signals were in no way covert nor coded, including social media posts by the killer proclaiming his aspiration to become a “professional school shooter.”
However, these disparate calls and social media posts (i.e., data points) were never connected with each other (i.e., analyzed), nor were they passed along (i.e., communicated, in this case to the Miami F.B.I. field office) for further investigation and action. The appalling net result is that nothing was done to stop the killer — even though his intentions were known and clearly stated, by himself and others.
That one little link in the knowledge chain broke — the chain itself failed — and as a result seventeen lives were lost, and dozens of others were shattered in an instant.
Chain broken, lives broken, value destroyed. We must do better. The stakes are too high.
To avoid the sky-is-falling hyperbole that so many of us, myself included, so easily fall into these days, I note a similar case in which there was a more positive outcome — if you can call it that. The New York Times reports on April 5, 2018, that Julian Edelman, the New England Patriots footballer, was alerted to a threat to “shoot up a school” left by someone in the comments section of Edelman’s Instagram account. Edelman’s assistant called the Boston police, who made an emergency records request to determine the origin of the threat. They traced the sender back to Port Huron, Michigan, where local police arrested a 14-year-old boy who admitted to making the threat — and who had access to rifles belonging to his mother. The youth is in detention and has been charged with making a false report of a threat of terrorism, a felony punishable by up to four years in jail. In this case, people saw something, said something — and may have prevented another potential tragedy.