Analytics and Forecasting

Signal to noise

10 Nov 2009  

Why doesn’t early warning work?  While it’s a good idea in theory, in practice it seldom seems to have its intended effect.  In every major intelligence failure I’ve looked at, there were clear, credible early signals — and even explicit warnings — that tragically remained unheeded.  Why is this, and what can we do about it?

For example, in the recent meltdown of the US real estate market, and much of the world economy with it, there were lots of warning signs.  Some of these were very explicit and very public.  To name a couple:

  • The FBI’s publicly released 2006 report warning of widespread fraud in the US residential mortgage market, whose loans backed the mortgage-backed securities that subsequently collapsed
  • Yale economist Robert Shiller’s testimony before Congress in September 2007 that housing prices were dangerously overinflated, and that their imminent collapse would cause significant damage to the economy.
Downtown New York City - September 11, 2001, 9:45am

In another example, leading up to the attacks on the World Trade Center towers in New York City in September 2001, there were many events that could have been read as “feasibility tests” for 9/11.  There is a chapter (“Foresight — and Hindsight”) in the 9/11 Commission Report that catalogs the missed signals and other structural conditions that, had they been addressed, might have prevented the attacks.  In retrospect, there seems to have been a straight-line connection between:

  • The February 1993 truck bombing of the WTC North Tower
  • The August 1998 bombings of the US embassies in Nairobi, Kenya, and Dar es Salaam, Tanzania
  • The October 2000 bombing of the USS Cole.

Our persistent inability to read and act upon clear signals in time is not a recent development.  In the most-examined intelligence lapse in modern history, the December 1941 bombing of the US fleet at Pearl Harbor, Congress demanded to know what had happened.  Hearings were held, and enough testimony produced to fill nineteen volumes.  I recently read the monumental 1962 one-volume condensation, Pearl Harbor: Warning and Decision by Roberta Wohlstetter.

Wohlstetter, a historian then at the RAND Corporation, devotes most of the book to exploring in detail the many warning signals that preceded the event.  A great many of these clues were made possible by the fact that the US had previously broken the Japanese diplomatic codes. They included:

  • The November 1941 breakdown in negotiations regarding the oil embargo of Japan that the US had initiated four months earlier
  • An abrupt and massive change in Japanese codes and call signals in the early days of December 1941, including the required burning of all confidential and secret documents in Japanese embassies around the world
  • Unusual movements of Japanese submarines near Pearl Harbor in the days and weeks before the attack

She even reports that as early as 1936, war games and drills had been conducted that included a surprise air attack on Pearl Harbor as a possible scenario.  But following these drills, no planning for such an eventuality was done.  And other signals were observed, documented, reported — then ignored.

Why are significant early warning signals typically ignored until it’s too late?

Wohlstetter’s elegant summary: “We are constantly confronted by the paradox of pessimistic realism of phrase coupled with loose optimism in practice.” (My emphases.)  In other words: too often we talk the talk of early warning, but fail to walk the walk.  This seems to be the prevailing pattern, not some random exception.

Here’s a simple explanation of one of the main reasons.  Events by definition unfold in a forward direction through time.  Looking forward, many events typically compete for our attention and overlap as potential early warnings.  In contrast, as we look backwards at them after the fact, we have the analytic luxury of highlighting those that fit our picture of what we now know actually happened — and “discarding” the rest as outliers.

This is a form of what psychologists call confirmation bias: we tend to pick out the facts that support our narrative, rather than building our narrative on the available facts, as logic would suggest we should.  Hindsight, as a consequence, always looks perfectly accurate.

Confirmation bias works going forward too, and in effect filters out signals that don’t fit our pre-existing view of things.  In each of these three cases, our narrative supported and reinforced current expectations of what was possible.  It was widely “known” that the US economy could not melt down, that Islamic fanatics could not attack New York, and that Japan could never attack the US.  Our processing of any signals contradictory to our belief system was blocked by the assumed impossibility of the event.  As in Nassim Taleb’s metaphor, all swans are white until you’ve seen a black one.

As Wohlstetter puts it, “We failed to anticipate Pearl Harbor not for want of the relevant materials, but because of a plethora of irrelevant ones” — that is, of competing and/or contradictory signals.  She ironically concludes that, “If it does nothing else, an understanding of the noise present in any signal system will teach us humility and respect for the job of the information analyst” — a job that she observes was held in relatively low regard in the pre-Pearl Harbor military.

Paradoxes in organizational intelligence often rest on basic principles of human psychology.  These “facts of human nature” often have parallels in other areas of behavior.  One of my extra-curricular activities is audio engineering.  Because this field involves real-world signals, and how they affect human perceptions and actions, I sometimes find concepts there that are useful by analogy in organizational intelligence.  A key metric of audio quality is signal-to-noise ratio.  That is, in any given piece of electrical information, what is useful (“signal”), what is not (“noise”), and what is the ratio between the two?

In audio, the signal is what we’re trying to listen to — music, for example.  The noise includes hiss, buzz, hum, crackle, pop — things we’d prefer not to listen to.  While we can never completely eliminate the noise, our goal is to maximize the ratio between the signal and the noise.  The human mind completes the filtering process.
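
To make that ratio concrete, here is a minimal sketch in Python (using NumPy; the 440 Hz tone, the noise level, and the sample rate are arbitrary illustrative values) of how signal-to-noise ratio is typically computed: the power of the signal divided by the power of the noise, expressed in decibels.

    # Minimal sketch: signal-to-noise ratio (SNR) as used in audio engineering.
    # The "signal" is a pure tone; the "noise" is random hiss added on top.
    import numpy as np

    rng = np.random.default_rng(0)
    t = np.linspace(0, 1, 48_000, endpoint=False)  # one second sampled at 48 kHz
    signal = np.sin(2 * np.pi * 440 * t)           # a 440 Hz tone -- the "music"
    noise = 0.05 * rng.standard_normal(t.size)     # low-level hiss

    snr_db = 10 * np.log10(np.mean(signal ** 2) / np.mean(noise ** 2))
    print(f"SNR = {snr_db:.1f} dB")                # about 23 dB at these levels

The noise power is never zero; the engineering consists in raising the ratio.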

In analyzing events and making forecasts, we must overcome similar challenges.  Data comes at us continually, and at ever-greater velocity.  Somewhere within that “firehose” lie narrative threads of great relevance to us — things that could help us greatly if we focused on them.  The skill lies in being able to pick out the relevant signals, while ignoring the noise — especially when the noise constitutes most of what we take in, and the signal is something that doesn’t fit our prior expectations.

To do this successfully, you need to:

  • treat analysis and forecasting not as one-time activities, but as parts of a structured, consistent process
  • make sure this process is focused on the things that matter most to your desired outcome
  • make sure that the results carry through to decisions and actions.

There are signals ‘out there’ today, right now, that six months from now each of us will regret not having attended to and acted upon.  Do you know what these are?  Do you know how to find and monitor them?


4 Responses

  1. Excellent article, as usual.

    My one comment is that maybe (22% likely?) your opening question (“Why doesn’t early warning work?”) doesn’t frame the issue correctly. It would be just as incorrect to say, “Why do early warning systems always work?”

    Maybe a better way to frame the issue is by making the following observation:

    Scenario planning involves creating two or more possible future scenarios, with each scenario assigned an estimated probability.

    Leaving aside how to best structure scenarios (like using unknown variables for each of Porter’s five forces) – the reality is, only ONE scenario can take place.

    If you “pick a card,” the probability before you pick might be 1 out of 13 of selecting an ace, but after the card is drawn, the outcome is 100% known. If you count cards and see that after 46 cards no ace has appeared, your early warning system has a 67% chance of “knowing” an ace will be next (because there are 4 aces left among the 6 remaining cards in a deck of 52 cards). Even with perfect “pre-knowledge” in a closed system with flawless card counting, the early warning system, for this single event, only has a 67% chance of betting correctly.
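
    A quick sketch of those two numbers (Python; purely illustrative):

        # The card example above, worked explicitly.
        from fractions import Fraction

        p_first_draw = Fraction(4, 52)       # any of 4 aces among 52 cards = 1/13
        p_after_46_no_ace = Fraction(4, 6)   # 46 non-aces gone: 4 aces in 6 cards

        print(float(p_first_draw))           # about 0.077
        print(float(p_after_46_no_ace))      # about 0.667 -- the 67% above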

    Today, you have “all of the signals/warnings” that China and India are booming and their populations are so large that we are heading for global prosperity on a scale never before seen. (For the U.S., which has so dominated the global economy since the 1950s, the rising tide will not feel as dramatic.)

    Today, you also have “all of the signals/warnings” that our economy and the global economy are going through the same cycle as during the Depression. After the stock market crashed back then, stocks bounced back 50% before the long downhill slide and the economic fallout that lasted over ten years.

    Bottom line: there are early warning signals for each scenario…. at the end of the day, only one scenario will take place.

    (An early warning system that studied my behavior might have calculated that there was only a 7% chance that I would read your blog today – but look what happened!)

  2. Tim Powell says:

    Thanks, Alan, as always for your thoughtful comments.

    What you’re referring to is what the quants — and at heart I am one of them — call Bayesian probability. This is, I agree, important in the real world because we almost always have partial information. In that sense, life these days is more like poker than chess, which a decade or so ago was used as a metaphor for strategic planning. With each round, we gain more information, and place our bets (i.e. “allocate resources”) accordingly. But we NEVER have complete information — until after the fact.

    What I’m proposing is that we need to take in and process information on a real-time basis, not once a year as in the “strategic planning” model. Then adjust what we are doing based on Bayesian probabilities.
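
    To illustrate what that round-by-round updating might look like, here is a minimal sketch (the prior and the signal likelihoods below are made-up illustrative numbers):

        # Minimal Bayesian update: how successive confirming signals shift a belief.
        def update(prior, p_signal_if_true, p_signal_if_false):
            """Return P(scenario | signal) via Bayes' rule."""
            evidence = prior * p_signal_if_true + (1 - prior) * p_signal_if_false
            return prior * p_signal_if_true / evidence

        belief = 0.05                 # initial belief that the scenario is unfolding
        for _ in range(3):            # three consecutive confirming signals arrive
            belief = update(belief, p_signal_if_true=0.8, p_signal_if_false=0.2)
            print(round(belief, 3))   # roughly 0.174, then 0.457, then 0.771

    Each new signal shifts the bet; none of them, alone or together, makes the outcome certain.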

  3. I’m 100% in agreement with your proposal. If I’m allowed to “second it” – I do, because I’m all in favor of strategic planning in real time (which is how we designed our web service at http://www.eCompetitors.com – to provide corporate planning and competitive intelligence information “on demand”).

    If you want to give a presentation on your proposal and/or anything else (knowledge value chain concepts, etc.) to our LinkedIn group “Corporate Planning & Global Industry Segmentation,” which holds monthly webinars for and by members, we will roll out the virtual red carpet for you. We have over 3,500 members, and our next open slot for a speaker is in May.

  4. […] A fascinating post on a blog I have just discovered; the author (Tim Powell) reviews Roberta Wohlstetter’s book, Pearl Harbor: Warning and Decision. Wohlstetter examines the following question: how is it that we ignore (or misread) comparatively clear signals of what is about to happen? She cites the example of the attack on Pearl Harbor; Powell cites those of the subprime crisis, the September 11 attacks, etc. […]

