Wednesday, January 20, 2010

Chasing More Information Alone is a Strategy for Failure

Consider the following recent events, which highlight the fact that relying on information alone to make informed decisions is an incomplete strategy that can lead to disaster:
  1. Officers supervising a U.S. Army major were well aware of extreme behavior and statements that should have raised red flags about his fitness for duty. Unfortunately, those concerns were not acted upon, and the Army psychiatrist used his security clearance to enter Fort Hood, where he murdered 13 soldiers and attempted to murder some 30 more on November 5, 2009.
  2. United States government agencies had intelligence from a family member, a wiretap, and other warning signs that a Nigerian passenger on a flight from Amsterdam to Detroit posed a risk. They planned to question him when he arrived on December 25, 2009 (Christmas Day). Unfortunately, the passenger was able to smuggle explosives on board and attempted to ignite them in flight to murder the nearly 300 people on the plane and possibly others on the ground. Only an attentive and courageous fellow passenger prevented him from carrying out his plan.
  3. The CIA knew that it was meeting with a Jordanian agent whose history included ties to al Qaida and extremist views. Unfortunately, the agent was not searched before he was driven onto a forward operating base in Afghanistan, where he blew himself up on December 30, 2009, murdering 7 CIA officers and a Jordanian military officer.
  4. Haiti’s government was warned by scientists in 2008 that signs of growing stress in a tectonic fault indicated the country was at risk of a 7.2-magnitude earthquake. Unfortunately, none of the recommendations for strengthening hospitals and other buildings were implemented before a 7.0-magnitude earthquake struck the country on January 12, 2010, with catastrophic results.
What these events have in common is that key decision makers in positions of authority relied on partial information and failed to act to prevent disaster.
  • In the case of the Fort Hood shooter, base security wasn’t aware of the risk of allowing him on the base because his supervising officers hadn’t shared what they knew and feared.
  • In the case of the “Christmas Day bomber,” he made his way through security because the authorities didn’t consider the information they had to be proof of an imminent threat, and they believed that standard airport security could stop any threat to the airplane and its passengers.
  • In the case of the double agent, he was driven onto the base without being searched because CIA and Jordanian intelligence believed he was reliable based on the help he had provided in the past.
  • In the case of Haiti, the earthquake wreaked havoc on public buildings unable to withstand a major quake because the government had no proof of when a major earthquake would occur and didn’t make it a priority to find the time and resources to reinforce key buildings, in part because other economic and political crises seemed more pressing at the time.

Consider the following insightful resources that explore the reasons behind faulty thinking and analysis and offer solutions:
  • In the upcoming book “Why Intelligence Fails: Lessons from the Iranian Revolution and the Iraq War,” Professor Robert Jervis describes four ways in which faulty thinking occurs. First, our minds are quick to see patterns and meaning and then tend to ignore information that might disprove them. Second, we tend to overlook the fact that a key piece of information is missing and focus on what we can see. Third, our conclusions often rest on assumptions that are never subjected to testing. Fourth, we tend to assume that others see things the same way we do. The solution? Instead of gathering more information, organizations should engage people who can think more broadly and imaginatively, provide different perspectives, and challenge established views.
  • In the 1/14/10 BusinessWeek article “Innovation's Accidental Enemies,” Roger L. Martin and Jennifer Riel of the Rotman School of Management at the University of Toronto write that innovation is often killed with the two deadliest words in business: prove it. Too often, leaders reject a new idea until they are convinced by deductive or inductive reasoning that it will succeed. The solution? Instead of waiting for proof, organizations should use abductive logic, reasoning about what could be, to make a logical leap to the best possible conclusion.
  • In the book “The Strategy Paradox: Why Committing to Success Leads to Failure (and What to Do About It),” Deloitte Research Distinguished Fellow Michael E. Raynor writes that organizations that experience strategic success and those that fail are alike in that both actively make strategic choices that are perfectly reasonable at the time they are made; luck often determines the outcome. The solution? Instead of striving for ever-greater accuracy in predictions, which at some point becomes an exercise in futility given our limited ability to foresee the future, organizations should manage risk in uncertain environments by understanding the degree of uncertainty they face, practicing strategic flexibility, and creating options that can be pursued or dropped as conditions change.

The bottom line is that organizations looking to increase their chances of success in developing and implementing strategic plans would benefit from the following steps: First, gather as much relevant information as possible. Second, create a process for open, innovative, imaginative, and ongoing analysis that examines issues from many different directions. Third, recognize that information will nearly always be incomplete and that change is constant; build in options for acting on the latest facts on the ground, or in the air, as the case may be.

Thoughts?
