Guest Post: A knowledge management system capable of blinking red
Inattention to critical knowledge is an old problem. Lessons are forgotten, near misses are ignored, caution is dismissed, disasters result. Titanic. Bhopal. AIG. Katrina. Fukushima. And on and on.
Knowledge Management (KM) is supposed to make the right information available to the right people, at the right time, in the right form, and with the best certainty possible, so that the most appropriate decisions can be made when and where they are needed. KM should also direct the attention of decision makers to critical information and help them make sense of it. The bigger the stakes, the more situational awareness and mindfulness are needed.
“The 9/11 Commission’s gripping book, the Columbia Accident Investigation Board’s thorough report…and the New York Times’…account of the scandal involving the fabrications by reporter Jayson Blair are windows on…organizations’ vulnerabilities….[A]ll of these tragedies are striking reminders that while individuals can be quite adept at picking up on hints of failure in the making, organizations typically fail to process and act on their warnings.
The FBI field agent warning about terrorists in flight schools; the engineers requesting better photos of the space shuttle’s wing after it was struck by debris; the department editor [warning] that Blair shouldn’t be writing for the paper—all these individuals were sending signals of impending disaster. ‘The biggest screaming headline is that all the knowledge needed was already inside,’ says Jeffrey Sonnenfeld [of] Yale School of Management. Or as George Tenet, the former director of central intelligence, told the 9/11 Commission, ‘The system was blinking red.’”
—Jena McGregor, Gospels of Failure
Improvement is elusive. Issues of “data ownership” and “organizational stovepiping” still often dominate the discussion rather than critical knowledge needs (e.g., “Who is your company’s Chief Decision Officer?”).
“Errors in process or product design are often ignored….The more times small failures occur without disaster, the more complacent managers become.”
—C. Tinsley, R. Dillon, and P. Madsen, How to Avoid Catastrophe
“Even if your organization is already using scenario planning, leaders tend to focus on the probable rather than the disruptive” (McGregor).
“Trying to imagine future scenarios—without the right framework or expertise—can [be] bewildering. A more manageable approach,” says Karl Weick (University of Michigan organizational theorist and author of Managing the Unexpected), “is to think backward from a potential outcome, which will surface the events that could create it.”
What information is most important to manage? Answer: that which informs about existential risk and helps to identify the best actions among alternatives, including in wild circumstances.
KM should curate, make relevant, and draw attention to timely information potentially needed for critical actions.
That requires working backwards through a causal chain, providing a culture open to information sharing, and presenting the information in a way that’s easily understood. Decision Intelligence is the discipline that provides this solution. As such, it offers a robust platform for KM and for a system capable of blinking red.
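Working backward through a causal chain can be made concrete with a small sketch. The example below is purely illustrative, not a Decision Intelligence implementation: it assumes a hypothetical causal map (outcome to direct causes, with invented event names echoing the Titanic example) and walks it in reverse to surface every precursor event that could feed a given failure.

```python
# Hypothetical causal links: each outcome maps to its direct causes.
# All event names are invented for illustration.
CAUSES = {
    "hull breach": ["iceberg collision", "weak rivets"],
    "iceberg collision": ["excess speed", "missed lookout warning"],
    "missed lookout warning": ["no binoculars", "radio traffic backlog"],
}

def precursors(outcome, links):
    """Walk backward from a potential outcome, collecting every
    upstream event that could contribute to it."""
    seen = []
    stack = [outcome]
    while stack:
        event = stack.pop()
        for cause in links.get(event, []):
            if cause not in seen:
                seen.append(cause)
                stack.append(cause)
    return seen

# Every event on this list is a signal worth monitoring: a place
# where the system could start "blinking red" before disaster.
print(precursors("hull breach", CAUSES))
```

Thinking backward in this way, as Weick suggests, turns a vague worry (“hull breach”) into a concrete watch list of earlier, observable signals.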