The Freakish World of Root Cause Analysis
Daragh O Brien
On a recent project I had to guide a stakeholder through a root cause analysis activity to determine the real root causes of an information quality problem.
My stakeholder had a number of assumptions that had been paraded as fact for some time, and as a result everyone believed that tackling these “known” issues would deliver an antidote to the business problem at the focus of the project. So strong was the stakeholder's confidence in these assumptions that he believed we could dispense with scoping, with defining metrics for the quality of the process, and with identifying the various scenarios that existed in the raw data. All we needed to do was prove that native use of a legacy system, or a bug in a call-centre system, was the problem and get 'The Business' to sort it out.
Unfortunately, the data simply did not support that analysis. Our own preliminary study, based on business rules for the flow of data through the various stages of the process, suggested that the actual root cause lay elsewhere, with native use of systems and other causes being lesser contributors. Because these early findings flew in the face of the conventional wisdom, they were a hard sell. Our analysis must be wrong. Recheck the data. Shoot the messenger.
“Conventional wisdom” is a phrase coined by the economist J. K. Galbraith to describe ideas that are generally accepted as true by society. He did not intend it as a compliment. Indeed, many urban legends are accepted as true on the basis of conventional wisdom. Statements that are constantly repeated become conventional wisdom regardless of their truth or accuracy, because “audiences of all kinds most applaud what they like best.”