Wednesday, June 17, 2009

Causal Bias

As humans we naturally look for a cause for every event, even when none exists. A gambler will attribute a winning streak to "being hot" or "beginner's luck" when in reality randomness alone prevails.
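A quick simulation makes the point. This is a minimal sketch (the numbers are made up for illustration): 1,000 gamblers each make 100 fair even-money bets, and we count how often a long "hot streak" appears purely by chance.

```python
import random

def longest_streak(flips):
    """Length of the longest run of identical outcomes in a sequence."""
    best = run = 1
    for prev, cur in zip(flips, flips[1:]):
        run = run + 1 if cur == prev else 1
        best = max(best, run)
    return best

random.seed(42)
# 1000 gamblers, 100 fair coin-flip bets each (hypothetical numbers).
streaks = [longest_streak([random.random() < 0.5 for _ in range(100)])
           for _ in range(1000)]

# Fraction of gamblers who saw a streak of 6 or more wins/losses in a row:
print(sum(s >= 6 for s in streaks) / len(streaks))
```

Most of these simulated gamblers experience a streak of five or six in a row, with no skill or "hotness" involved at all.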

Be aware of this tendency in problem solving as well. We naturally want to find a concrete reason for every event. Take the rudimentary root cause analysis here. Our first instinct is to look for the cause of a rupture disk failure in an overpressure event in the reactor. After all, isn't that what the rupture disk is there for? This instinct can quickly lead down the path of investigating the reaction conditions and potentially committing to a costly solution.

Another explanation is simply that the disk failed randomly. (Insert Link to Drunkard's Walk) Perhaps it was installed incorrectly? The installation can be considered random, particularly if it was done by someone who doesn't routinely perform the task. Maybe the disk was an outlier (not really a random problem) and failed at a pressure well below its design pressure?
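Random failure is more plausible than it feels. Here's a back-of-the-envelope sketch with entirely hypothetical numbers: if a plant has 50 rupture disks in service and each has a small constant random failure rate (exponentially distributed lifetimes, no aging), the chance that *some* disk fails for no assignable reason within a couple of years is surprisingly high.

```python
import math

# Hypothetical numbers for illustration only:
rate = 1 / 20.0   # assumed random failure rate: 1 per 20 disk-years
disks = 50        # assumed number of disks in service
horizon = 2.0     # observation window, years

# Probability one given disk fails by chance within the window:
p_one = 1 - math.exp(-rate * horizon)
# Probability at least one of the 50 disks does:
p_any = 1 - (1 - p_one) ** disks

print(f"P(any random failure in {horizon:g} yr) = {p_any:.3f}")
# -> P(any random failure in 2 yr) = 0.993
```

Under those assumed numbers a "mystery" disk failure every couple of years is nearly certain, even when the process itself never misbehaves. The point is not the specific values but that the base rate of random failures belongs in the analysis before a costly investigation of reaction conditions begins.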

What evidence do you have to support either route? Is it simple evidence (like pressure readings from a data-logging transducer) or the opinion of an expert who has had bad experiences with runaway reactions in the past? The strength of the Apollo root cause approach is that it helps you explore many different pathways, but most critically you need evidence to support whichever pathway you go down.

Perhaps you have evidence for both. In a future post I'll explore ways to weigh evidence in order to choose the proper route to explore.

Wednesday, June 3, 2009

Beware of a Story

Real life is complicated and messy. If you encounter a root cause analysis where the data fits the solution too neatly, be careful. It is rare that all evidence fits a theory; if it does, there's a chance that something was left out to make it fit, or that the data was handled incorrectly. The omitted material is known as 'silent evidence' since it is never presented or available. This is another form of confirmation bias, the tendency to focus only on data that fits our mind-set or theory.

For an example of this, see the "story" presented by a correlation plot of time spent eating versus Body Mass Index (BMI) for a number of countries on the NY Times website. Read the comments and you can see the many flaws in the story told by the x-y scatterplot. The plot weaves a nice story supporting the author's prejudice about fast food, but the data neither supports that story nor refutes it.
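Part of the problem with such plots is the small sample: with only a couple dozen countries, two completely unrelated quantities will often show a respectable-looking correlation by chance. The sketch below (using made-up, independent random "traits" for 18 hypothetical countries) estimates how often the correlation coefficient exceeds 0.4 when there is, by construction, no relationship at all.

```python
import random
import statistics

def pearson_r(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    sx, sy = statistics.pstdev(xs), statistics.pstdev(ys)
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (len(xs) * sx * sy)

random.seed(1)
n = 18          # hypothetical number of countries on the scatterplot
trials = 2000   # repeated experiments with independent random data

big = sum(abs(pearson_r([random.gauss(0, 1) for _ in range(n)],
                        [random.gauss(0, 1) for _ in range(n)])) > 0.4
          for _ in range(trials))

# Fraction of trials where two unrelated variables "correlate" at |r| > 0.4:
print(big / trials)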

A strength of the Apollo root cause analysis approach is that it forces you to consider the many causes that contribute to an event. The trap we fall into, however, is focusing on a single cause-effect path and neglecting the others. That may make a nice story, but it won't necessarily reflect reality.