Wednesday, June 17, 2009

Causal Bias




As humans, we naturally look for a cause for every event, even when none exists. A gambler will attribute a winning streak to "being hot" or "beginner's luck" when in reality randomness alone prevails.
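A quick simulation makes the point concrete (the numbers here are illustrative, not from any real casino): even with a perfectly fair 50/50 bet, long winning streaks show up all the time by chance alone.

```python
import random

def longest_streak(flips):
    """Length of the longest run of consecutive wins (True) in a sequence."""
    best = run = 0
    for win in flips:
        run = run + 1 if win else 0
        best = max(best, run)
    return best

random.seed(1)
# Simulate 1000 gamblers, each making 100 fair 50/50 bets.
streaks = [longest_streak([random.random() < 0.5 for _ in range(100)])
           for _ in range(1000)]

# How many purely random gamblers look "hot" (5 or more wins in a row)?
hot = sum(1 for s in streaks if s >= 5)
print(f"{hot} of 1000 gamblers had a winning streak of 5+ by chance alone")
```

Most of the simulated gamblers hit a streak of five or more wins, yet none of them was "hot" in any meaningful sense.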



You should be aware of this tendency in problem solving as well. We naturally want to find a concrete reason for every event. Take the rudimentary root cause analysis here. Our first instinct is to attribute a rupture disk failure to overpressure in the reactor. After all, isn't that what the rupture disk is there for? This can quickly lead down the path of investigating the reaction conditions and potentially committing to a costly solution.

Another explanation is simply that the disk failed randomly. (Insert Link to Drunkard's Walk) Perhaps it was installed incorrectly? The installation can be considered random, particularly if it was done by someone who doesn't routinely perform the task. Maybe the disk was an outlier (not really a random problem) and burst at a pressure well below its design pressure?
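A little sketch shows why the "low outlier" explanation deserves a look before anyone redesigns the reactor. All the numbers below are hypothetical; the point is only that when actual burst pressures scatter around the rating, the low tail of that scatter can dip below normal operating pressure, so a disk can burst with no overpressure event at all.

```python
import random

random.seed(2)

DESIGN_BURST = 100.0   # rated burst pressure, psig (hypothetical)
OPERATING    = 60.0    # normal operating pressure, psig (hypothetical)

# Model actual burst pressures as scattered around the rating. A wide
# scatter (manufacturing variation, a botched installation) puts some
# disks' burst points below operating pressure.
bursts = [random.gauss(DESIGN_BURST, 15.0) for _ in range(10_000)]
premature = sum(1 for b in bursts if b < OPERATING)
print(f"{premature} of 10000 disks burst below operating pressure")
```

Even a small fraction of premature failures matters: if you replace disks often enough, you will eventually see one fail "for no reason," and the cause is the distribution, not the process.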

What evidence do you have to support either route? Is it simple evidence (like pressure readings from a data-logging transducer) or the opinion of an expert who has had bad experiences with runaway reactions in the past? The strength of the Apollo root cause approach is that it helps you explore many different pathways, but most critically you need evidence to support whatever pathway you go down.

Perhaps you have evidence for both. In the future I'll explore ways to weigh evidence in order to choose the proper route to explore.

Wednesday, June 3, 2009

Beware of a Story

Real life is complicated and messy. If you encounter a root cause analysis where the data fits the solution too neatly, be careful. It is rare that all evidence fits a theory; if it does, there's a chance that something was left out to make it fit, or that the data was handled incorrectly. The missing material is known as 'silent evidence' since it is not presented or available. This is another form of confirmation bias, the situation where we focus only on data that fits our mind-set or theory.

For an example of this, see the "story" presented by a correlation plot of time spent eating versus Body Mass Index (BMI) for a number of countries on the NY Times website. Read the comments and you can see the many flaws in the story the x-y scatterplot tells. The plot weaves a nice story supporting the author's prejudice about fast food, but the data does not support that story (nor does it refute it).

A strength of the Apollo root cause analysis approach is that it forces you to consider the many causes that contribute. However, the trap we fall into is focusing on only one cause-effect path and neglecting the others. This may make a nice story but won't necessarily reflect reality.

Saturday, May 23, 2009

We're all Creative

In Moving To Higher Ground, Wynton Marsalis writes that creativity is not the province of a small, specialized group of people but that everyone is creative. He gets kids to blast away on instruments, doing whatever, and points out that they are creating.

He goes on to say "I told you it was easy [to be creative].  It's only hard if you want to sound good."
We don't lack creativity.  The trick is bringing that creativity to fruition.

Saturday, May 16, 2009

Relationship of Art & Science

Too often we focus only on our strengths. Scientists and engineers have strong analytical skills and cultivate those throughout their education. Others have artistic skills and cultivate those. Frequently this is out of necessity, because we don't have time for both. Mae Jemison, an astronaut, doctor, and dancer (not necessarily in that order), talks about the relationship between art and science in a TED talk.

Problem solving requires creativity and as I've mentioned before, developing our creativity requires developing all parts of our mind. Try doing something new. Even if you aren't successful (or even moderately good), it will stimulate your mind. Bust out the crayons and color with your kids, try photography, write a Haiku, who knows what. Just try something you wouldn't normally do today.

Thursday, May 7, 2009

Scientific Method and Problem Solving

In the scientific method, one proposes a hypothesis and then tries to find evidence disproving it. Failure to find refuting evidence doesn't prove the hypothesis true; it means the hypothesis survives, for now.

This approach is a more robust method of problem solving than suggesting a solution (hypothesis) and then doing experiments to support it.

As Heuer points out in Chapter 4 of his book, a hypothesis (solution) can never be proven by even a large body of evidence since many hypotheses may be consistent with the evidence, but a single instance of incompatible evidence is enough to sink a hypothesis.

When suggesting solutions, plan some experiments that seek to disprove them. If you are unable to cause your solution to fail deliberately, it will be that much stronger.
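A toy example (the rules here are made up, in the spirit of Wason's classic 2-4-6 task) shows why confirming tests can't separate competing hypotheses, while a test designed to fail can:

```python
# Two hypothetical "solutions" (rules) that both fit the evidence so far.
def rule_even(seq):
    """Hypothesis A: every number is even."""
    return all(n % 2 == 0 for n in seq)

def rule_increasing(seq):
    """Hypothesis B: the numbers strictly increase."""
    return all(a < b for a, b in zip(seq, seq[1:]))

evidence = [2, 4, 6]
# A confirming test: both hypotheses pass, so it tells us nothing new.
assert rule_even(evidence) and rule_increasing(evidence)

# A test chosen to *disprove* hypothesis A: odd numbers that still increase.
probe = [1, 3, 5]
print("even rule:", rule_even(probe))            # -> False: A is refuted
print("increasing rule:", rule_increasing(probe))  # -> True: B survives
```

Piling up sequences like [2, 4, 6] and [10, 20, 30] would have "supported" both rules forever; one deliberately disconfirming probe settled the question in a single test.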

Saturday, May 2, 2009

Psychology of Intelligence Analysis

I recently came across a book written for intelligence analysts in the CIA. After reading just the introduction and a couple of chapters, I can tell it will be the subject of several (many?) future posts. Although this book by Richards Heuer, Jr. is directed toward the analysis of what I call "soft" information, it is equally useful for scientists and engineers working with more concrete (or simple) evidence.

Psychology of Intelligence Analysis is available on-line. There is also a 2006 version available.

This book is about avoiding pitfalls when analyzing information, and it looks like it will address the topic from a higher level. In Chapter 2, Heuer already addresses the issues of simple evidence and confirmation bias. He points out that "We tend to perceive what we expect to perceive." It is easier to notice data that already fits our mind-set and resembles what we already know. This is a little different from perceiving what we want to perceive, which I think is easier to recognize and avoid.

Thursday, April 30, 2009

Confirmation Bias

We often look for data to support our proposed solution. "The Drunkard's Walk: How Randomness Rules Our Lives" mentions this in relation to our picking out only the random events that support our hypotheses. We will also plan experiments that support our existing hypothesis instead of looking for ways to disprove it.

The best solutions are those which can't be made to fail as opposed to those which need the stars aligned (or maybe that special operator) to work.

Psychologists also have a term for this, as I heard on the radio this morning: "magical thinking," or something like that. We tend to remember only what we want and not look at the data impartially.