Showing posts with label Bias. Show all posts

Friday, November 5, 2010

Short Memory

"You can’t improve a design when you’re emotionally attached to past decisions. Improvements come from flexibility and openness." A quote from the 37signals blog.

This brings me back to diverse thinking and confirmation bias. The longer we work on a problem, the more focused we become. The upside is that we tune out distractions and unimportant details; the downside is that we become less open to new insights that might lead to a better solution.

How does one achieve a balance between focus and freshness? One way is to have several projects going at once (who doesn't?), but rather than flitting back and forth in a feeble attempt to multi-task, I think you should dedicate significant chunks of time and effort to one problem. Then switch to another without revisiting the first for some time. When you return to the initial problem, you can't help but have a fresh perspective, and you've also allowed some incubation to occur. The longer you've worked on a project, the harder it is to return with a fresh perspective. That's where the challenge is.

The time devoted to one project is a factor you can play with. If it is a project you're familiar with, you can stay away from it for a while. For a new, unfamiliar project, don't stay away too long, or you may end up spending too much time refreshing your memory.

Tuesday, June 1, 2010

Overcoming Mental Blocks

A quote from the Signal vs. Noise blog: "Sometimes a design isn’t working because you think you can’t change the one element that needs to be changed."

The same goes for problem solutions. Maybe you're not finding an effective solution because you are locked in on something as being essential when it isn't. Take a step back, attack the problem with a beginner's mind and maybe another solution will present itself.

I was once faced with an analysis problem in which I couldn't prevent the compound I was trying to analyze from decomposing in the equipment. After many iterations of trying to avoid decomposition, I finally realized that if I deliberately decomposed the compound in a known manner, the solution was easy.

Monday, May 10, 2010

Lazy Thinking

Prejudice in problem solving is caused by lazy thinking. When we are predisposed to a particular solution, we need to be careful not to simply assume the previous solution is the correct one this time as well. This is related to the "beginner's mind".

How do you strike a balance between reinventing the wheel every time a problem comes up and blindly re-applying the same solution? Again, it comes down to lazy thinking. Carefully considering a problem instead of jumping to conclusions doesn't cost much effort, and it helps you avoid overlooking new solutions or new wrinkles to an old problem.

Wednesday, June 17, 2009

Causal Bias

As humans we naturally look for a cause for every event, even if none exists. A gambler will attribute a winning streak to "being hot" or "beginner's luck" when in reality randomness alone prevails.
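A quick way to convince yourself of this is to simulate a fair coin. The sketch below is my own minimal illustration, not from the original post: it finds the longest streak in 1,000 fair flips. Streaks long enough to feel "hot" show up from pure randomness.

```python
import random

def longest_run(flips):
    """Length of the longest streak of identical consecutive outcomes."""
    best = cur = 1
    for prev, nxt in zip(flips, flips[1:]):
        cur = cur + 1 if nxt == prev else 1
        best = max(best, cur)
    return best

random.seed(0)  # fixed seed so the demo is reproducible
flips = [random.choice("HT") for _ in range(1000)]
print(longest_run(flips))  # a fair coin still produces long streaks
```

Runs of nine or ten heads in a row are typical in a sequence this long, yet a gambler watching any one of them would swear something non-random was going on.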

You should be aware of this tendency in problem solving as well. We naturally want to find a concrete reason for every event. Take the rudimentary root cause analysis here. Our first instinct is to assume the cause of a rupture disk failure is overpressure in the reactor. After all, isn't that what the rupture disk is there for? This can quickly lead down the path of investigating the reaction conditions and potentially pursuing a costly solution.

Another explanation is that the disk simply failed randomly. (Insert Link to Drunkard's Walk) Perhaps it was installed incorrectly? The installation can be considered random, particularly if someone who doesn't routinely do the task was involved. Or maybe the disk was an outlier (not really a random problem) and failed at a pressure much lower than its design pressure.
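To see how plausible "it just failed randomly" can be, consider a back-of-the-envelope calculation. Every number below is an illustrative assumption of mine, not data from any real plant: even a small per-disk chance of premature failure adds up across many disks and years.

```python
# All numbers below are illustrative assumptions, not real failure data.
p_fail = 0.01   # assumed chance per disk, per year, of a premature failure
disks = 20      # assumed number of rupture disks in the plant
years = 10      # observation window

# Probability that at least one disk somewhere fails prematurely,
# assuming independent failures: 1 - P(no failures at all).
p_at_least_one = 1 - (1 - p_fail) ** (disks * years)
print(f"P(at least one premature failure) = {p_at_least_one:.0%}")
```

With these assumptions the probability works out to roughly 87%, so a lone disk failure, by itself, is weak evidence of a process problem.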

What evidence do you have to support either route? Is it simple evidence (like pressure readings from a data-logging transducer), or the opinion of an expert who has had bad experiences with runaway reactions in the past? The strength of the Apollo root cause approach is that it helps you explore many different pathways, but most critically, you need evidence to support whichever pathway you go down.

Perhaps you have evidence for both. In the future I'll explore ways to weigh evidence in order to choose the proper route to explore.

Wednesday, June 3, 2009

Beware of a Story

Real life is complicated and messy. If you encounter a root cause analysis where the data fits the solution too neatly, be careful. It is rare that all the evidence fits a theory; if it does, there's a chance that something was left out to make it fit, or that the data was handled incorrectly. This is known as 'silent evidence', since it is not presented or available. It is another form of confirmation bias: focusing only on the data that fits our mindset or theory.

For an example of this, see the "story" presented by a correlation plot of time spent eating versus Body Mass Index (BMI) for a number of countries on the NY Times website. Read the comments and you can see that there are many flaws in the story told by the x-y scatterplot. The plot weaves a nice story supporting the author's prejudice about fast food, but the data neither supports nor refutes that story.
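One reason such plots mislead is that, with only a handful of countries, fairly strong-looking correlations arise from pure noise. The sketch below is my own illustration (assuming roughly 10 points per plot, a guess at the number of countries shown): it estimates how often completely random data produces a correlation stronger than |r| = 0.5.

```python
import math
import random

def pearson_r(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

random.seed(1)
n_points, trials = 10, 2000   # ~10 "countries" per plot, 2000 random plots
strong = sum(
    abs(pearson_r([random.random() for _ in range(n_points)],
                  [random.random() for _ in range(n_points)])) > 0.5
    for _ in range(trials)
)
print(f"{strong / trials:.0%} of purely random 10-point plots show |r| > 0.5")
```

Somewhere around one random scatterplot in seven clears that bar, which is why a single small-sample correlation should never carry a story on its own.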

A strength of the Apollo root cause analysis approach is that it forces you to consider the many causes that contribute to an event. The trap we fall into, however, is focusing on only one cause-effect path and neglecting the others. This may make a nice story, but it won't necessarily reflect reality.

Thursday, April 30, 2009

Confirmation Bias

We often look for data to support our proposed solution. "The Drunkard's Walk: How Randomness Rules Our Lives" mentions this in relation to our picking only the random events that support our hypotheses. We also plan experiments that confirm our existing hypothesis instead of looking for ways to disprove it.

The best solutions are those which can't be made to fail as opposed to those which need the stars aligned (or maybe that special operator) to work.

Psychologists also have a term for this, as I heard on the radio this morning: "magical thinking", or something like that. We tend to remember only what we want rather than looking at the data impartially.