Friday, December 11, 2009

Fail=Success

Sorry for re-posting, but not only does this post have good advice, I've also recently come across the Lifehacker website. If you use computers (and what scientist or engineer doesn't?), Lifehacker has many great tips and links to make your computing life easier. They scour the web so you don't have to.

Friday, November 27, 2009

Databases

Some of my earlier posts have had to do with information (or data) visualization. Before you can visualize data, it must be readily available. In today's age of computers, your data is certainly electronic, but if it is spread across multiple servers, directories, and files, it is unmanageable and effectively unavailable. This is where relational databases come in.

If your data is in a well-designed relational database, you can access it in a variety of ways. Most data analysis software (Excel, Minitab, Origin, and Quality Analyst are a few that I've used) has wizards that make it easy to develop SQL queries to retrieve data. With minimal knowledge of SQL you can modify the wizard-generated queries to make them even more powerful. The important part is to have your data in a database to begin with. While a LIMS or another database may seem like an expensive investment, it will pay off in the long run by giving you many opportunities to examine your data and find answers to your problems in data you have already gathered, rather than having to design new experiments.
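
Here is a minimal sketch, in Python using the standard sqlite3 module, of the kind of query a wizard might generate for you and that a little SQL knowledge lets you extend with filtering and aggregation. The database file, table, and column names are hypothetical stand-ins for whatever your own LIMS or historian actually uses.

```python
# A minimal sketch of pulling data straight from a relational database with SQL.
# The database file, table, and column names are hypothetical; substitute
# whatever your own LIMS or process historian uses.
import sqlite3

conn = sqlite3.connect("plant_data.db")  # hypothetical SQLite database file

# A wizard might generate something like "SELECT * FROM batch_results";
# with a little SQL you can filter, aggregate, and sort before the data
# ever reaches your analysis tool.
query = """
    SELECT batch_id,
           AVG(purity) AS avg_purity,
           COUNT(*)    AS n_samples
    FROM   batch_results
    WHERE  test_date >= '2009-01-01'
    GROUP  BY batch_id
    ORDER  BY avg_purity DESC
"""

for batch_id, avg_purity, n_samples in conn.execute(query):
    print(f"{batch_id}: {avg_purity:.2f}% purity over {n_samples} samples")

conn.close()
```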

In today's age of tight budgets, many companies want concrete justification for the purchase and use of a database. The problem is that a database will help you solve problems you haven't even imagined yet, so it is hard to provide evidence of the "hard" savings it will deliver.

Anybody with an eye towards the future (not just the next 3 months) should have a relational database for storing their data and begin inserting data immediately. Without data in your database, it will be useless.

Monday, September 28, 2009

Know your data

Often we gather data to aid in solving a problem we encounter. One trap you can fall into is that the data is not exactly what you think it is. For example, you might be taking pressure readings from a process line to decide whether a filter is plugged. The pressure readings may be normal, leading you to believe that the filter is not plugged and is in satisfactory condition. However, if the pressure gauge is upstream of the filter, the data is misleading you: the filter may well be plugged, but the gauge's location hides it.

This is particularly a problem if you don't have good change control on your equipment. The diagram you are working from may not reflect reality. It is always a good idea to go look at what you're working on, both to help you visualize the process and to make sure things are as you think they are. The biggest danger to problem solving is thinking you can sit back in your office and just think through the problem.

Sunday, September 6, 2009

Active Exploration

When I was in school I was listening to a talk about a topic (or so I thought) unrelated to my research. The speaker mentioned in passing that they were adding hydrogen to the helium microwave plasma. That caught my attention, as I realized it might be relevant to the problems I was having getting my project on RF ICPs to work. With a little bit of literature searching I was able to find a good deal of work from the light bulb industry that addressed the problem I was having. Imagine my surprise that my "research" problem had already been encountered (in a somewhat different form) in another industry.

You should always be aware that potential solutions to your problems may turn up in unexpected places. In "Made to Stick" the authors talk about the power of spotting a good story that supports what you want to communicate. Unless you are actively looking for such a story, you may miss it when you come across it. Likewise, problem solutions may elude you unless you have the problem(s) in mind and you are actively looking for solutions.

Don't neglect this key ingredient of problem solving.

Getting your solution to Stick

In their book "Made to Stick" Chip and Dan Heath make the following statement (slightly edited) in the Epilogue.

"[Problem solving] has two stages: the Answer stage and the Telling Others stage. In the Answer stage, you use your expertise to arrive at the [solution] you want to share. [...]
Here's the rub: The same factors that worked to your advantage in the Answer stage will backfire on you during the Telling Others stage"

After solving a problem, we know a lot about it. The problem is, the people to whom we are communicating the solution don't know nearly as much. This is where the "Curse of Knowledge" comes in. We have to get what we already know across to others, and what we think is important may not matter unless we can get them to buy in to our solution. The book provides the recipe for SUCCESs.

www.madetostick.com is their website.

Monday, August 31, 2009

Multi-tasking, Learning and Phone Conferences

My company is getting more distributed; workers on a project (or problem) may be located at opposite ends of the country or even overseas. For that reason, we are having increasingly frequent phone conferences. They're great for touching base frequently without having to hop on a plane, but I have some doubts about how effective these meetings really are.

My earlier post about multi-tasking cites evidence that multi-tasking is a misnomer, and yet these phone conferences are an invitation to do just that. One ear on the phone, sending an e-mail, and perhaps even an IM conversation or working on a document is probably pretty standard behavior for participants in a phone conference. Frequently someone is asked a question and their response is, "Pardon, please repeat the question." Their attention was elsewhere.

In Chapter 9 of his book Brain Rules, John Medina reviews studies which show that learning is optimal when multiple senses are involved. In a phone conference, only one of our senses is involved. When you're in a meeting, you're learning (unless of course you are doing all the talking): you're learning what others know, what you have to work on, and what they're working on.

The phone conference is a less than ideal learning situation. I wonder if there are any studies on the effectiveness of face-to-face meetings versus phone only versus phone plus visuals. I also wonder where the break-even point is between travel costs and the time wasted in meetings.

Some suggestions for your next phone conference:
  • Send an agenda out beforehand so everyone is prepared to learn about the topics.
  • Use some sort of desktop sharing program so everyone is seeing the same information.
  • Send out minutes along with any visuals after the meeting is over so everyone can review the outcomes. (Another chance to learn what was discussed).
  • If you're a participant, pay attention. It may seem like you're being more productive by sending IMs and e-mails, but every time you ask for something to be repeated you waste everyone else's time.
Modern technology has made many things more efficient, but our brains are still "primitive". We learn best by incorporating multiple senses and reviewing the information multiple times.

Friday, August 28, 2009

Being Good Enough

This month, Wired magazine has an article about products that are good enough to do the job but not great. The point of the article is that this is how such products can creep up on the leaders in a field until eventually they are capable of surpassing them.

The same applies to problem solutions. Often we are perfectionists, trying to anticipate every possible angle before implementing the solution. Sometimes this can delay implementation of a solution and result in lost opportunities. Often (except when safety is involved) getting a partial solution in place quickly is more important than addressing all the issues beforehand. You can then evaluate what the weaknesses are under operation and utilize an iterative process to optimize the system. Your supporting structures (management of change, document creation, document control, etc.) need to be efficient enough so that they aren't a drag on the process.

If your support functions are more taxing to complete than the solution itself, then something is wrong. Often Quality organizations become enamored with lots of checks and balances and the whole process gets bogged down; often these people aren't users of the process but only enforcers. If you start thinking to yourself that you know how to do something but don't want to cut through all the red tape to get it implemented, then something is seriously wrong, both because good ideas may not get implemented and because people are tempted to take shortcuts without submitting their ideas to the proper review process. This can lead to unintended consequences.

Even our best laid plans can go awry. One needs to balance the desire for a "right the first time" solution with the inevitable refining process that occurs anytime you try something new. Continuous improvement works best when it is continuous. If it proceeds in fits and starts, then often it won't re-start.

Monday, August 24, 2009

Distractions

Over time, the communications and information streams we are exposed to have gotten more and more complex. This interferes with problem solving. Here is some recent research by a group at Stanford which demonstrates that heavy multi-taskers are actually worse at multi-tasking than those who don't multi-task.

I've read about people who have figured out, using data visualization techniques, when most of their e-mails arrive. They then set aside a few times each day when they address e-mail. I only get tens of e-mails a day; I can't imagine receiving hundreds. Those who do must develop an efficient system to wade through the smog.
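
A rough sketch of that idea, assuming you can export your messages' received times from your mail client (the timestamps below are made up), might look like this:

```python
# A rough sketch of finding when your e-mail arrives: bin message timestamps
# by hour of day and print a crude histogram. The timestamps are made up;
# in practice you would export the "received" times from your mail client.
from collections import Counter
from datetime import datetime

received = [
    "2009-08-24 08:15", "2009-08-24 09:02", "2009-08-24 09:41",
    "2009-08-24 13:27", "2009-08-24 16:55", "2009-08-25 08:44",
]  # hypothetical export from a mail client

hours = [datetime.strptime(ts, "%Y-%m-%d %H:%M").hour for ts in received]
counts = Counter(hours)

# One bar per hour of the day; the peaks tell you when to schedule e-mail time.
for hour in range(24):
    print(f"{hour:02d}:00  {'#' * counts.get(hour, 0)}")
```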

If you cannot focus seriously on the problem, you will have trouble finding effective solutions. While there is a time to expose yourself to new ideas and stimulate your creative problem solving, it is not in the midst of working a problem.

Monday, August 10, 2009

Thinking Strategies

Garr Reynolds posted a presentation on Thinking like a Designer. His rules apply to problem solvers also. We must always work within constraints but need to balance this with the beginner's mind. Our solutions should be as simple as possible while still being innovative. We need to be able to communicate our ideas effectively, etc.

Check out his blog for many ideas about effective communication.

Measuring Perceptions of Uncertainty

I found the reference I mentioned in my previous post. It is Figure 18 in Chapter 12 of Psychology of Intelligence Analysis.

The table shows the probability assigned by various readers to statements containing verbal expressions of uncertainty. For some terms, such as 'highly likely' or 'probably', the probabilities assigned by different readers spanned a range as wide as 50%. Clearly, using verbal descriptions of probability is probably :-) a bad idea!

Nicolas Bissantz beat me to this topic by at least a week in his post "Speechless not numberless". He gives examples from the financial and medical areas.

Monday, August 3, 2009

Quantifying Uncertainty

Scientists and engineers are familiar with quantifying uncertainty in situations where we have data and measurements available. If we know the measurement standard deviation and the number of measurements we made, we can calculate a confidence interval that is likely to contain the true value.
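
As a quick sketch of what I mean, here is the standard t-based confidence interval for the mean of a handful of measurements, computed in Python with SciPy; the measurement values are made up for illustration.

```python
# A quick sketch of a t-based confidence interval for the mean of a small
# set of measurements. The numbers are made up for illustration.
import math
from scipy import stats

measurements = [10.2, 9.8, 10.1, 10.4, 9.9]          # hypothetical data
n = len(measurements)
mean = sum(measurements) / n
s = math.sqrt(sum((x - mean) ** 2 for x in measurements) / (n - 1))

confidence = 0.95
t_crit = stats.t.ppf((1 + confidence) / 2, df=n - 1)  # two-sided critical value
half_width = t_crit * s / math.sqrt(n)

print(f"mean = {mean:.2f} +/- {half_width:.2f} (95% confidence)")
```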

Things are more complicated when we are asked to make a judgment call. For example, what if you are asked when you will have a report finished? You might answer, "probably tomorrow". The problem is that "probably" can mean something quite different to different people. "Probably" to you might mean "maybe" to another. Some will use the word "unlikely" instead of "might" or "might not".

Although it will seem uncomfortable, try to put some sort of odds on your judgment calls. Answering "there's a 90% chance I'll be finished tomorrow" is a lot less ambiguous than "I'll probably be finished tomorrow". Even if the odds are really only 70% or 80%, they're certainly not 20% or even 50%. Attaching a numerical rather than a verbal uncertainty to your statements will make your communication clearer.

Somewhere I've come across a study of what various verbal uncertainty terms meant to different people. I'll try to find it again and post it here.

Sunday, July 26, 2009

Creating Meaning Visually

I've posted before on the importance of visualization in problem solving. Here Tom Wujec tells us how the brain creates meaning from images. It is more evidence of the importance of visualization in getting a handle on larger problems. A good visualization is critical when presenting your ideas to a group or when working in a group where different people have different ways of thinking about a problem. Visualization techniques (like mapping or modeling) can help get everyone aligned.

By the way, this TED site is a great place to stimulate your imagination by seeing presentations by excellent speakers on a variety of topics.

Wednesday, June 17, 2009

Causal Bias

As humans we naturally look for a cause for every event, even if none exists. A gambler will attribute a winning streak to "being hot" or "beginner's luck" when in reality randomness alone prevails.
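
You can convince yourself of this with a quick simulation (just a sketch, with made-up parameters): flip a fair coin a couple hundred times and see how long the longest run of "wins" is. Impressive-looking streaks show up from randomness alone.

```python
# A quick simulation showing that "hot streaks" arise from pure randomness:
# flip a fair coin 200 times and find the longest run of wins.
import random

random.seed(1)  # fixed seed so the example is reproducible
flips = [random.random() < 0.5 for _ in range(200)]  # True = "win"

longest = current = 0
for win in flips:
    current = current + 1 if win else 0
    longest = max(longest, current)

print(f"Longest winning streak in 200 fair flips: {longest}")
```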

You should be aware of this tendency in problem solving also. We naturally want to find a concrete reason for every event. Take the rudimentary root cause analysis here. Our first instinct is to attribute a rupture disk failure to an overpressure in the reactor. After all, isn't that what the rupture disk is there for? This can quickly lead down the path of investigating the reaction conditions and potentially getting involved in a costly solution.

Another explanation is simply that the disk failed randomly. (Insert Link to Drunkard's Walk) Perhaps it was installed incorrectly? The installation can be considered random, particularly if someone who doesn't routinely do the task was involved. Maybe the disk was an outlier (not really a random problem) and failed at a pressure much lower than its design pressure?

What evidence do you have to support either route? Is it simple evidence (like pressure readings from a data-logging transducer) or the opinion of an expert who has had bad experiences with runaway reactions in the past? The strength of the Apollo root cause approach is that it helps you explore many different pathways, but most critically you need evidence to support whichever pathway you go down.

Perhaps you have evidence for both. In the future I'll explore ways to weigh evidence in order to choose the proper route to explore.

Wednesday, June 3, 2009

Beware of a Story

Real life is complicated and messy. If you encounter a root cause analysis where the data fits the solution too neatly, be careful. It is rare that all the evidence fits a theory; if it does, there's a chance that something was left out to make it fit or that the data was handled incorrectly. This is known as 'silent evidence', since it is not presented or available. It is another form of confirmation bias, the situation where we focus only on data that fits our mind-set or theory.

For an example of this, see the "story" presented by a correlation plot of time spent eating versus Body Mass Index (BMI) for a number of countries on the NY Times website. Read the comments and you can see that there are many flaws in the story illustrated by the x-y scatterplot. The plot weaves a nice story supporting the author's prejudice about fast food, but the data does not support that story (nor does it refute it).

A strength of the Apollo root cause analysis approach is that it helps force you to consider the many causes that contribute. However, the trap we fall into is to focus on only one cause-effect path and neglect the others. This may make a nice story but won't necessarily reflect reality.

Saturday, May 23, 2009

We're all Creative

In Moving to Higher Ground, Wynton Marsalis writes that creativity is not the province of a small, specialized group of people but that everyone is creative. He gets kids to blast away on instruments, playing whatever they like, and points out that they are creating.

He goes on to say "I told you it was easy [to be creative].  It's only hard if you want to sound good."
We don't lack creativity.  The trick is bringing that creativity to fruition.

Saturday, May 16, 2009

Relationship of Art & Science

Too often we focus on our strengths. Scientists & engineers have strong analytical skills and cultivate those throughout their education. Others have artistic skills and cultivate those. Frequently this is of necessity because we don't have time for both. Mae Jemison, an astronaut, doctor, and dancer (not necessarily in that order) talks about the relationship between art and science in a TED talk.

Problem solving requires creativity and as I've mentioned before, developing our creativity requires developing all parts of our mind. Try doing something new. Even if you aren't successful (or even moderately good), it will stimulate your mind. Bust out the crayons and color with your kids, try photography, write a Haiku, who knows what. Just try something you wouldn't normally do today.

Thursday, May 7, 2009

Scientific Method and Problem Solving

In the scientific method, one suggests a hypothesis and then tries to find evidence disproving it. Failure to find evidence that refutes the hypothesis doesn't prove it true, but it does let the hypothesis stand.

This approach is a more robust method of problem solving than suggesting a solution (hypothesis) and then doing experiments to support it.

As Heuer points out in Chapter 4 of his book, a hypothesis (solution) can never be proven by even a large body of evidence, since many hypotheses may be consistent with that evidence, but a single instance of incompatible evidence is enough to sink a hypothesis.
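
Here's a toy illustration of that asymmetry. The "hypotheses" and "observations" are entirely made up (they echo the plugged-filter example from an earlier post), but they show how a body of evidence can support several hypotheses at once while a single incompatible observation eliminates one outright.

```python
# Toy illustration: several hypotheses can be consistent with the same
# evidence, but one incompatible observation eliminates a hypothesis.
# The hypotheses and observations below are hypothetical.

# Each hypothesis predicts what we should see for a given measurement.
hypotheses = {
    "filter plugged": {"flow": "low", "upstream_pressure": "high"},
    "pump worn":      {"flow": "low", "upstream_pressure": "normal"},
}

def consistent(hypothesis, observations):
    """A hypothesis survives only if it matches every observation so far."""
    return all(hypothesis.get(k) == v for k, v in observations.items())

observations = {"flow": "low"}
print([name for name, h in hypotheses.items() if consistent(h, observations)])
# -> ['filter plugged', 'pump worn']  (the evidence supports both)

observations["upstream_pressure"] = "normal"
print([name for name, h in hypotheses.items() if consistent(h, observations)])
# -> ['pump worn']  (one incompatible reading sinks the other hypothesis)
```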

When suggesting solutions, plan some experiments that seek to disprove them. If you are unable to cause your solution to fail deliberately, it will be that much stronger.

Saturday, May 2, 2009

Psychology of Intelligence Analysis

I recently came across a book written for intelligence analysts in the CIA. After reading just the introduction and a couple of chapters I can tell it will be the subject of several (many?) future posts. Although this book by Richards Heuer, Jr. is directed towards the analysis of what I call "soft" information, it is equally useful for scientists and engineers working with more concrete (or simple) evidence.

Psychology of Intelligence Analysis is available on-line. There is also a 2006 version available.

This book is about avoiding pitfalls when analyzing information, and it looks like it will address the topic from a higher level. In Chapter 2 Heuer already addresses the issues of simple evidence and confirmation bias. He points out that "We tend to perceive what we expect to perceive." It is easier to notice data that already fits into our mind-set and is similar to what we already know. This is a little different from perceiving what we want to perceive, which I think is easier to recognize and avoid.

Thursday, April 30, 2009

Confirmation Bias

We often look for data to support our proposed solution. "The Drunkard's Walk: How Randomness Rules Our Lives" mentions this in relation to our picking only the random events that support our hypotheses. We also plan experiments that support our existing hypothesis instead of looking for ways to disprove it.

The best solutions are those which can't be made to fail as opposed to those which need the stars aligned (or maybe that special operator) to work.

Psychologists also have a term for this, as I heard on the radio this morning: "magical thinking" or something like that. We tend to remember only what we want and not look at the data impartially.

Monday, March 16, 2009

Fly larvae and simple evidence.

In The Monster of Florence, the investigators ignored simple evidence that cannot be faked and instead relied on the testimony of people with vested interests in the outcome. Rather than accepting the evidence of the fly larvae on the corpses (the larvae had no interest in the outcome of the murder investigation), they trusted the testimony of people, since that testimony fit their theory of the time of death better than the physical evidence of the bodies' decay.

When investigating a root cause and gathering evidence to support a solution, keep in mind the types of evidence you are gathering. Often we want a certain outcome to be true and interpret information to fit that outcome. This will often lead you down the wrong path. Try to gather evidence that is independent of the solution you seek. I call this simple evidence: it cannot be faked and can be interpreted in only one way.

If your evidence requires assumptions, then it is not simple evidence. Those assumptions may be wrong. Our assumptions are often biased by our experiences and our outlook. Try to avoid them.

Occam's Razor

I just finished "The Monster of Florence", Douglas Preston and Mario Spezi's account of a serial killer in Florence, Italy, and the investigations attempting to find the killer(s).

I was struck by the complexity of the theories that the investigators proposed in order to build their case against certain individuals (and groups). This brought to mind Occam's Razor, the principle that you should make as few assumptions as possible when trying to explain a phenomenon.

This applies to finding solutions. The more complex the solution you develop, the more chances there are for problems in the future. Keep things simple and your solutions will have greater longevity, be easier to implement, and be easier for others to follow. Complex solutions can be a house of cards that comes crashing down when one aspect or another isn't implemented as intended.

Saturday, February 7, 2009

De Bono's Six Hats

One problem solving technique commonly used is brainstorming - a technique with which I'm sure you are all familiar. However, we all see things from our own perspective. One variation that tries to force you out of your "common sense" is De Bono's Six Hats. There are many references on the web and in print, so you can look them up for yourself.

One thing I'm contemplating is whether you can do this within your own field. Sometimes we try to solve all problems with whatever tool we're best at. Try using a tool other than your favorite for the problem.

For example, maybe Excel isn't the best tool for presenting your data. Perhaps a Word document would be better or even - dare I say - PowerPoint.

Don't use duct tape and vise grips to fix everything. Get to learn different tools and give them a try.

Monday, January 26, 2009

Changing E-Mail Subjects

Along the lines of my earlier post about e-mailing.

Have you ever been part of an e-mail chain where the subject morphed from the original to something else?

Don't be afraid to change the subject line or recipients of an e-mail chain or delete non-relevant sections. In addition to intellectual property issues, there can be a lot of waste associated with not keeping the e-mail header information current.

E-mail is pervasive and many people get hundreds of e-mails a day. If your subject does not communicate effectively, your message won't even see the light of day.

Wednesday, January 7, 2009

Conditions & Actions

When doing a cause & effect analysis, remember that for any effect there is both a condition and an action that must occur.

In order for a fire to start, you need more than the conditions of fuel, oxygen, and heat. You also need an action (e.g. a spark, a match strike) for the fire to start. Sometimes the conditions or actions might be so obvious that you don't think of them at first, but it helps to include them since it broadens your thinking - leading to better brainstorming and more diverse thinking. You might not think to include oxygen as a condition for a fire, but you might miss an important solution if you don't. That's why we use inert gas (nitrogen) glove boxes when working with pyrophoric materials.
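
One way to keep both in view (just a sketch of a habit, not any formal tool) is to record each effect with its conditions and actions listed explicitly, so the "obvious" ones don't get skipped:

```python
# A sketch of recording both conditions and actions for each effect in a
# cause & effect analysis, so the "obvious" conditions don't get skipped.
from dataclasses import dataclass, field

@dataclass
class Effect:
    name: str
    conditions: list = field(default_factory=list)  # states that must exist
    actions: list = field(default_factory=list)     # triggering events

fire = Effect(
    name="fire",
    conditions=["fuel present", "oxygen present", "heat available"],
    actions=["spark", "match strike"],
)

for effect in [fire]:
    print(effect.name)
    print("  conditions:", ", ".join(effect.conditions))
    print("  actions:   ", ", ".join(effect.actions))
```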

Looking for both conditions and actions will help you to build a more complete picture and ultimately lead to more effective solutions.