Tuesday, January 16, 2007


I have yet to see any problem, however complicated, which, when you looked at it in the right way, did not become still more complicated. ~ Poul Anderson

There are entire libraries full of books on the subject of problem-solving. Entire companies make copious amounts of money teaching people how to identify and solve problems. I have a fair amount of experience in the field of problem resolution: first as a quality professional and more recently as a system administrator (where problems are the order of the day).

Problem-solving methodologies borrow heavily from the scientific method. The scientific method is generally attributed to Sir Isaac Newton, although I've heard credit given to Galileo, who was certainly one of the earliest true experimenters. Even some early Greek scholars have been cited. The method is as follows:
  1. Observe some aspect of the universe.
  2. Develop a hypothesis.
  3. Use the hypothesis to make predictions.
  4. Test the predictions by way of experiments or further observations.
  5. If the predictions are borne out, publish it, wait for the notification from the Nobel committee, then go on to the next problem.
  6. If it doesn't work, modify the hypothesis in light of the additional results. Lather, rinse, and repeat as necessary.
In the day-to-day world, substitute “Define the problem” for number 1, which includes gathering data to ascertain the real issue.
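The loop above can be sketched as a short program. This is a toy illustration only, not anything from the story: the data, the “multiplier” hypothesis, and the revision rule are all invented for the example.

```python
# Toy sketch of the scientific method as a loop: observe, hypothesize,
# predict, test, revise. All specifics here are invented for illustration.

observations = [(1, 3), (2, 6), (3, 9)]  # step 1: (input, output) pairs

def predict(multiplier, x):
    # step 3: use the hypothesis to make a prediction
    return multiplier * x

hypothesis = 2  # step 2: initial guess, "output = 2 * input"

for _ in range(10):  # lather, rinse, repeat
    # step 4: test the predictions against the observations
    failures = [(x, y) for x, y in observations if predict(hypothesis, x) != y]
    if not failures:
        break  # step 5: predictions borne out -- "publish"
    # step 6: modify the hypothesis in light of the results
    x, y = failures[0]
    hypothesis = y // x

print(hypothesis)  # 3 -- the surviving hypothesis: output = 3 * input
```

The point of the sketch is the shape of the loop, not the arithmetic: a hypothesis survives only as long as its predictions keep matching what's observed.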

Now this seemingly simple process is filled with pitfalls. For example, it is easy for a researcher to fall in love with a hypothesis. You have some observations, and you think about them. Then, in a bit of a flash, you have an explanation, based on inductive reasoning. When further observations are taken or experiments are conducted that don't support the hypothesis, the researcher sometimes finds fault with the deductive methods.

Or take the day-to-day term “Define the problem.” I would speculate that the single biggest difficulty in problem resolution is determining what you're really trying to resolve. A customer called one day to announce that a percentage of a given part we were making was out of tolerance, because the parts didn't fit on a mating component made by the customer. Since the part was molded, we suspected that a core might have been damaged in a way that was not readily visible to the eye. So we asked which cavity number was bad. The customer said that it varied.

That simply doesn't happen. It turned out that the customer's own process was the problem. Our parts were fine, but his weren't.

Why had they accused our part? Ours was made of rubber; the mating part was steel. Everyone “knows” that rubber parts vary in dimension while steel parts are consistent. Except that in this case, the steel parts were machined after being cast, and it was the machining on the steel parts that was wrong; that's why our rubber parts wouldn't fit.

The customer went down the wrong path because a) he didn't bother to check his steel parts, and b) he made assumptions based on incomplete data.

Errors in calculation also lead problem solvers in the wrong direction. For example, based on a data set, a quality engineer made some recommendations that were going to cost a lot of money to implement. The boss knew that wasn't going to go over very well, particularly since the engineer admitted that his solution would not guarantee that the problem was corrected, so he asked me to look at the numbers. There was a data group that was prominently out of line with the others. This group was interpreted to show that the process was extremely unstable. In fact, the data group had been entered incorrectly. When fixed, the “instability” disappeared, and we were able to move to a good solution.
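To see how much damage a single mis-entered group can do, consider this small illustration (the numbers are invented, not the actual data from the story):

```python
# One mis-keyed value can make a stable process look wildly unstable.
# Illustrative data only: a decimal slip turns 9.9 into 99.0.
import statistics

correct = [10.1, 10.2, 9.9, 10.0, 10.1]
typo    = [10.1, 10.2, 99.0, 10.0, 10.1]  # 9.9 entered as 99.0

print(statistics.stdev(correct))  # small spread: the process looks stable
print(statistics.stdev(typo))     # enormous spread: apparent "instability"
```

One keystroke multiplies the apparent variation by orders of magnitude, which is why checking the raw data entries is worth doing before recommending an expensive fix.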

Science is full of similar examples. Einstein's refusal to accept quantum theory led to years of futility on his part trying to make Relativity work down to the subatomic level. His “part” was right; if the other guys' (the quantum theorists') didn't fit, then they had to be wrong. Today we recognize that different rules apply to quantum particles than to macro particles. The attempt to reconcile them continues, but it's recognized that it's going to take some sort of new tools to do so.

Then there's the reality of errors. The history of science is filled with great scientists (including Einstein) making blunders that should never have happened but did, because scientists are human. The classic example involves DNA. When Watson and Crick were coming down to the wire on determining the structure of DNA, they were crushed to learn that Linus Pauling was about to release his own results. Thanks to Pauling's son, Watson and Crick were able to get an advance copy of the elder Pauling's paper (a story in and of itself). They found a simple mathematical error that caused Pauling to decide that the structure was a triple helix.

Of course, the true structure was the famed double helix. Watson and Crick, determined to publish the structure themselves, did not point out Pauling's error to his son. No, rather they toasted their own good fortune.

Not exactly the scientific method at work, but, then, I did say that scientists were human.
