In a recent post ("Wicked"), I mentioned the Prussian general and military strategist Carl von Clausewitz, who is perhaps best known today for his book On War, written between 1816 and 1830 and published posthumously by his wife in 1832. The book is an unfinished work consisting of a series of essays that von Clausewitz wrote on military theory, strategy, and tactics following the Napoleonic Wars. It has broader application to strategy and tactics in general, often finding a place alongside a similar book by Sun Tzu (The Art of War) on the bookshelves of business leaders.
My last couple of posts have focused on one of the classic texts in safety science, Dietrich Dörner's The Logic of Failure. Dörner actually quotes a passage from Carl von Clausewitz when he talks about a concept he calls "methodism", which he defines as "the unthinking application of a sequence of actions we have once learned" to solve complex or so-called wicked problems. And why wouldn't we try to solve problems using the tools and techniques that have been successful for us in the past? As Dörner explains, we have an attachment to the tried and true. Unfortunately, what has worked well for us in the past often leads to an oversimplification of an otherwise incredibly complex and complicated problem.
Here is the passage (shortened somewhat, but hopefully the point is still clear) from von Clausewitz:
"So long as no acceptable theory, no intelligent analysis of the conduct of war exists, routine methods will tend to take over even at the highest levels. Some of the men in command have not had the opportunities of self-improvement afforded by education nor contact with the highest levels of society and government...Their only insights are those that they have gained by experience. For this reason, they prefer to use the means with which their experience has equipped them, even in cases that could and should be handled freely and individually."
Von Clausewitz goes on to characterize war as a complex problem:
"War is not like a field of wheat, which, without regard to the individual stalk, may be mown more or less efficiently depending on the quality of the scythe; it is like a stand of mature trees in which the ax has to be used judiciously according to the characteristics and development of each individual trunk."
Dörner explains this passage further by stating, "In many complex situations, considering a few 'characteristic' features of the situation and developing an appropriate course of action in the light of them is not the essential point. Rather, the most important thing is to consider the specific, 'individual' configuration of those features and to develop a completely individual sequence of actions appropriate to that configuration."
He writes on, "The methodist is not able to cope with the specific, individual configurations, and he uses one or the other depending on the general features of the situation as a whole. He does not take into account the individuality as it is evidenced in the specific configuration of the features."
Methodism, which I think greatly oversimplifies the complexity of a wicked problem (see my posts on the High Reliability Organization principle of "Reluctance to Simplify"), can lead to failure because "a change in some minor detail that does not alter the overall picture of the situation in any appreciable way can make completely different measures necessary to achieve the same goal." The metaphor used by von Clausewitz on felling trees in a forest is very appropriate here. We have to adjust the way that we cut down each tree by carefully analyzing the tilt of the tree, the position of surrounding trees, the shape of the tree's crown, whether the tree has twists in the trunk, and even the direction of the prevailing winds. Any one of these features can force us to rethink how we cut down a given tree, yet collectively they do not change the overall appearance of the forest when we look at it in aggregate.
Dörner mentions a famous experiment in cognitive psychology called the "water jug problem" first published in 1942 by Abraham and Edith Luchins. The problem is a great example of something known as the "Einstellung effect", and it even appeared in the 1995 movie Die Hard with a Vengeance starring Bruce Willis and Samuel L. Jackson (see the movie clip here). Basically, in the version of the problem used in the movie, you have in front of you a 5-gallon water jug and a 3-gallon water jug. You also have access to a water fountain, so you can fill up both water jugs as many times as you want. Your job is to put 4 gallons of water exactly into one of the jugs. How do you do it?
Think about it. How would you solve this problem? As Abraham and Edith Luchins, the villain in the movie Die Hard with a Vengeance, and Dietrich Dörner would all predict, we would likely try to solve this problem using methods that have been successful for us in the past when faced with similar problems. Unfortunately, those familiar methods would fail us here (and, in the movie, the villain would win because a bomb would explode).
Here is a step-by-step explanation of how you can solve the "water jug problem" from the movie:
- Fill the 5-gallon jug to the top (the 5-gallon jug has 5 gallons of water now)
- Pour water from the 5-gallon jug into the 3-gallon jug and fill it all the way up (the 5-gallon jug now has 2 gallons and the 3-gallon jug now has 3 gallons of water)
- Dump out the water from the 3-gallon jug (which now is empty)
- Pour the water (2 gallons) from the 5-gallon jug into the 3-gallon jug (the 3-gallon jug now has 2 gallons of water, and the 5-gallon jug is empty)
- Fill up the 5-gallon jug all the way (the 3-gallon jug still has 2 gallons, and the 5-gallon jug has 5 gallons)
- Pour water from the 5-gallon jug into the 3-gallon jug, filling up the 3-gallon jug (the 3-gallon jug now has 3 gallons and the 5-gallon jug now has 4 gallons!!)
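The sequence above can be checked with a short simulation. Here is a minimal sketch in Python; the `pour` helper and the state variables are my own naming, but each step mirrors one bullet in the list:

```python
# Simulate the jug-pouring sequence from the movie.
# State is (five, three): gallons currently in each jug.
CAP5, CAP3 = 5, 3

def pour(src, dst, dst_cap):
    """Pour from src into dst until src is empty or dst is full."""
    amount = min(src, dst_cap - dst)
    return src - amount, dst + amount

five, three = 0, 0
five = CAP5                            # 1. fill the 5-gallon jug -> (5, 0)
five, three = pour(five, three, CAP3)  # 2. fill the 3-gallon jug from it -> (2, 3)
three = 0                              # 3. dump out the 3-gallon jug -> (2, 0)
five, three = pour(five, three, CAP3)  # 4. pour the 2 gallons over -> (0, 2)
five = CAP5                            # 5. refill the 5-gallon jug -> (5, 2)
five, three = pour(five, three, CAP3)  # 6. top off the 3-gallon jug -> (4, 3)

print(five)  # -> 4
```

Running it confirms the 5-gallon jug ends up holding exactly 4 gallons.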
For those of you who have been reading my last several posts, you may have noticed that Dörner recommends attacking complex problems with an algorithmic approach or sequence as well. Importantly, he is not advocating "methodism" here. If you examine the sequence that he recommends more closely, it is a general approach that involves experimentation and sounds a lot like either the Scientific Method widely taught in schools or the PDSA cycle (Plan-Do-Study-Act) used in quality improvement today. Here is my side-by-side comparison of all three techniques:
One last point. I mentioned Gary Klein's recognition-primed decision-making model (RPD model) in the first post ("It's complicated...") on Dietrich Dörner's The Logic of Failure. On the surface, Klein's model, which explains how experts can make quick - almost automatic - decisions in complex situations, would seem to contradict Dörner's admonition against using past experience to make quick assessments and decisions. At the risk of having my cake and eating it too, I actually don't think these two models contradict each other at all. If you look closely at Klein's RPD model, it's very similar to Dörner's model above. The key differences are that time is significantly compressed in Klein's model (in terms of the time available to go through the steps in each model), and most of the recognition-primed decision-making occurs in one person's (the expert's) thought processes.
As you can see in the Figure above, the key branch points and steps in Klein's model are:
1. Is the situation familiar? If not, reassess the situation and seek additional information. If so, proceed to the next step.
2. Define goals
3. Develop a model/plan (here, Klein calls it a mental simulation of action)
4. Will the model/plan work? If not, modify it. If so, proceed to the next step.
5. Execute the plan and review the results.
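The branch points above can be sketched as a simple loop. This is only an illustrative toy, not a literal rendering of Klein's model (which describes expert cognition, not an algorithm); the pattern table and helper logic are placeholders I invented:

```python
# Toy walk-through of the branch points in Klein's RPD model.
# KNOWN_PATTERNS stands in for an expert's stored experience; the
# situation names and plans are invented for illustration only.

KNOWN_PATTERNS = {"engine fire": "discharge extinguisher, divert"}

def rpd(situation, extra_info=None):
    # 1. Is the situation familiar? If not, reassess and seek more information.
    while situation not in KNOWN_PATTERNS:
        if not extra_info:
            return "no recognized pattern; keep assessing"
        situation = extra_info.pop(0)
    # 2-3. Define goals and mentally simulate a plan drawn from experience.
    plan = KNOWN_PATTERNS[situation]
    # 4. Will the plan work? If not, modify it (trivially here).
    if "extinguisher" not in plan:
        plan += "; add fire suppression"
    # 5. Execute the plan (and, in practice, review the results).
    return f"executing: {plan}"

print(rpd("smoke in cockpit", extra_info=["engine fire"]))
```

The point of the sketch is the compressed loop: recognition substitutes for deliberate option generation, which is exactly where Klein's model differs from Dörner's slower, more explicit sequence.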
Dealing with complexity is not easy. That is perhaps one of the reasons so much has been written on it. I was very interested in reading Dietrich Dörner's The Logic of Failure, and I certainly can understand why it has become a classic text in the field of safety science. I would like to move on to another classic in future posts, Barry Turner and Nick Pidgeon's book, Man-Made Disasters.