I just read one of the classic books in safety science, The Logic of Failure by Dietrich Dörner. There is more than one edition out, I think, and the one I read (thank you to my local library) was the 1996 English translation of the original book that was published in German. The premise of the book is perhaps best summarized by a statement in the blurb (yes, that's apparently the proper term, though I have also seen the term "flap copy" used) from the dust jacket:
"Dietrich Dörner, winner of Germany's highest science prize, here considers why - given all our intelligence, experience, and information - we make mistakes, sometimes with catastrophic consequences. Surprisingly, he finds the answer not in negligence or carelessness, but in what he calls 'the logic of failure': certain tendencies in our patterns of thought - such as taking one thing at a time, cause and effect, and linear thinking - that, while appropriate to an older, simpler world, prove disastrous for the complex world we live in now."
Unfortunately, I'm not fluent enough in German to read some of the original studies that Dr. Dörner referenced in his book. Regardless, there were several interesting points made in the book that I would like to discuss in greater detail. The first is how he defines and explains the concept of complexity. There's been a lot written on the difference between complex and complicated. One of the best explanations I've found is an article by Alexandre Di Miceli ("Complex or Complicated?"). He says, "A complicated system has a direct cause and effect relationship. Its elements interact in a predictable way." Complicated systems are controllable, often by following specific rules or algorithms. Conversely, he says that complex systems are composed of elements that interact with each other in unpredictable ways. It is these interactions that differentiate complex systems from merely complicated ones. Di Miceli goes on to say:
"A car engine is complicated. Traffic is complex."
"Building a skyscraper is complicated. The functioning of cities is complex."
"Coding software is complicated. Launching a software start-up is complex."
Dietrich Dörner would wholeheartedly agree with Di Miceli's explanation. He writes, "Complexity is the label we give to the existence of many interdependent variables in a given system. The more variables and the greater their interdependence, the greater that system's complexity." He goes on to define something that he calls the "complexity quotient" as the product of the number of features within a system times the number of interrelationships between them. For example, if there are ten variables and five links between them, the complexity quotient is fifty (10 x 5 = 50). As another example, if there are one hundred variables that are completely unrelated (no interrelationships or links between them), the system's complexity quotient is zero (100 x 0 = 0).
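Dörner's complexity quotient is simple enough to express in a few lines of code. Here's a minimal sketch of the idea as I read it (the function name is mine, not Dörner's), reproducing the two examples above:

```python
def complexity_quotient(num_variables: int, num_links: int) -> int:
    """Dörner's complexity quotient: the number of variables in a
    system multiplied by the number of interdependencies (links)
    between them."""
    return num_variables * num_links

# Ten variables with five links between them:
print(complexity_quotient(10, 5))    # 50

# One hundred variables with no links at all -- lots of parts,
# but zero complexity by this measure:
print(complexity_quotient(100, 0))   # 0
```

Note how the second case captures the point of the definition: a system with many parts but no interactions scores zero, because it's the interdependence, not the part count alone, that creates complexity.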
Dörner next makes a profound statement, at least in my opinion. He says that "complexity is not an objective factor but a subjective one." Imagine, as an example, the everyday activity of driving a car to and from work. For someone my age, who has been driving for the past few decades (I won't say how many decades!), driving a car in busy traffic might be frustrating at times, but it's fairly straightforward. However, put a new driver behind the wheel in the middle of Chicago rush hour traffic, and you may find a completely different perspective on how hard it is to drive in traffic. The key here is something that Dörner calls "supersignals." For the experienced driver, rush hour traffic is not made up of hundreds of different elements that must be interpreted individually; rather, he or she is processing information in aggregate and by "gestalt."
Supersignals reduce complexity by collapsing a number of features together into one. Think about how we look at someone's face. We don't see all the contours, surfaces, and color variations. Instead, we see just one face in aggregate. Because of these supersignals, complexity must be understood subjectively from an individual's perspective. We learn these supersignals through experience and training. Dr. Gary Klein suggests that experts make decisions by looking at the aggregate, recognizing a pattern they've experienced before, and acting on it (he calls this recognition-primed decision making).
It seems like a simple concept, but I found it to be much more profound. Interestingly, some of the other topics in The Logic of Failure reminded me of the computer game SimCity. More on that in my next post.