I just finished reading The Design of Everyday Things by Don Norman (for those of you who are paying attention, the book was on the 2021 Leadership Reverie Reading List). The book is a best-seller that was first published in 1988 under the title The Psychology of Everyday Things. I read the updated, revised, and expanded edition that was first published in 2013.
Several years ago, I took a six-week course in the science of continuous improvement at Cincinnati Children's Hospital Medical Center called the Intermediate Improvement Science Series, or I2S2 for short. One of the assignments was to choose from a list of recommended books and present an overview to the rest of the class (basically, write and present a book report). I actually selected another book for my report - The Fifth Discipline by Peter Senge - but I have wanted to read The Design of Everyday Things ever since. A few years later, I finally looked the book up on Amazon. The reviews weren't all that great, so I decided to hold off on reading it.
Last month, I took a chance and decided to check the book out from our local library. I can honestly say that I really enjoyed it. Perhaps I would feel differently if I were a subject matter expert in human-centered design, but I thought the book was very informative and readable.
When I first heard of the book in my improvement science class, my initial reaction was, "What does a book on human-centered design have to do with continuous improvement?" I was surprised - and I think you will be too - that the answer to my question was "plenty." There are so many parallels between human-centered design and quality improvement that I would argue they are almost one and the same.
As an example, we often talk about a blame-free or just culture when it comes to patient safety in health care. Individuals in high reliability organizations work in a blame-free environment, where they are able to report errors or near misses without fear of reprimand or punishment. Errors almost always result from defects in the system. Norman writes about this concept in his book: "We need to remove the word failure from our vocabulary, replacing it instead with learning experience. To fail is to learn: we learn more from our failures than from our successes." He goes on to advise, "Do not blame people when they fail to use your products properly. Take people's difficulties as signifiers of where the product can be improved." In other words, humans make mistakes. When mistakes are made, redesign the system to prevent them from occurring again.
Norman describes in great detail the different ways that designers can force the desired behavior. For example, he talks about using a forcing function (also called a poka-yoke in the Toyota Production System or Lean/Six Sigma) to "force" individuals to do the right thing. As an example, drivers are unable to take a car out of park without first pressing down on the brake pedal. Similarly, most ATMs "force" you to take your bank card back before the cash is dispensed, so that you can't walk away without it.
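For the programmers in the audience, here is how I might sketch that ATM forcing function in code. This is purely my own illustration (the class and method names are hypothetical), not an example from the book: the design simply refuses to dispense cash until the card has been taken back.

```python
class ATM:
    """Toy illustration of a forcing function (poka-yoke): the machine
    will not dispense cash until the card has been taken back, so the
    desired behavior is forced by the design itself."""

    def __init__(self):
        self.card_inserted = False
        self.card_taken_back = False

    def insert_card(self):
        self.card_inserted = True
        self.card_taken_back = False

    def take_card_back(self):
        if not self.card_inserted:
            raise RuntimeError("No card to take back")
        self.card_inserted = False
        self.card_taken_back = True

    def dispense_cash(self, amount):
        # The forcing function: this step is blocked until the card step is done.
        if not self.card_taken_back:
            raise RuntimeError("Take your card first - no cash until then")
        print(f"Dispensing ${amount}")


atm = ATM()
atm.insert_card()
# atm.dispense_cash(100)  # would raise - the design forces card removal first
atm.take_card_back()
atm.dispense_cash(100)    # now allowed
```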
Norman goes on to talk about the different kinds of forcing functions - interlocks, lock-ins, and lockouts. For example, have you ever walked down the stairs of a public building and noticed a gate placed at the ground floor? If you open the gate, you can walk down another flight of stairs to the basement. The gate's sole purpose is to prevent people who are rushing down the stairs to escape a fire from (mistakenly) continuing into the basement, where they could be trapped - it is Norman's example of a lockout. I have noticed those gates, but I never knew what they were for - mind officially blown!
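Again, purely as my own hypothetical sketches (the taxonomy is Norman's, the scenarios and function names are mine): an interlock forces operations into a safe sequence, a lock-in keeps an operation from being stopped prematurely, and a lockout keeps you out of a dangerous state.

```python
def start_microwave(door_closed: bool):
    # Interlock: force a safe sequence of operations -
    # the oven cannot run while the door is open.
    if not door_closed:
        raise RuntimeError("Interlock: close the door before starting")
    print("Heating...")


def quit_editor(unsaved_changes: bool, user_confirmed: bool):
    # Lock-in: keep the operation active until it is safe to stop,
    # like an editor that blocks quitting while work is unsaved.
    if unsaved_changes and not user_confirmed:
        raise RuntimeError("Lock-in: save (or confirm) before quitting")
    print("Goodbye")


def enter_basement(fire_alarm_active: bool):
    # Lockout: prevent entry into a dangerous place, like the stairwell
    # gate that stops people fleeing a fire from running past the
    # ground floor into the basement.
    if fire_alarm_active:
        raise RuntimeError("Lockout: exit at the ground floor")
    print("Entering basement")
```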
Norman talks about something that he calls the "Iterative Cycle of Human-Centered Design," which he divides into four stages (admittedly, there are other versions of this same cycle in the design literature):
1. Observation
2. Idea generation (ideation)
3. Prototyping
4. Testing
Since this is a cyclical process (not a linear one), the "Testing" stage feeds back into the "Observation" stage at the end. If you are familiar with quality improvement, you will recognize this as a slightly different version of a Plan-Do-Study-Act or PDSA cycle. Taking it one step further, Norman recommends using small tests of change so that you can, as David Kelley (Stanford professor and co-founder of IDEO) puts it, "fail frequently, fail fast." If the results of the test are negative, you can quickly move on to the next test without wasting further time and resources.
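For readers who think in code, the loop might be sketched like this. To be clear, this is my own loose analogy, not anything from the book, and every function below is a hypothetical stand-in for a real (human) activity:

```python
import random


# Hypothetical stand-ins for the real (human) activities in the cycle.
def observe(problem):       return f"observations about {problem}"
def generate_ideas(obs):    return [f"idea prompted by {obs}"]
def build_prototype(ideas): return {"based_on": ideas[0]}
def run_test(prototype):    return {"passed": random.random() > 0.5,
                                    "learnings": "what we saw during testing"}


def design_cycle(problem, max_iterations=5):
    """Observation -> ideation -> prototyping -> testing, looping back
    to observation - the same shape as a Plan-Do-Study-Act cycle."""
    prototype = None
    for _ in range(max_iterations):
        ideas = generate_ideas(observe(problem))
        prototype = build_prototype(ideas)
        result = run_test(prototype)      # keep each test small and cheap
        if result["passed"]:
            return prototype              # adopt the change
        # "Fail frequently, fail fast": a negative test simply seeds
        # the next loop instead of sinking more time into this one.
        problem = result["learnings"]
    return prototype


design_cycle("patients missing follow-up appointments")
```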
As if all this weren't enough, Norman dedicates an entire chapter ("Human Error? No, Bad Design") to the different kinds of human error and the design principles that can be used to mitigate them. I have to say that this chapter was one of the best discussions of human error that I have ever read. Norman classifies errors as either slips ("a slip occurs when a person intends to do one action and ends up doing something else") or mistakes ("a mistake occurs when the wrong goal is established or the wrong plan is formed"). Notably, slips are more likely to be made by experts, while mistakes are more common among novices. Norman talks further about rule-based, skill-based, and knowledge-based mistakes. He also talks about error reporting and detection, the use of checklists, and the root-cause analysis technique.
Overall, I think this would have been a great book to read during my improvement science class. I am glad that I finally read it, and I would highly recommend it for anyone interested in improvement science, patient safety, or human-centered design.