Friday, September 30, 2022

Find your way back!

I have always loved reading about the history of rock-n-roll!  Rock-n-roll bands are just like any other group, team, or organization in that the relationships between individuals in the group ebb and flow.  It's rare for a band's members to stay together for the band's entire lifetime (of course there are always exceptions).  As one notable example, look at the evolution of the supergroup "Jefferson Starship".  The band "Jefferson Airplane" was founded in San Francisco in 1965.  They famously appeared at Woodstock and had a number of hits, including "Somebody to Love" and "White Rabbit".  Due to creative differences, the band split into two groups in 1974 - "Jefferson Starship" and "Hot Tuna".  One of the former's hits was the song "Find Your Way Back", released in 1981.  It is this exact song that came to mind when I read the following quote by the Roman emperor and Stoic philosopher Marcus Aurelius, from his book Meditations:

"When forced, as it seems, by circumstances into utter confusion, get a hold of yourself quickly.  Don't be locked out of the rhythm any longer than necessary.  You'll be able to keep the beat if you are constantly returning to it."

Life is full of surprises.  There are always going to be times when things don't go as planned.  We may wake up one morning to find that our car has a dead battery and won't start.  We may have to change a flat tire on our drive to the beach.  As leaders, we may have to deal with the unexpected departure of one of our key employees.  These are the little surprises that we experience on an almost daily basis.  But then there are the so-called "black swan" events that we have to deal with too.  As the last two years have so clearly highlighted, as leaders we are going to have to deal with events such as natural disasters, cyber attacks, or even a generation-defining worldwide pandemic.

The important aspect to all of this is that we can't let chaos reign supreme.  What matters is that we bounce back quickly.  The synonyms for the word "resilience" include flexibility, pliability, give, spring, elasticity, and plasticity.  What do all of these words have in common (other than being synonyms for resilience)?  They all involve bouncing back.  Whenever we encounter a difficult time in our lives - either personally or professionally - the key is that we find our way back to our baseline.  Find your way back.

Wednesday, September 28, 2022

Do your best

Eddy Merckx, the greatest cyclist of all time, once said, "It is not enough to have talent alone.  You will have to train hard and motivate yourself to do your best."  Merckx certainly had talent, but it was more than just talent that helped him win the Grand Tours of professional bicycling a record eleven times (five wins in the Tour de France, five wins in the Giro d'Italia, and one win in the Vuelta a España), all five of the so-called Monuments (the five oldest and hardest one-day bicycle road races), and three World Championships.  Merckx also had several major victories in track cycling and set the hour record (the greatest distance ridden in one hour).  He was nicknamed "The Cannibal" (apparently the daughter of one of his teammates called him that when her father suggested that Merckx never let anyone else win) and amassed over 525 victories during his 18-year career.  Merckx worked hard at his craft and excelled because of it.

I was taught from a very early age the importance of working hard and doing your best.  I enjoyed my time as a Boy Scout, starting around age seven and continuing through the end of high school, eventually earning Scouting's highest award, the Eagle Scout Award.  Scouting has had some difficult times as of late, but the values I learned back then undoubtedly played a major role in who I am today.  Scouting helped teach me to always do my best.  For example, the Cub Scout motto is "Do your best."  Similarly, the Boy Scout Oath begins with the statement, "On my honor, I will do my best..."

I have always been inspired by reading about legends like Eddy Merckx.  This past week, I came across another inspirational story that involved former President James Carter, who has had a very long, distinguished, and successful post-presidency (some would argue that his contributions to our country have been greater after his presidency than during his term in office).  President Carter is the only one of the 45 U.S. Presidents who attended and graduated from the U.S. Naval Academy (only two other Presidents attended the military service academies, with Presidents Grant and Eisenhower graduating from West Point).  Incidentally, President Carter served as the Scoutmaster for his three sons, but the only U.S. President to have earned the Eagle Scout Award was President Gerald Ford.

President Carter was a submarine officer and was hand-selected by Admiral Hyman Rickover to enter the Navy's nuclear submarine program.  Importantly, during the early days of the nuclear Navy, Rickover hand-selected all of his officers (Rickover did so much to establish the Navy's nuclear program that he was known as the "Father of the Nuclear Navy").  President Carter would later say that other than his parents, Rickover had the most influence on his life.

President Carter tells the story of how he earned the opportunity to interview with Admiral Rickover for a position in the fledgling nuclear submarine program in 1952.  Admiral Rickover interviewed Carter for two to three hours, asking questions about strategy, tactics, physics, literature, and history.  Towards the end of the interview, Admiral Rickover asked, "Where were you ranked in your class at the Naval Academy?"  Carter felt that the interview was definitely going his way and replied, "I was ranked 59th in a class of 840, sir."

Admiral Rickover wasn't impressed.  He followed with another question, "Did you always do your best?"  Carter thought for a moment about answering instinctively with a statement about how he always did his best.  Something inside him caused him to pause.  He later said, "I recalled several of the many times at the Academy when I could have learned more about our allies, our enemies, weapons, strategy, and so forth."  So, instead of giving Admiral Rickover the answer that he wanted to hear (or at least so Carter thought), he replied honestly, "No, sir, I didn't always do my best."

Admiral Rickover remained silent.  He looked at Carter for an uncomfortably long time before asking one final question, "Why not?"  At that point, the interview was over.  Carter never forgot that question or the lesson that accompanied it.  He ended up getting selected for the nuclear submarine program, but more importantly, he learned the importance of always doing his best.  As a matter of fact, his campaign autobiography, published in 1975 as he prepared to run for President, was titled "Why Not the Best?"

"Why not the best?"  Merckx, "The Cannibal" said, "You need just as much drive to be a good businessman as you do to excel as an athlete."  He is right of course.  Always do your best, no matter where you are or what you do.

Monday, September 26, 2022

Aristotle's Golden Mean

Do you remember the story of Daedalus and Icarus from Greek mythology?  I have always loved this story.  Apparently, Daedalus built the famous Labyrinth (the subject of another fascinating story involving the Minotaur) for King Minos on the island of Crete.  It was also Daedalus who suggested that Princess Ariadne give the hero Theseus the ball of twine that he used to escape from the Labyrinth after he slew the Minotaur.  King Minos was furious and imprisoned Daedalus and his son Icarus in a tower.  Daedalus designed two sets of wings (one for him and one for his son) made out of branches, feathers, and wax.  After learning to fly while using the wings, Daedalus taught Icarus to fly too, cautioning him never to fly too close to the water (the wings would get wet and not work) or too close to the sun (the wax in the wings would melt).  Both Daedalus and Icarus flew out of the tower and escaped.  However, Icarus soon became overconfident and flew higher and higher towards the sun.  Eventually, just as Daedalus had predicted, the sun melted the wax in the wings, and Icarus tumbled quickly, falling into the water and drowning.  Daedalus, left with no choice, continued to fly towards safety.

Clearly, Icarus should have chosen the middle way.  As it turns out, there is a lot of wisdom in choosing the middle way.  Last January, I mentioned Aristotle's principle of the Golden Mean (see "Juste Milieu").  At that time, I was discussing two concepts known as Imposter Syndrome and the Dunning-Kruger Effect.  Finding a balance between these two opposite extremes (the former referring to when competent individuals lack self-confidence and the latter referring to when individuals overestimate their degree of competence) is an important aspect of leadership, which many have called "confident humility".  I ended that post with a quote from Adam Grant:

Great thinkers don't harbor doubts because they're imposters.  They maintain doubts because they know we're all partially blind and they're committed to improving their sight. They don’t boast about how much they know; they marvel at how little they understand. They’re aware that each answer raises new questions, and the quest for knowledge is never finished. A mark of lifelong learners is recognizing that they can learn something from everyone they meet. Arrogance leaves us blind to our weaknesses. Humility is a reflective lens: it helps us see them clearly. Confident humility is a corrective lens: it enables us to overcome those weaknesses.

I wanted to go back to the discussion of finding the right balance.  Aristotle's basic principle of the Golden Mean, found in Book II of his Nicomachean Ethics, is moderation - finding the right balance between extremes.  It is a beautiful concept, and while Aristotle laid out the philosophical principle more than 2,300 years ago, similar concepts can be found in the Confucian Doctrine of the Mean and the Buddhist concept of the Middle Way.  The Oracle at Delphi in ancient Greece bore the inscription, "Nothing in excess."  And in Islam, there's a saying: "Every praiseworthy characteristic has two blameworthy poles."  Moral behavior, then, is found in the middle of two extremes - at one end there is excess and at the other end deficiency.  Every virtue is thus the mean between two opposite extremes.  For example, courage is the mean between audacity (excess) and cowardice (deficiency).

The Jewish philosopher Maimonides said, "If a man finds that his nature tends or is disposed to one of these extremes, he should turn back and improve, so as to walk in the way of good people, which is the right way.  The right way is the mean in each group of dispositions common to humanity; namely, that disposition which is equally distant from the two extremes in its class, not being nearer to the one than to the other."

I might disagree with Maimonides somewhat (I realize that is being audacious!).  I would say that the Golden Mean is not necessarily a perfect arithmetical mean.  As an analogy, think of the game of baseball.  Great hitters talk about finding the "sweet spot" on a baseball bat.  If you hit a ball exactly at this spot of the bat, it will make a perfect sound, the ball will fly off the bat with maximum velocity, and the batter won't feel any shock in his or her hands.  Importantly, the "sweet spot" of the bat isn't exactly half-way between the ends of the bat (it's commonly identified with the bat's center of percussion, several inches in from the barrel end).  When we are looking for our own "sweet spot", we don't necessarily have to find the exact middle.

Finding balance in both your personal and professional life is important.  Choosing a path towards the middle and avoiding the extremes is really a good strategy for success in leadership and life.   

Saturday, September 24, 2022

"Broken like an egg-shell or squashed like a gooseberry"

This past spring break, I finished a biography of Winston Churchill by the British historian and writer Andrew Roberts called Churchill: Walking with Destiny.  You've probably caught Roberts a few times this past month talking about the recent death of Queen Elizabeth II.  The book is massive (it's just over 1,100 pages long), but the subject probably deserved that amount of coverage.  I actually couldn't put the book down, and it was one of those rare books that I didn't spend the last 50 pages or so wondering when the author would finally finish or why the editors didn't do their jobs!  I can't wait to tackle William Manchester's classic The Last Lion (a three-volume biography of Churchill) and Erik Larson's The Splendid and the Vile.  I will admit that I am probably biased when it comes to Churchill.  He certainly had his flaws, but I would definitely place him on my "Mount Rushmore of leadership" for what he accomplished during his career and how he led.

While I certainly learned a lot more about Churchill after reading Roberts' biography, there was one episode in his life that I found very interesting, perhaps because it was recently mentioned in The Daily Stoic.  As it turns out, there are all kinds of "what if" scenarios around this story, because in the winter of 1931, history came incredibly close to losing Churchill before he had a chance to help lead the world through World War II.

Churchill had traveled to the United States to deliver a series of lectures on "the Pathway of the English-Speaking Peoples" in December of that year.  He was attempting to generate some additional money to offset some of his financial losses in the stock market.  He was scheduled to deliver one of these lectures at the Brooklyn Academy of Music in New York City on December 14th.  At 10:30 PM the night before, he set out to meet his friend, the American financier and statesman Bernard M. Baruch, at his residence at 1055 Fifth Avenue.  

Apparently, both Churchill and his taxi driver were confused about the building numbers, and Churchill got out of the cab on the wrong side of the street - in the middle of Fifth Avenue, between 76th and 77th Streets - and tried to cross against the light.  Having spent most of his life in England, where they drive on the wrong side of the street (in the United Kingdom, traffic keeps to the left - a shameless jab at my British friends!), Churchill looked to his left (he should have looked right), saw no one coming, and proceeded to walk across the street.

Churchill was hit by a car traveling 30 mph.  The driver of the car was an unemployed mechanic named Edward Cantasano (misreported by journalists at the time as Mario Contasino), whose car dragged Churchill several yards before flinging him into the street.  Churchill bruised his right chest, fractured two ribs, sprained his right shoulder, and cut his forehead and nose.  He probably had a concussion as well.

Churchill recalled hearing someone shout that a man had been killed.  When he came to his senses, he saw a police officer standing over him.  He was quickly taken to a local hospital, where he would spend the next several days.

Churchill took 100% of the blame for the accident.  As a matter of fact, he asked the police to absolve Mr. Cantasano of any responsibility for the accident whatsoever.  Churchill was concerned that Cantasano would still be blamed for the accident and would later have trouble finding work.  Who would want to hire the person who had almost killed Winston Churchill, after all?  Churchill finally arranged to meet him at the Waldorf-Astoria Hotel, served him tea, and gave him an autographed copy of his book The Unknown War (the fifth and final volume of his excellent history of World War I, The World Crisis).

Churchill would later write of the incident, "I do not understand why I was not broken like an egg-shell or squashed like a gooseberry. I have seen that the poor policeman who was killed on the Oxford road was hit by a vehicle travelling at very much the same speed and was completely shattered. I certainly must be very tough or very lucky, or both."

As a humorous aside, Churchill's physician, a Dr. Otto Pickhardt, wrote a note for him (see below) prescribing alcoholic beverages "for medicinal purposes only" (remember that the sale of alcohol was still prohibited in the U.S. under the Eighteenth Amendment, ratified in 1919 and not repealed until the Twenty-first Amendment was passed in 1933).

[Image: Dr. Otto Pickhardt's prescription note for Churchill]

If you read closely, it says "minimum requirements would be 250 cubic centimeters" (roughly a third of a standard 750 ml bottle of wine).  Churchill was a fan of Johnnie Walker Red Label and Pol Roger Champagne, so one can only wonder how effective this unusual prescription was at dulling the pain from his accident.  However well it worked, he was able to write up the story of his accident and sell it to The Daily Mail, which published it in two parts as "My New York Misadventure" on the 4th and 5th of January, 1932.

So what does all of this have to do with leadership?  What's the take-home message here?  First, I think this story is a great example of forgiveness.  Churchill was more worried about what this accident would do to Cantasano than how it would impact him personally.  It's not always easy to forgive someone when they cause us pain - physical (as in this case) or mental.  But we should.

Second, I think this story sends a powerful message about resilience and perseverance.  I've talked a lot about Stoic philosophy in the past, and I think one of the fundamental tenets of Stoicism certainly applies here.  There are always going to be events, problems, or issues in our lives that are beyond our control.  Rather than worry about them, we should just move on.  There's another great quote from Churchill's essay "My New York Misadventure".  He wrote, "Nature is merciful and does not try her children, man or beast, beyond their compass.  It is only where the cruelty of man intervenes that hellish torments appear.  For the rest - live dangerously; take things as they come; dread naught, all will be well."

All will be well.

Thursday, September 22, 2022

"...like a stand of trees"

In a recent post ("Wicked"), I mentioned the Prussian general and military strategist Carl von Clausewitz, who is perhaps best known today for his book On War, written between 1816 and 1830 and published posthumously by his wife in 1832.  The book is an unfinished work and consists of a series of essays that von Clausewitz wrote on military theory, strategy, and tactics following the Napoleonic wars. The book has broader application to strategy and tactics in general, often finding its place alongside a similar book by Sun Tzu (The Art of War) on the bookshelves of business leaders.

My last couple of posts have focused on one of the classic texts in safety science, Dietrich Dörner's The Logic of Failure.  Dörner actually quotes a passage from Carl von Clausewitz when he talks about a concept he calls "methodism", which he defines as "the unthinking application of a sequence of actions we have once learned" to solve complex or so-called wicked problems.  And why wouldn't we try to solve problems using the tools and techniques that have been successful for us in the past?  As Dörner explains, we have an attachment to the tried and true.  Unfortunately, what has worked for us well in the past often leads to an oversimplification of an otherwise incredibly complex and complicated problem.

Here is the passage (shortened somewhat, but hopefully the point is still clear) from von Clausewitz:

"So long as no acceptable theory, no intelligent analysis of the conduct of war exists, routine methods will tend to take over even at the highest levels.  Some of the men in command have not had the opportunities of self-improvement afforded by education nor contact with the highest levels of society and government...Their only insights are those that they have gained by experience.  For this reason, they prefer to use the means with which their experience has equipped them, even in cases that could and should be handled freely and individually."

Von Clausewitz goes on to describe war as a complex problem:

"War is not like a field of wheat, which, without regard to the individual stalk, may be mown more or less efficiently depending on the quality of the scythe; it is like a stand of mature trees in which the ax has to be used judiciously according to the characteristics and development of each individual trunk."

Dörner explains this passage further by stating, "In many complex situations, considering a few 'characteristic' features of the situation and developing an appropriate course of action in the light of them is not the essential point.  Rather, the most important thing is to consider the specific, 'individual' configuration of those features and to develop a completely individual sequence of actions appropriate to that configuration."

He writes on, "The methodist is not able to cope with the specific, individual configurations, and he uses one or the other depending on the general features of the situation as a whole.  He does not take into account the individuality as it is evidenced in the specific configuration of the features."

Methodism, which I think greatly oversimplifies the complexity of a wicked problem (see my posts on the High Reliability Organization principle of "Reluctance to Simplify"), can lead to failure because "a change in some minor detail that does not alter the overall picture of the situation in any appreciable way can make completely different measures necessary to achieve the same goal."  The metaphor used by von Clausewitz of felling trees in a forest is very appropriate here.  We have to adjust the way that we cut down each tree by carefully analyzing the tilt of the tree, the position of surrounding trees, the shape of the tree's crown, whether the tree has twists in the trunk, and even the direction of the prevailing winds.  Any one of these features may force us to re-think how we cut down an individual tree, yet collectively they do not change the overall appearance of the forest when we look at it in aggregate.

Dörner mentions a famous experiment in cognitive psychology called the "water jug problem" first published in 1942 by Abraham and Edith Luchins.  The problem is a great example of something known as the "Einstellung effect", and it even appeared in the 1995 movie Die Hard with a Vengeance starring Bruce Willis and Samuel L. Jackson (see the movie clip here).  Basically, in the version of the problem used in the movie, you have in front of you a 5-gallon water jug and a 3-gallon water jug.  You also have access to a water fountain, so you can fill up both water jugs as many times as you want.  Your job is to put 4 gallons of water exactly into one of the jugs.  How do you do it?

Think about it.  How would you solve this problem?  As Abraham and Edith Luchins, the villain in the movie Die Hard with a Vengeance, and Dietrich Dörner would all predict, most of us try to solve this problem using methods that have been successful for us in the past when faced with similar problems.  Unfortunately, those familiar methods often fail us here (and, in the movie, failure means the villain wins because a bomb explodes).

Here is a step-by-step explanation of how you can solve the "water jug problem" from the movie:
  • Fill the 5-gallon jug to the top (the 5-gallon jug has 5 gallons of water now)
  • Pour water from the 5-gallon jug into the 3-gallon jug and fill it all the way up (the 5-gallon jug now has 2 gallons and the 3-gallon jug now has 3 gallons of water)
  • Dump out the water from the 3-gallon jug (which now is empty)
  • Pour the water (2 gallons) from the 5-gallon jug into the 3-gallon jug (the 3-gallon jug now has 2 gallons of water, and the 5-gallon jug is empty)
  • Fill up the 5-gallon jug all the way (the 3-gallon jug still has 2 gallons, and the 5-gallon jug has 5 gallons)
  • Pour water from the 5-gallon jug into the 3-gallon jug, filling up the 3-gallon jug (the 3-gallon jug now has 3 gallons and the 5-gallon jug now has 4 gallons!!)
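
If you would rather not puzzle it out by hand, the same answer can be found mechanically.  Here is a minimal sketch of my own (not from Dörner's book or the Luchins experiments) that finds the shortest sequence of fills, dumps, and pours by breadth-first search over the possible jug states:

```python
# A minimal sketch (my own illustration, not from the book): solve the two-jug
# puzzle by breadth-first search over jug states instead of a memorized recipe.
from collections import deque

def solve_jugs(cap_a=5, cap_b=3, target=4):
    """Return the shortest list of (a, b) states from empty jugs to `target`
    gallons in either jug.  Defaults match the Die Hard with a Vengeance scene."""
    start = (0, 0)
    parents = {start: None}
    queue = deque([start])
    while queue:
        a, b = queue.popleft()
        if a == target or b == target:
            path, state = [], (a, b)
            while state is not None:          # walk parent links back to the start
                path.append(state)
                state = parents[state]
            return list(reversed(path))
        pour_ab = min(a, cap_b - b)           # how much A can pour into B
        pour_ba = min(b, cap_a - a)           # how much B can pour into A
        moves = [
            (cap_a, b), (a, cap_b),           # fill A, fill B
            (0, b), (a, 0),                   # empty A, empty B
            (a - pour_ab, b + pour_ab),       # pour A into B
            (a + pour_ba, b - pour_ba),       # pour B into A
        ]
        for nxt in moves:
            if nxt not in parents:
                parents[nxt] = (a, b)
                queue.append(nxt)
    return None                               # target not reachable

if __name__ == "__main__":
    print(solve_jugs())   # [(0, 0), (5, 0), (2, 3), (2, 0), (0, 2), (5, 2), (4, 3)]
```

Run with the default capacities, it recovers the same six-move sequence described above.  The point, of course, is that a systematic search is immune to the Einstellung effect - it carries no habits over from previous puzzles.
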
For those of you who have been reading my last several posts, you may have noticed that Dörner recommends attacking complex problems with an algorithmic approach or sequence as well.  Importantly, he is not advocating "methodism" here.  If you examine the sequence that he recommends more closely, it is a general approach that involves experimentation and sounds a lot like either the Scientific Method widely taught in schools or the PDSA cycle (Plan-Do-Study-Act) used in quality improvement today.  Here is my side-by-side comparison of all three techniques:

[Table: side-by-side comparison of Dörner's sequence, the Scientific Method, and the PDSA cycle]

One last point.  I mentioned Gary Klein's recognition-primed decision-making model (RPD model) in the first post ("It's complicated...") on Dietrich Dörner's The Logic of Failure.  On the surface, Klein's model, which explains how experts can make quick - almost automatic - decisions in complex situations, would seem to contradict Dörner's admonition against using past experience to make quick assessments and decisions.  At the risk of having my cake and eating it too, I actually don't think these two models contradict each other at all.  If you look closely at Klein's RPD model, it's very similar to Dörner's model above.  The key differences are that time is significantly compressed in Klein's model (in terms of the time available to go through the steps in each model), and most of the recognition-primed decision-making occurs in one person's (the expert's) thought processes.  

[Figure: Gary Klein's recognition-primed decision-making (RPD) model]

As you can see in the Figure above, the key branch points and steps in Klein's model are:

1. Is the situation familiar?  If not, reassess the situation and seek additional information. If so, proceed to the next step.
2. Define goals
3. Develop a model/plan (here, Klein calls it a mental simulation of action)
4. Will the model/plan work?  If not, modify it.  If so, proceed to the next step.
5. Execute the plan and review the results.

Dealing with complexity is not easy.  That is perhaps one of the reasons so much has been written on it.  I was very interested in reading Dietrich Dörner's The Logic of Failure, and I certainly can understand why it has become a classic text in the field of safety science.  I would like to move on to another classic in future posts, Barry Turner and Nick Pidgeon's book, Man-Made Disasters.

Tuesday, September 20, 2022

Beer Game

During business school, I participated in a simulation called "The Beer Game."  The "Beer Game" was developed by Jay Wright Forrester at the MIT Sloan School of Management in 1960 and is probably the best illustration of the importance of logistics and supply chain management.  I first learned about the "Beer Game" in the book "The Fifth Discipline" by Peter Senge.

There are a number of simulations available for free online.  Game play is fairly straightforward.  Individuals play the role of a brewer, a distributor, a wholesaler, or the manager of a local retail store (in some games that I've seen, the distributor and the wholesaler are the same role).  The game's objective is simple in concept, but difficult in execution - keep up with the changing customer demand for beer.  The trick is to look at these individual players as being part of a system.  During the first few rounds of the game, the system establishes a certain equilibrium where beer moves through the supply chain without any significant problems.  

Once an equilibrium is established, the game adds in a new twist.  A popular singer or famous professional athlete appears in a video drinking a certain brand of beer, and when the video goes viral, demand for that particular brand of beer significantly increases.  The manager orders more beer of that brand, but the supply chain is unable to keep up.  However, the manager continues to order more in order to meet the demand.  As with all popular fads, the demand for the brand of beer quickly returns to its baseline.  Unfortunately, the orders for the brand of beer have already been placed.  Soon, the local retail store has a huge supply of the once popular brand of beer, but unfortunately the demand is just no longer there.

The "Beer Game" is a great illustration of a concept known as the "bullwhip effect".  The "bullwhip effect" (or "whipsaw effect" as it is sometimes called) is a well-described problem in supply chain logistics in which the variability of orders grows as one moves upstream in the supply chain toward the production end.  Even when demand is stable (as in the initial equilibrium phase of the "Beer Game" above), small variations in demand at the retail end can dramatically amplify themselves upstream through the supply chain. The result is that order amounts become very erratic - they may be very high one week and then zero the next week.  The most recent example of the "bullwhip effect" occurred during the COVID-19 pandemic and involved the toilet paper supply chain (remember when you couldn't buy toilet paper because all the stores were out of stock?).  Just take a look at the great illustration from the website "sketchplanations" below:

[Illustration: the bullwhip effect, from sketchplanations]

The "Beer Game" is a really fun game!  The game also illustrates one of the common pitfalls that leaders fall into when dealing with complex systems.  I've been posting a lot about complex systems in the last few weeks, particularly since reading the book The Logic of Failure by Dietrich Dörner.  My first post, "It's complicated..." defined complexity as a concept and highlighted some of the important differences between complex systems versus complicated ones.  The next post, "Sim City" talked about a series of simulations that Dörner and his team conducted in order to develop his theory of why humans typically fail when trying to solve complex problems.  The third post, "Wicked" brought in the concept that complex problems are often "wicked problems" (as opposed to "tame problems") and presented Dörner's framework for trying to solve complex or wicked problems.  

One of the reasons that complex or wicked problems are so difficult to solve is that we, as decision-makers and problem-solvers, fail to deal properly with time lags.  One of the fundamental characteristics of our complex world is that there is frequently a time lag between the decisions we make and the subsequent impact of those decisions.  For example, consider the U.S. government's decision to pass an economic stimulus package.  When will the impact of that package be observed?  Certainly not immediately.  Unfortunately, in complex systems with time lags, we are extremely prone to "oversteer" or "overcorrect" (as in the "Beer Game" above).  As Aaron Renn writes in his review of The Logic of Failure, "We make decisions based on the present situation, without regard to the fact that our previous actions will have the intended effect in a future period."
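
To make the oversteering concrete, here is a toy, single-tier sketch of the dynamic (my own illustration - the actual Beer Game has four tiers and human players).  A retailer reorders up to a fixed inventory target based only on what is on the shelf, ignoring orders already in transit; a one-week bump in demand then sets off swings in orders and inventory that persist long after demand has returned to normal:

```python
# A toy, single-tier illustration of the bullwhip/oversteering effect.
# This is my own sketch, not the MIT Beer Game itself: one retailer, a fixed
# two-week delivery lead time, and a naive "reorder up to target" rule that
# ignores orders already in the pipeline.

def simulate(weeks=30, lead_time=2, base_demand=4, bump_week=5, bump_size=8, target=20):
    inventory = target - base_demand           # start in steady state
    pipeline = [base_demand] * lead_time       # orders already on their way
    history = []
    for week in range(weeks):
        demand = base_demand + (bump_size if week == bump_week else 0)
        inventory += pipeline.pop(0)           # this week's delivery arrives
        inventory -= demand                    # meet this week's demand
        # The mistake: reorder based on shelf stock alone, ignoring the pipeline.
        order = max(0, target - inventory)
        pipeline.append(order)
        history.append((week, demand, inventory, order))
    return history

if __name__ == "__main__":
    for week, demand, inventory, order in simulate():
        print(f"week {week:2d}  demand {demand:2d}  inventory {inventory:3d}  order {order:2d}")
```

Because each order only arrives after the two-week lead time, the reorder rule keeps chasing a target it has already (invisibly) met - the same overcorrection that shows up at every tier of the real game.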

We live in a complex and complicated world.  As leaders, we will be asked to make decisions which involve complex and complicated issues.  It's important to utilize some kind of framework to help solve these complex and wicked problems, and I plan to compare and contrast Dörner's framework with some other suggested frameworks in my final post on The Logic of Failure.

Sunday, September 18, 2022

Wicked

It's been a few years since my wife and I went to the theater.  When we lived in Cincinnati, we had season tickets to Broadway in Cincinnati.  Our original plan was to continue our once-a-month tradition and purchase season tickets to the theater in our new city.  Well, then COVID-19 happened and most of the shows were canceled.  I suppose that we've been procrastinating.  It's probably time to start going back to the theater, as one of our favorite shows is coming to town soon.  We've seen the show "Wicked" a few times, even once in New York City.

I'm reminded (sorry, my brain just works that way) of something that Keith Grint has called a "wicked problem" (as opposed to a "tame problem").  I've posted about "wicked problems" at least once before. Simply stated, "wicked problems" are both complicated and complex and probably have never occurred before.  The solutions to wicked problems aren't readily apparent, and they may be as complicated and complex as the problem itself.

"Wicked problems" are particularly prone to what Dietrich Dörner (see my last two posts) calls "the logic of failure".  He recommends the following sequence of steps (an algorithm, if you will) for attacking complex problems.  I would suggest that this sequence could help solve wicked problems too.

Dörner suggests that the first step is to set clear, unambiguous goals.  I like using SMART goals - goals should be specific, measurable, actionable, realistic and relevant, and time-bound.  While Dörner doesn't necessarily state that we should use SMART goals, he does caution against focusing on general goals.  If we do not state our goals clearly, we tend to fall into the trap of what he calls a "repair shop" mentality - we try to fix whatever problems that we can find.  

It's important to delineate all the goals, including what Dörner calls "implicit goals" (goals that are just as important, but perhaps not clearly stated).  For example, "Stop insects from eating crops" is an explicit goal (but of course, not a very SMART one), but by accomplishing that goal, we don't want to destroy the local ecosystem (the implicit goal).  We usually do not take into account these implicit goals, and we may not even know they are a goal.  Dörner uses another example.  For someone who is already healthy, "maintaining health" would be an implicit goal - not clearly stated, but perhaps just as important and relevant.

The second step in Dörner's approach is to gather information and analyze data.  Again, while he does not clearly state it in these terms, he does recommend that we should avoid oversimplifying the problem (the High Reliability Organization principle of "Reluctance to Simplify").  Collecting too much data is just as problematic as not collecting enough.  I like what Jeff Bezos calls the "70% rule" - make a decision when you have about 70% of the information that you need.

Next, Dörner says that we should make predictions and extrapolate from the data that we collected in step 2.  In other words, make a plan and then act on it.  Dörner cautions against something that Carl von Clausewitz called "methodism" (more on this in a future post), that tendency we all have to restrict our actions to the ones that have worked well for us in the past.  Dörner writes, "To be successful, a planner must know when to follow established practice and when to strike out in a new direction."

Finally, after we've executed our plan, Dörner says we should review the results that we achieved and make any necessary changes to our plan.  Several of you may have noticed that Dörner's approach is very similar to the PDSA cycle (Plan-Do-Study-Act) used in quality improvement today.  That makes a lot of sense to me - PDSA cycles are often used to tackle complex or wicked problems.

Since I started with the Broadway musical "Wicked", I will end with a quote from the character Elphaba ("The Wicked Witch of the West").  She said, "Some things I cannot change.  But 'til I try, I'll never know."  Complex or wicked problems are like that too.  It would be very easy to say that they are just too hard to tackle, but until we try, we will never know for sure.

Thursday, September 15, 2022

Sim City

Do you remember the computer game "SimCity"? There were several different versions of the game, but essentially the object of the game was to build and design your own city.  The player, acting as the mayor of the city, starts off with a blank geographic map (in later versions of the game, you could even change the geography).  He or she can then designate certain areas as industrial, commercial, or residential, after which they can build away!  Players frequently have to contend with natural disasters, such as tornadoes, fires, earthquakes, and floods (there was even a giant monster that could attack the city).  They can build amusement parks or industrial parks, marinas or golf courses, and apartment complexes or residential neighborhoods.  It's a really fun game - see the screen shot below of a newly constructed city:

[Screenshot: a newly constructed city in SimCity]

Don't worry - this post is not just about the SimCity computer game.  During my last post, I mentioned that I recently finished the book The Logic of Failure by Dietrich Dörner.  While the main focus of the book is on why we make mistakes, at times with catastrophic consequences, Dörner primarily uses the results from two groups of experiments to support his theoretical concepts.  I was reminded of SimCity while reading the book, as the two groups of experiments involved a simulation where subjects became the leaders of a fictional country (in the first group, the results of which were also published in the journal Philosophical Transactions of the Royal Society) or town (in the second group), similar to the SimCity game.  

The first group of simulations took place in the fictional African country of Tanaland.  Leaders (i.e. the study participants) were tasked with promoting the health and well-being of Tanaland's inhabitants and the surrounding region.  For example, they could improve the fertilization of the fields and orchards, install irrigation systems, or build dams.  They could introduce measures focused on improving access to medical care or build infrastructure such as power plants or roads.  Leaders were given free rein to introduce as many measures as they wanted at six "planning sessions" over the course of a simulated ten-year period.  In this way, they could evaluate the success or failure of each introduced measure at regular intervals and cancel or modify earlier decisions.

Several metrics were followed over the course of the simulation, including crop yield, population, birth rate, etc.   The individual results of these simulations are illustrative.  For example, one leader introduced measures to improve medical care and the supply of food to the region.  Initially, both the birth rate and life expectancy increased.  However, once the population of Tanaland hit a certain threshold, there was no longer enough food to support the growing population, and a famine occurred.  As Dörner himself explained, "Catastrophe was inevitable because a linear increase in the food supply was accompanied by an exponential increase in the population."
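
A quick back-of-the-envelope sketch (the numbers below are made up purely for illustration) shows how quietly that trap closes: a population growing by a fixed percentage each year will eventually overtake a food supply growing by a fixed amount each year, no matter how comfortable the starting margin looks.

```python
# Made-up starting values chosen only to illustrate the dynamic Dörner describes:
# food grows linearly (a fixed amount per year), while population grows
# exponentially (a fixed percentage per year), so the curves must eventually cross.
population, food = 1000.0, 2000.0    # people vs. annual food rations
growth_rate, extra_food = 0.05, 60   # 5% population growth vs. 60 extra rations per year

for year in range(1, 41):
    population *= 1 + growth_rate
    food += extra_food
    if population > food:
        print(f"Year {year}: population ({population:.0f}) finally outstrips the food supply ({food:.0f})")
        break
```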

Similarly, the second group of simulations took place in the fictional town of Greenvale, England.  Once again (and unrealistically, but that's the nature of the simulation), leaders (study participants) could exercise near dictatorial powers for ten years.  The town's single biggest employer was a municipally-owned watch factory.  Leaders could adjust local tax rates or change the hiring practices at the watch factory, introduce measures to improve medical care, or build more houses.  Again, just like the Tanaland simulation, leaders frequently succumbed to the law of unintended consequences.  

The key lesson from these simulations is that systems are prone to human failure when they are complex (see again my last post), dynamic (i.e. evolving over time), and intransparent.  The word "intransparency" means "lack of transparency," which in this context refers to the fact that in complex systems (in particular), information is often incomplete or hidden from view.  Economists often refer to something called "information asymmetry", where two individuals, groups, or teams have unequal access to information.  Collectively, these characteristics of complex systems are really what contribute to what Dörner called "the logic of failure", which he defined as the tendencies and patterns of thought (a natural result of our evolution) - such as taking one thing at a time, cause and effect, and linear thinking - that were probably appropriate in an older, simpler world but can prove disastrous in our complex world today.  For example, the study participant who increased access to medical care and improved the irrigation in Tanaland didn't anticipate the effects of both of these initiatives on what is now called the Malthusian Trap (named after the 18th century economist Thomas Malthus, who first described it).  The population in Tanaland grew exponentially, while the food supply continued to grow linearly.

Fortunately, there are some tricks and tools of the trade that leaders can use to help them effectively make decisions, even when the situation is complex.  Dr. Dörner had some suggestions as well.  And we will talk about some of his recommendations, as well as some of the recommendations from other experts, in my next post.

Tuesday, September 13, 2022

It's complicated...

I just read one of the classic books in safety science, The Logic of Failure by Dietrich Dörner.  There is more than one edition out I think, and the one I read (thank you to my local library) was the 1986 English translation of the original book that was published in German.  The premise of the book is perhaps best summarized by a statement in the blurb (yes, that's apparently the proper term, though I have also seen the term "flap copy" used) from the dust jacket:

"Dietrich Dörner, winner of Germany's highest science prize, here considers why - given all our intelligence, experience, and information - we make mistakes, sometimes with catastrophic consequences.  Surprisingly, he finds the answer not in negligence or carelessness, but in what he calls "the logic of failure": certain tendencies in our patterns of thought - such as taking one thing at a time, cause and effect, and linear thinking - that, while appropriate to an older, simpler world, prove disastrous for the complex world we live in now."

Unfortunately, I'm not fluent enough in German to read some of the original studies that Dr. Dörner referenced in his book.  Regardless, there were several interesting points made in the book that I would like to discuss in greater detail.  The first is how he defines and explains the concept of complexity.  There's been a lot written on the difference between complex and complicated.  One of the best explanations I've found is an article by Alexandre Di Miceli ("Complex or Complicated?").  He says, "A complicated system has a direct cause and effect relationship.  Its elements interact in a predictable way."  Complicated systems are controllable, often by following specific rules or algorithms.  Conversely, he says that complex systems are composed of elements that interact with each other in unpredictable ways.  It is these interactions that differentiate complex systems from merely complicated ones.  Di Miceli goes on to say:

"A car engine is complicated.  Traffic is complex."

"Building a skyscraper is complicated.  The functioning of cities is complex."

"Coding software is complicated.  Launching a software start-up is complex."

Dietrich Dörner would whole-heartedly agree with Di Miceli's explanation.  He writes, "Complexity is the label we give to the existence of many interdependent variables in a given system.  The more variables and the greater their interdependence, the greater that system's complexity."  He goes on to define something that he calls the "complexity quotient" as the product of the number of features within a system and the number of interrelationships between them.  For example, if there are ten variables and five links between them, the complexity quotient is fifty (10 x 5 = 50).  As another example, if there are one hundred variables that are completely unrelated (no interrelationships or links between them), the system's complexity quotient is zero (100 x 0 = 0).
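
The calculation is simple enough to sketch in a few lines of code.  Here is a small illustration (the toy "town" system below is entirely made up) that represents a system as a dictionary mapping each variable to the variables it influences and then computes Dörner's quotient:

```python
# A minimal sketch of Dörner's "complexity quotient": the number of features
# in a system multiplied by the number of interrelationships between them.
def complexity_quotient(system):
    features = len(system)
    links = sum(len(influenced) for influenced in system.values())
    return features * links

# Hypothetical toy system: each variable lists the variables it influences.
town = {
    "population":  ["housing", "tax_revenue"],
    "housing":     ["population"],
    "tax_revenue": ["services"],
    "services":    ["population"],
    "weather":     [],   # adds a feature but no links
}

print(complexity_quotient(town))   # 5 features x 5 links = 25
```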

Dörner next makes a profound statement, at least in my opinion.  He says that "complexity is not an objective factor but a subjective one."  Imagine, as an example, the everyday activity of driving a car to and from work.  For someone my age, who has been driving for the past few decades (I won't say how many decades!), driving a car in busy traffic might be frustrating at times, but it's fairly straightforward.  However, put a new driver behind the wheel in the middle of Chicago rush hour traffic, and you may find a completely different perspective on how hard it is to drive in traffic.  The key here is something that Dörner calls "supersignals."  For the experienced driver, rush hour traffic is not made up of hundreds of different elements that must be interpreted individually; rather, he or she is processing information in aggregate and by "gestalt."

Supersignals reduce complexity by collapsing a number of features together into one.  Think about how we look at someone's face.  We don't see all the contours, surfaces, and color variations.  Instead, we see just one face in aggregate.  Because of these supersignals, complexity must be understood subjectively from an individual's perspective.  We learn these "supersignals" by experience and training.  Dr. Gary Klein suggests that experts make decisions by looking at the aggregate, recognizing a pattern that they've experienced before, and acting on it (he calls this recognition-primed decision-making).

It seems like a simple concept, but I found it to be much more profound.  Interestingly, some of the other topics in The Logic of Failure reminded me of the computer game SimCity.  More on that in my next post.

Thursday, September 8, 2022

"One home run is much better than two doubles."

The late Steve Jobs once said, "Quality is more important than quantity.  One home run is much better than two doubles."  I don't know if I completely agree, but more on that in a second.  Let's look at this strictly in baseball terms.  A home run occurs when a player hits the ball out of the ballpark (or alternatively, runs around all the bases on a hit that never makes it out of the park) and scores a run for his team.  A double occurs when a player hits the ball and makes it all the way to second base.  Notably, two doubles, particularly if they occur in the same inning, frequently score a run too.  So if the end result is the same, which is preferable?  A home run or two doubles?  Certainly the home run is more exciting, but most baseball purists wouldn't care either way, as long as the team scores a run (this approach is often called "manufacturing runs" by these same baseball purists).  To this end, I have always heard that the really good baseball players have just as many doubles as they do home runs.

All of this reminds me of a scene from the 2011 movie "Moneyball" starring Brad Pitt and Jonah Hill.  The movie is based on the non-fiction book by Michael Lewis, which tells the story of how the Oakland Athletics' general manager Billy Beane (played by Brad Pitt) built a winning baseball team, in spite of a low budget, by selecting under-valued (i.e. cheap) players using a statistical technique known as sabermetrics.  Beane is meeting with several of his baseball scouts, who are trying to select free agents using their traditional approach.  Specifically, the scouts are trying to replace three key players who signed with other teams during the off-season (Jason Giambi, Johnny Damon, and pitcher Jason Isringhausen).  Beane tells the scouts that they keep trying to replace Giambi with a comparable free agent player.  He tells the scouts, "Guys we can't do it.  Now what we might be able to do is recreate him in the aggregate."  He proceeds to tell the scouts (who are very skeptical of his new approach) that they can find three players who would be a lot less expensive whose combined on-base percentage would equal Giambi's.  Using Beane's new approach, the 2002 Oakland Athletics won the American League West Division with an overall record of 103-59, despite having one of the smallest payrolls in baseball that year ($42 million - notably, the New York Yankees had a payroll of $125 million that same season and also won their Division).  

Now, back to the quote by Steve Jobs.  Is quality better than quantity?  I would argue that the answer depends on the context.  In most cases, I would agree that quality is better than quantity, and at least in this specific example, Jobs was referring (I think) to the product release of the iPhone.  I suppose then, that if you are talking sales, leading the market with one "home run" kind of product like the iPhone is a lot better than having two good products that aren't necessarily leading the market.  But is that true in other contexts?

Let's go back to baseball.  If you were putting a team together, would you rather have a team of mediocre players and one superstar who hits a lot of home runs or a team of really good players who aren't necessarily flashy but get on base a lot and can "manufacture" a lot of runs?  I would choose the latter.  And I think the same is true for organizations in general.  Which would you have on your team (and I mean any team, not just in the sports context)?  Several above average performers or one superstar employee?  I talked a little about this in an earlier post, "I play not my eleven best, but my best eleven..."

When you are leading a group or putting together a team, resist the temptation to look only at each individual's strengths and weaknesses.  Instead, try to look at the aggregate strength of the entire team.  Also look closely at the superstar employee.  Is that individual going to make the team better or worse?  If you can answer that question honestly, it's an easy decision.

Tuesday, September 6, 2022

"Culture eats strategy"

If you have been paying any attention whatsoever to the management literature over the past 20-plus years, you will have heard the axiom, "Culture eats strategy for lunch" (or a different version, "Culture eats strategy for breakfast").  The phrase is often attributed to the management guru Peter Drucker, though in truth he probably never said it.  Regardless of who said it first, the point is that organizational culture is very important.  The important caveat is that strategy is important too, and that's what often gets lost when this axiom is loosely thrown around.  Organizations that ignore strategy do so at their own peril.

What is absolutely clear is that a bad culture will subsume a good strategy.  Whether or not one is more important than the other is probably irrelevant.  They are both necessary and critical aspects to the overall success of any group, team, or organization.

Mark Fields, the CEO of Ford Motor Company from 2014 to 2017, perhaps summarized it best when he said, "You can have the best plan in the world, and if the culture isn't going to let it happen, it's going to die on the vine."  Adam Bryant, writing in Strategy + Business, suggests that it is often the "frozen middle" (a euphemism for middle managers who are reluctant to give up the status quo) who end up blocking or delaying strategic initiatives.

The organizational culture, then, largely determines whether a new strategic initiative will be successful or end up failing.  If the culture is such that the organization is resistant to change or tied too strongly to the past, the initiative will undoubtedly fail.  However, if the culture is more entrepreneurial or innovative in nature, new initiatives will be embraced and ultimately successful.  Adam Bryant writes further, "Constructed properly, a healthy culture will reinforce the articulated values and the specific behaviors that leaders expect from all employees."

Christy Lake, Chief People Officer at Twilio, referred to culture as the operating system that keeps the organization on track to execute on strategy.  "It's like your phone's operating system - it works invisibly in the background to connect your apps and help you get things done.  You also expect it to be regularly updated with enhancements, performance improvements, and new features.  The same is true for company culture.  The operating system needs to be updated to ensure that it's staying current with where the company is and where it is going."

Jacob Engel, writing for Forbes, offered three key points leaders need to bear in mind in order to make sure that the organizational culture is aligned with its strategy:

1. Culture is created by the behaviors you tolerate.  I've talked about this before in the past (see "What you permit, you promote...").  At that time, I was referring more to disruptive behaviors and incivility in the workplace, but the same is also true for creating a culture that embraces change versus one that not only resists change, but actively fights it.  As I have also said before, "The need for change is not an indictment of the past".  Leaders that fight change in order to preserve the status quo are not leaders.  As Jim Collins recommends in his book Good to Great, "First Who, Then What" - it's all about getting the right people on the bus.

2. Change starts at the top.  I would agree that change has to start at the top.  However, not everyone is a CEO in the organization.  The middle managers need to embrace change as well (see key point #1 above), so that they do not become part of the "frozen middle".  As Jacob Engel suggests, "You can't expect your people to change if you're not willing to change first."  Get on the bus or go home.

3. The leader needs to recognize that they are a "voice" around the table, not "the voice".  Again, Engel writes, "Culture is one of those intangibles that is very hard to define but needs to be designed and implemented - and never by default."  Leaders need to listen honestly, even to those who provide a dissenting opinion.  Ultimately, the leaders have to make the right decision for the organization, but they need to make sure that people feel like they've had a chance to provide input (which is one important aspect of the High Reliability Organization principle of Deference to Expertise).  While I 100% agree, leaders in the organization also have to pay attention to the first two points above.

Again, Jim Collins explains how leaders in "Good to Great" organizations focus on the "First Who, Then What" principle.  He writes, "Those who build great organizations make sure they have the right people on the bus and the right people in the key seats before they figure out where to drive the bus. They always think first about who and then about what. When facing chaos and uncertainty, and you cannot possibly predict what's coming around the corner, your best "strategy" is to have a busload of people who can adapt to and perform brilliantly no matter what comes next. Great vision without great people is irrelevant."

Saturday, September 3, 2022

"Unknown unknowns"

When asked about the lack of evidence linking Iraq and so-called "weapons of mass destruction" during a press briefing on February 12, 2002, former U.S. Secretary of Defense Donald Rumsfeld famously said, "Reports that say that something hasn't happened are always interesting to me, because as we know, there are known knowns; there are things we know we know.  We also know there are known unknowns; that is to say we know there are things we do not know.  But there are also unknown unknowns - the ones we don't know we don't know.  And if one looks throughout the history of our country and other free countries, it is the latter category that tends to be the difficult ones."

While Rumsfeld certainly did not invent the concept (his quote actually reminds me of the famous Johari Window, created by the psychologists Joseph Luft and Harrington Ingham to help leaders better understand their blind spots - note that "Johari" is an amalgamation of their two first names), it became his most famous line, which he used in the title of his memoir Known and Unknown.  The director Errol Morris used the quotation for the title of his documentary on Donald Rumsfeld, "The Unknown Known".  Mikael Krogerus adapted this quotation and subsequently referred to something he called the "Rumsfeld Matrix" in The Decision Book:

[Figure: the Rumsfeld Matrix - known knowns, known unknowns, unknown knowns, and unknown unknowns]

While I was familiar with the quote and knew of the matrix, I was surprised to find a tangential reference to both in a publication while searching for something called a "fundamental surprise".  Here, rather than depicting the concept as a matrix, they used a Venn diagram and slightly changed the classification by completely eliminating the category of "unknown knowns" and including "fundamental surprises" as a special subset of "Unknown unknowns".  The investigators were studying the impact of "fundamental surprises" on errors made during the Fukushima Daiichi nuclear power plant accident in 2011.  

The term "fundamental surprise" was first used by Zvi Lanir at the Center for Strategic Studies in Tel Aviv, Israel in 1983 in reference to the Yom Kippur War.  Lanir defined “fundamental surprise” as a surprising (unexpected) event which reveals an often profound discrepancy between one's perception of the world and the reality.  In regards to the Fukushima Daiichi disaster, the operators at the nuclear power plant never envisioned the chain of events that would lead to a partial nuclear meltdown and radiation leak.  The region experienced a magnitude 9.0 earthquake, which caused a tsunami.  The tsunami caused a flood, and the flood damaged the emergency generators that were critical to the reactor's cooling systems.  The loss of power to the cooling systems led to the meltdown.  The plant's operators never envisioned this kind of event.  In fact, the plant's safety design was never designed to deal with this kind of crisis.  When they were faced with the crisis, they were paralyzed by the reality of the situation.

Interestingly enough, there was a second, perhaps less well known, nuclear power plant impacted by the 2011 earthquake in the Fukushima region of Japan.  Ranjay Gulati, Charles Casto, and Charlotte Krontiris published an excellent article in the Harvard Business Review that compares and contrasts the experience at the Fukushima Daiichi plant with that of its sister plant, the Fukushima Daini plant.  While the aftermath of the earthquake led to a partial nuclear meltdown at the Fukushima Daiichi plant, the Fukushima Daini plant was back under control within two days of the earthquake, and its reactors were safely shut down.

Gulati, Casto, and Krontiris (and others) suggest that the key difference between the two plants was leadership.  As they write, "A crisis disrupts the familiar.  When past experience doesn't explain the current condition, we must revise our interpretation of events and our response to them."  While there is no question that the damage sustained at the Daiichi plant was more severe, the leaders at the Daini plant simply responded better by acting decisively, stepping back when necessary to make sense of the rapidly evolving situation, and responding to shifting realities.  "In the heat of the crisis, problem by problem, they acted their way toward sense, purpose, and resolution."

I am reminded of a similar situation in which the differences in how leaders responded to two very similar crises significantly altered the outcome.  As told in the book Island of the Lost by Joan Druett (and in my blog post "A tale of two leaders"), two ships were wrecked off the coast of the Auckland Islands in 1864.  The crew of the Grafton fared much better than the crew of the Invercauld, and again, the key difference was leadership.  As Florence Williams writes in her New York Times review of the book, "Their divergent experiences provide a riveting study of the extremes of human nature and the effects of good (and bad) leadership."

Leadership matters, particularly during a crisis.  The best leaders are not paralyzed by the "unknown unknowns" and the "fundamental surprises".  As the Canadian writer Robin Sharma said, "Anyone can lead when the plan is working.  The best lead when the plan falls apart."

Thursday, September 1, 2022

"A preacher, prosecutor, and politician walked into a bar..."

I realized that I haven't written about Adam Grant in a while, so I think now might be a great time to revisit one of the concepts he discussed in his book, Think Again.  I came across an article that he wrote a few weeks ago for The Guardian ("You can't say that!: How to argue, better").  At the beginning of the article, he told a story of how he once had an argument with a close friend who had decided not to vaccinate his children.  At the time, Grant and his friend decided to "agree to disagree" and avoid discussing the topic in the future.  However, they eventually found themselves discussing the topic of COVID-19 vaccination.  As Grant recalls, "We duked it out in email threads so long that we ran out of new colors for our replies."  His friend admitted to him at the end of one of those threads that they had argued more in the past year than they had spoken in almost a decade, saying "I don't know about you, but I love it!"

Unfortunately, we live in a very polarized world.  I am currently reading the book Why We're Polarized by Ezra Klein, which I hope to discuss in more detail in the future.  One of the main issues with society today is that we've become so polarized that people who disagree can't even have a productive conversation.  Grant cites one study showing that the average person would rather talk to a stranger who shares their views than a friend who doesn't.  Grant suggests that the reason we can't have a productive conversation on a topic on which we disagree is that too many of us think like preachers, prosecutors, and politicians (these labels come from an article by Philip Tetlock) when we are having a disagreement.

When someone is in preacher mode, they're trying to convert others to their own views.  When they are in prosecutor mode, they are attacking someone else's viewpoint.  Finally, when they are in politician mode, they don't even listen to someone else unless that person already shares their views.

Grant offers a number of suggestions to avoid falling into the "preacher, prosecutor, or politician" trap:

1. Learn to recognize your own lazy thinking.  When it comes to logic and reason, we are really lazy.  Don't believe me?  A group of investigators conducted a clever set of experiments.  They asked people to produce a series of arguments in response to a couple of specific problems.  Next, the study participants were asked to evaluate someone else's argument.  Unbeknownst to the participants, in some of the experiments they were actually evaluating their own argument (i.e. they weren't told that it was their own).  Surprisingly, when they thought the argument had been made by someone else, 57% of them rejected it!  Grant writes, "Our reasoning is selectively lazy.  We hold our own opinions to lower standards than other people's.  When someone else doesn't buy the case you're making, it's worth remembering that you might not either."

2. Stay critical, even when you're emotional.  The more politically charged the issue, the harder it is to stay in control and focus on the facts relevant to the argument.  When we allow our emotions to take over, we lose the ability to think critically.  When we get emotional, we tend to be more prone to confirmation bias.  We will seize upon facts and ideas that confirm or support our own line of reasoning, all while ignoring or discounting those that challenge them.  Remember, a difference of opinion can be just fine.  It doesn't have to damage a new or established relationship.

3. Embrace the shades of grey.  We are also subject to something that cognitive psychologists call binary bias.  Simply stated, we take a complex argument and narrow the range of possibilities into just two categories.  Going back to Grant's story above, he told his friend that the COVID-19 vaccine was effective.  Well, how effective exactly?  The world is not always black and white - sometimes we have to focus on the grey in the middle.  

4. Build up to the really toxic topics.  Grant writes, "The highest compliment from someone who disagrees with you is not, 'You were right.'  It's 'You made me think.'"  We don't always have to reach consensus.  Sometimes the whole point of debate is to help promote critical thinking.

5. Keep agreeing to disagree.  Remember, there are no winners and losers when it comes to most arguments and debates.  "Great minds don't think alike - they challenge each other to think again."