I am particularly fond of Aesop's Fables - I have been since my childhood. Aesop was, at least according to tradition (some suggest he was a fictional character), a slave who lived in ancient Greece sometime between 620 and 564 BCE. His stories have been passed down through a number of sources and remain popular to this day. Like all such stories, Aesop's fables end with a moral or lesson.
I came across Aesop's fable of "The Fox and the Grapes" while doing some background research for my post ("Fallor, ergo sum") on Being Wrong: Adventures in the Margin of Error by Kathryn Schulz. Please feel free to read the story in its entirety (I provided the link above), but I will summarize here:
A fox is walking around and sees a beautiful bunch of grapes hanging from a vine on a tree. The grapes are just out of reach, so the fox jumps up to try to bring them down. After several unsuccessful attempts, the fox gives up and walks away, saying, "What a fool I am - those sour grapes are not even worth the trouble."
And with that, we have the common idiom "sour grapes," which we use to describe putting something down or dismissing it as unimportant solely because it is unattainable or out of our reach. Let me give you an example from my own personal experience. Several years ago, I interviewed for a Division Head job at another children's hospital. Admittedly, in retrospect, I was still a little junior for the job, but I had been approached by a couple of other hospitals about similar positions. I thought the interview went very well. A few weeks later, the head of the search committee called to let me know that while they were very impressed, they wanted someone with a little more experience (to be more specific, at the time I was still on an NIH training grant, and they wanted someone with independent NIH funding). I reassured myself that I really wasn't interested in the position at that particular hospital anyway ("sour grapes").
I don't think anyone will dispute that all of us have a need to feel wanted. As Kathryn Schulz discusses at length in her book, the need to be "right" is just as powerful. Not only do we hate to make mistakes; at times we will justify, explain away, rationalize, downplay, shrug off, or simply disregard the fact that we have made one. There are powerful psychological impulses at play here.
In her book, Schulz talks about a famous case from mid-nineteenth century America called "The Great Disappointment". William Miller was a Baptist preacher who founded a religious movement known as Millerism, predicated on the belief that the Second Coming of Jesus Christ would occur sometime between 1843 and 1844 (at one point, he actually predicted the exact date, October 22, 1844). Miller published his teachings and prediction, and Millerism grew to become a national movement. Apparently, people believed his predictions enough to sell all of their worldly belongings. Farmers stopped harvesting their crops - what was the point, when the world was coming to an end? When the expected day of Jesus' return came and went, the believers tried to rationalize or explain away Miller's mistaken prophecy. It was almost as if they had believed in the prediction so much that they could never come to terms with the fact that it was wrong.
A psychologist named Leon Festinger conducted a study in the 1950s that is relevant here. Apparently, Festinger and his research team infiltrated a cult-like group that followed the doomsday prophecies of a suburban housewife named Marian Keech (a pseudonym). Similar to the Millerism movement (though on a much smaller scale), Keech's followers quit their jobs, sold their possessions, and prepared for the end of the world on December 21, 1954. Again, when it became clear that the world was not ending, rather than abandoning their now discredited beliefs, the members of the group actually adhered to them more strongly! Festinger would go on to label this psychological phenomenon "cognitive dissonance". According to his theory, when two actions or ideas are not consistent with each other, individuals will do all in their power to change them until they become consistent. In other words, if we believe in something and find it to be untrue, we may rationalize, justify, or explain away the error in such a way that our belief only grows more powerful.
There is a similar concept in economics called the "sunk cost trap" ("throwing good money after bad"). The sunk cost trap is also known as the Concorde fallacy - the idea that we should continue to spend money on a project, product, etc., so as not to waste the money that we've already invested in it (apparently, the French and British governments continued to spend money on the Concorde plane even after it was abundantly clear that the program was losing money). Say, for example, that you have purchased a used car for your children. A few months after you buy the car, the transmission goes out and needs to be replaced. Even though it costs a ton of money to fix the car, you go ahead and do so. A few months after that, the brake system needs to be replaced. You rationalize to yourself, "Well, I've spent all this money so far, I guess I need to go ahead and fix the brakes." A couple of months after that, the power steering needs to be fixed. Soon, the repairs plus the original cost of the car have added up to more than it would have taken to purchase a brand new car. You have fallen into the sunk cost trap (this actually happened to me!).
Sour grapes. Cognitive dissonance. Sunk costs. The Concorde fallacy. All of these concepts are interrelated, and all of them occur because, at the end of the day, we really hate to be wrong. Leaders in particular hate to be wrong. We need to be okay with making mistakes - after all, we are only human ("Fallor, ergo sum"). By accepting that, we can minimize the risk of falling into the sunk cost trap or the Concorde fallacy.
My particular story (above) has an interesting aside. Approximately one year after being told "no", the same hospital (the same individual on the search committee actually) reached out to me again and asked if I would be interested in coming back for an interview. I politely declined, thinking to myself, "If I wasn't good enough for you last year, why should you be good enough for me now?" Of note, my NIH funding status had not changed. Talk about "sour grapes"! I turned down a potential leadership opportunity for no other reason than "sour grapes."