Monday, April 6, 2026

Hell's Kitchen

"Hell's Kitchen" is a long-running and popular reality cooking show starring chef Gordon Ramsay that first aired on Fox on May 30, 2005.  Chef Ramsay is notoriously abrasive and demands perfection on the show, often yelling profanity at the contestants.  There's no doubt that his famous temper is played up for entertainment value, and there are some reports that he is much nicer in private.  However, there's likely at least some truth to how he acts, and many in the culinary industry claim that his behavior is more often the norm than the exception, labeling it part and parcel of the "kitchen culture".

As I recently learned, the "kitchen culture" is actually part of the overall design of something known as the "kitchen brigade" (or "brigade de cuisine"), a system of organizing the kitchen staff and operations in order to maximize efficiency.  The system was first designed by the French chef and restaurateur Auguste Escoffier in the early 20th century.  It's a hierarchical and highly structured system that assigns specific roles to specialized staff in order to reduce chaos, ensure consistent quality, and allow for smooth operations, particularly when the restaurant is very busy.  There is a clearly defined chain of command, in which the executive chef ("Chef de cuisine") is responsible for the overall management of the kitchen, including creating the menu and new recipes (often with the assistance and/or approval of the restaurant manager), purchasing raw food items, training apprentices, supervising the rest of the kitchen staff, and maintaining a clean, sanitary, hygienic environment.  The "Sous Chef" is the second in command and receives orders directly from the executive chef, while all the remaining roles follow.

The executive chef serves as the "captain of the ship", and his or her orders are not to be questioned.  On most ships, the captain is ultimately responsible for everything that occurs on board, good or bad.  There was once even a tradition of the captain "going down with the ship" in battle.  The phrase "captain of the ship" has become a metaphor for any rigid, hierarchical system in which commands are issued from the top and meant to be followed without question.  When I was a medical student, the "captain of the ship" mentality was prevalent on most clinical teams, but particularly in the operating room, where the attending surgeon was deemed the "captain of the ship" (the "ship" being the operating room).  There was even a legal doctrine in the United States, the "captain of the ship" doctrine, under which the attending surgeon was responsible - and legally liable - for every action of every other member of the operating team.  Thankfully, those days are long past.  Neither the legal doctrine nor the mentality of the attending physician as the "captain of the ship" is common today.  Those of us in health care have learned that a highly functioning team in which each and every member contributes and feels free to speak up ("psychological safety") is more engaging, more collaborative, and most importantly, leads to better outcomes.

So, this naturally raises the question: if surgical teams have figured out that a less rigid, hierarchical chain of command is better for patient outcomes, why can't those in the restaurant industry adopt the same mentality?  I've heard at least one chef say, "We all learned this way.  If it worked for us, why can't it work for the next generation?"  Notably, I used to hear that exact same rationale in medicine.

Unfortunately, it's not just a "captain of the ship" mentality that is the problem.  The "kitchen culture" is downright toxic.  The New York Times recently featured a number of articles detailing allegations of physical and mental abuse by Executive Chef Rene Redzepi at the world-famous restaurant Noma in Copenhagen, Denmark.  The first article appeared on March 7th (see "Punching, Slamming, Screaming: A Chef's Past Abuse Haunts Noma, the World's Top-Rated Restaurant").  Just a few days later, the New York Times reported that Redzepi had resigned after 23 years at the restaurant.

The toxic "kitchen culture" is not unique to Noma or to the fictional kitchen depicted in the television show, "The Bear".  Robin Burrow, a former lecturer in management and organizational behavior at Cardiff University in the United Kingdom has studied the toxic "kitchen culture" (see "Yes, Chef: Life at the vanguard of culinary excellence" and "Bloody suffering and durability: How chefs forge embodied identities in elite kitchens") and found that bullying and physical abuse are not only common, they are normalized as part of what someone has to go through in order to become an executive chef.  Burrow says, "Chefs who neglected to suffer had little claims to membership of the culinary community in the truest sense.  They were not true and proper chefs."

I wonder if haute cuisine could learn a few things from health care?  Better yet, it seems like the culinary community could learn some important lessons from high reliability organizations, such as the nuclear power industry, commercial aviation, or U.S. Navy aircraft carrier flight operations.  The stakes in a restaurant are certainly not "life and death" as in these other industries, but the lessons about disciplined, respectful teamwork are just as applicable.  Running a restaurant requires efficient and timely operations.  It's time to abandon the toxic "kitchen culture" and embrace high reliability organization theory!

Thursday, April 2, 2026

The fox, the hound, and the body...

Bear with me for just a moment.  I want to talk about two great films today.  The first film is the 1981 Disney full-length animated feature, "The Fox and the Hound".  The story is loosely based upon a novel of the same name by Daniel Mannix and tells the story of the unlikely friendship between a red fox named Tod and a hound named Copper.  As they both grow older, they struggle with the fact that they are meant to be enemies.  At one point in the story, they do in fact become enemies.  In the film's final minutes, Copper gets into a fight with a bear and is almost killed.  Tod comes to his rescue and joins the fight, only to fall down a waterfall with the bear.  As Copper approaches Tod, who lies wounded in the lake below, Copper's owner Amos, a hunter, appears, ready to shoot Tod.  Copper positions himself in front of Tod to prevent Amos from doing so, refusing to move away.  Amos realizes that Tod has saved both his life and the life of his dog and decides to spare him.  As he walks away with Copper, Tod and his former best friend share one last smile before parting for good.  In the final scene, Copper lies down to take a nap, smiling as he fondly remembers the day he first met Tod.

As one blogger on Medium writes, "Friendships can be the cornerstone of our lives, providing support, joy, and companionship. However, not all friendships stand the test of time...Time marches on and always has a way of changing things, especially people. Sometimes the changes create chasms that can no longer be crossed and the challenges you were once able to tackle head-on with your friend become insurmountable."

The second is the 1986 film "Stand by Me", a coming-of-age drama directed by Rob Reiner.  The film was based on a novella, The Body, written by Stephen King, and its title comes from the song "Stand by Me" by Ben E. King.  The film (and novella) takes place in King's fictional town of Castle Rock on Labor Day 1959, although the film begins in 1985, when author Gordon "Gordie" Lachance reads a newspaper article about the death of his childhood best friend, Chris Chambers.  The rest of the film is a flashback memory to when Gordie was 12 years old, and he (played by the actor Wil Wheaton) and his three friends (played by River Phoenix, Corey Feldman, and Jerry O'Connell) set out on an adventure to find the dead body of another missing boy.  "Stand by Me" is an enjoyable story, a great song, and an even better film!  It's one of my absolute favorites.

I have always remembered the film's ending, when Gordie (once again in 1985 and now played by the actor Richard Dreyfuss) talks about his childhood friendships with reverie.  When talking specifically about his best friend, Chris, who has recently died, he says, "Although I haven't seen him in more than ten years, I know I'll miss him forever. I never had any friends later on like the ones I had when I was twelve. Jesus, does anybody?"

Unfortunately, I too have lost track of many, if not most, of my childhood friends.  I think that happens to most of us.  People do, in fact, change - for a variety of reasons.  Sometimes, as two friends (or even a group of friends) grow older, they grow apart.  Oftentimes, neither side is to blame.  It just happens.  According to a Dutch study published in the journal Social Networks, the average size of our personal networks remains remarkably stable over a period of seven years (see my posts, "Dunbar's number" and "It's a small world after all...").  However, during that same period of time, we replace the large majority of our close personal contacts and friendships with new ones.  We maintain only about 30% of our original casual contacts over those seven years and just under half of our close friendships.

I've posted a lot about the so-called "Loneliness Epidemic" in the last year (see, in particular, my posts, "The Loneliness Epidemic", "To be of importance to others is to be alive...", and "We all need the human touch...").  There's another scene about halfway through "The Fox and the Hound".  Tod's adoptive human mother, the Widow Tweed, has realized that a fox is better off living in nature and not in captivity.  She drives Tod out to the forest and lets him go.  As she is driving back home, she recites the poem:

Remember how we used to play?
I recall those rainy days
The fire’s glow that kept us warm
And now I find, we’re both alone

Goodbye may seem forever
Farewell is like the end
But in my heart is a memory
And there you’ll always be

The key point here is that we should cherish the memories of friends long lost.  They will always be a part of our lives, and we should keep them in our hearts forever.  Just as important (and perhaps even more so), we shouldn't find ourselves alone because of those lost friendships.  Personal connections and friendships are important to our overall health.  As both the Widow Tweed and Gordie find out, while it's hard to move on, lost friendships can be replaced with new ones.  As Genesis 2:18 reads, "It is not good for man to be alone."  I will revisit this topic in an upcoming post...

Monday, March 30, 2026

Happy Doctor's Day 2026

I wanted to take a moment to wish all of my fellow physicians a Happy National Doctor's Day!  Given that a number of our physicians are on Spring Break vacation this week, we will be celebrating Doctor's Day at our hospital next week.  However, it's still important to recognize our physicians today, on what has become their official day of recognition and gratitude!

National Doctor’s Day is celebrated every year on March 30th.  The first Doctor’s Day was observed more than 90 years ago, on March 30, 1933 by the Alliance to the Barrow County Medical Society in Winder, Georgia (a small town located just east of Atlanta).  Members of the Alliance selected the date to honor all physicians on the anniversary of Dr. Crawford W. Long’s first administration of anesthesia in 1842.  Of note, Dr. Long used ether during surgery to remove a tumor from the neck of James Venable.  The first Doctor’s Day was observed by sending cards to all the physicians and their spouses, and a red carnation flower was placed on the graves of deceased doctors.  

Through a series of resolutions in the years that followed, Doctor’s Day was widely celebrated throughout the southern United States, with sponsorship by the Southern Medical Association.  Eventually, a resolution was adopted and approved by both the U.S. House of Representatives and the U.S. Senate on October 30, 1990 and signed by President George H.W. Bush, designating March 30 as “National Doctor’s Day."  The red carnation remains the symbol of Doctor’s Day.

I have never been more proud to be a member of this great profession.  We've all had a difficult past few years with everything that has been going on in our world.  Regardless, physicians have been at the forefront leading societal change during one of the most difficult periods in our history.  Importantly, our influence is due in large measure to the trust and respect that society has for our profession.  Medicine is still one of the most trusted of all professions; as a matter of fact, physicians traditionally rank just below nurses in trust surveys.

I can honestly say that if I had the chance to do it all over again, I would still choose medicine as my life's work.  Medicine has been my passion and my calling.  Being a physician has made me a better person, and I am incredibly proud to be a member of this esteemed profession.

To all doctors - thank you for what you do, each and every day!

Thursday, March 26, 2026

"Shall we play a game?"

Several years ago, I posted (see "The only winning move is not to play") about the 1983 movie "WarGames" starring Matthew Broderick, Ally Sheedy, and John Wood.  Broderick plays a teenage computer hacker named David Lightman who unwittingly accesses a United States military supercomputer called WOPR (War Operation Plan Response), which is programmed to simulate, predict, and execute global nuclear war against the Soviet Union.  At first, Lightman thinks he has found an as-yet-unreleased strategy game called "Global Thermonuclear War" and starts to play as the Soviet Union.  He and his friend Jennifer Mack (played by Ally Sheedy) order a number of nuclear missile strikes against U.S. cities, which triggers an actual warning at the North American Aerospace Defense Command (NORAD).

Luckily, the military staff and computer programmers running WOPR figure out that the incoming missiles are not real and defuse the situation.  However, WOPR continues to "play the game", as it does not understand the difference between reality and simulation.  It continuously feeds false data, such as Soviet bomber attacks and submarine deployments, to NORAD, prompting the military leaders there to further escalate the Defense Readiness Condition (DEFCON) level toward full-scale nuclear war.

David and Jennifer team up with WOPR's creator, Dr. Stephen Falken (played by John Wood), to stop the simulation.  Dr. Falken helps them realize that the computer must learn that nuclear war is unwinnable.  They force WOPR to repeatedly simulate all the possible nuclear war outcomes, each of which ends in total destruction.  The computer eventually reverts to tic-tac-toe, continuing to "learn" that no strategy for either global thermonuclear war or tic-tac-toe can win.  WOPR stops the launch sequence and declares, "A strange game. The only winning move is not to play."

As difficult as it is to believe now, no one back then seriously imagined a scenario in which artificial intelligence (because that's what WOPR essentially was in the movie) would be used by the military to control our nuclear weapons.  Was artificial intelligence discussed frequently?  Yes it was.  Was the threat of nuclear war on everyone's minds?  Absolutely yes.  Was anyone thinking that artificial intelligence had advanced enough to be used by the military at that point or anytime in the near future?  Not really.

Well, guess what folks!?!  Artificial intelligence is here and likely is powerful enough to do most, if not all, of the things depicted in the movie "WarGames".  In fact, Kenneth Payne, a professor in the Department of Defence Studies at King's College London, recently released the results of a study in which he set three leading large language models (LLMs) – GPT-5.2, Claude Sonnet 4, and Gemini 3 Flash – against each other in 21 simulated nuclear crisis scenarios.  Dr. Payne's first three sentences in the paper are important and bear repeating: "As large language models (LLMs) are increasingly deployed in analysis and decision-support roles, it's imperative to understand more about how these systems reason about strategic conflict, particularly when the stakes involve catastrophic outcomes.  Defence ministries, intelligence agencies, and foreign policy establishments worldwide are already exploring how AI might augment human judgement in crisis decision-making, from pattern recognition in intelligence analysis to scenario planning for contingency operations.  Understanding how frontier AI models reason about escalation, deterrence, and nuclear risk is therefore a matter of AI safety..."

What happened?  As Dr. Payne admits, "Nuclear escalation was near-universal: 95% of games saw tactical nuclear use and 76% reached strategic nuclear threats."  Two of the models in particular (Claude and Gemini) treated nuclear weapons as legitimate strategic options rather than as moral thresholds never to be crossed.  While some LLMs limited nuclear strikes to military targets, a few of the others didn't necessarily avoid population centers (civilian targets).  Most concerning, for these models unrestricted nuclear warfare didn't carry the aura of a taboo that has restrained human decision-makers ever since the two atomic bombs were dropped on Hiroshima and Nagasaki at the end of World War II.
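Out of curiosity, I sketched out what the skeleton of such an experiment might look like.  To be clear, this is purely illustrative - I haven't seen Dr. Payne's actual code, so the escalation ladder, the model names, and the choose_action stand-in below are all my own hypothetical inventions (a real harness would replace choose_action with actual prompts to each LLM and parse the replies):

import random

# Hypothetical escalation ladder - the paper's actual action space isn't described here.
ESCALATION_LADDER = [
    "de-escalate",
    "conventional strike",
    "tactical nuclear use",
    "strategic nuclear threat",
]

MODELS = ["model_a", "model_b", "model_c"]  # stand-ins for the three LLMs
NUM_SCENARIOS = 21  # matches the number of simulated crises in the study

def choose_action(model, history):
    """Stand-in policy.  A real harness would send the scenario text and the
    history of moves to the LLM and parse its chosen action from the reply."""
    return random.choice(ESCALATION_LADDER)

def play_scenario(rounds=9):
    """Alternate moves among the models and record every action taken."""
    history = []
    for turn in range(rounds):
        model = MODELS[turn % len(MODELS)]
        history.append(choose_action(model, history))
    return history

# Aggregate across all scenarios, mirroring the paper's headline statistics
# (the share of games with tactical nuclear use or strategic nuclear threats).
games = [play_scenario() for _ in range(NUM_SCENARIOS)]
tactical = sum("tactical nuclear use" in g for g in games) / NUM_SCENARIOS
strategic = sum("strategic nuclear threat" in g for g in games) / NUM_SCENARIOS
print(f"Tactical nuclear use: {tactical:.0%} of games")
print(f"Strategic nuclear threats: {strategic:.0%} of games")

Even this toy version makes the point of the study design clear: once you hand a model an escalation ladder, you can measure how often it chooses to climb it.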

It's likely that none of us will ever have to make decisions about whether to deploy nuclear weapons.  However, it's almost certain that all of us will be required to make decisions with imperfect information in a crisis situation.  There's no question that AI will be a useful tool for helping us analyze the situation and make the best decision.  That possibility is a lot closer to us now than it was in 1983 when the movie "WarGames" was so popular.  Therefore, it's just as important for us to understand how AI will help frame our choices and make decisions in the future.  The result of our decision probably won't lead to nuclear war, but the potential for an adverse outcome from our AI-driven decision-making could be equally concerning.

Monday, March 23, 2026

Eat Your Ice Cream

I recently started reading Eat Your Ice Cream: Six Simple Rules for a Long and Healthy Life by Ezekiel Emanuel.  Dr. Emanuel is a medical oncologist, bioethicist, health policy researcher, and author of several books, though he is perhaps best known for being one of the chief architects of the 2010 Affordable Care Act.  His latest book is all about how to live a full and healthy life.  Unfortunately, contrary to the title of the book, achieving wellness doesn't require eating a lot of ice cream, although he does cite a number of studies showing that ice cream has at least some health benefits, if consumed in moderation.

Dr. Emanuel argues against what he calls the "Wellness Industrial Complex", which prescribes complicated regimens that often conflict with one another, while at the same time promising us a longer and more productive life.  He argues that we spend too much time following "wellness" recommendations that may only add a few extra days or months to our lives, and that we could better spend that time enjoying our life in the here and now.  Dr. Emanuel writes that "with so much health and wellness advice out there, it can be nearly impossible to differentiate the valid, reliable, and effective from the speculative, deceptive, and just plain stupid.  Even when the advice is scientifically sound, it's often extraneous, misrepresented, or misused."

I would add that the evidence is often conflicting.  The best example here is the so-called French Paradox, which is based on the observation that people living in France have comparatively lower rates of coronary heart disease, including deaths, despite a higher intake of dietary cholesterol and saturated fat.  Back in the 1980s and 1990s, one popular explanation for the French Paradox was that people living in France also consumed higher amounts of red wine.  Red wine contains an antioxidant known as resveratrol, a compound believed to have anti-hypertensive and potentially protective properties because of the way it relaxes blood vessels.  Consuming moderate amounts of red wine could therefore offset the harmful effects of a diet high in cholesterol and saturated fat.  Unfortunately, as I've discussed in a couple of posts this past year (see "Raitis tammikuu" and "The world is changed..."), red wine consumption is no longer considered healthy!  The Office of the U.S. Surgeon General released a new advisory last year declaring that there is no safe level of alcohol consumption.  The advisory called out in particular the risks associated with several types of cancer, especially breast cancer in women and cancers of the digestive tract in both men and women.  The advisory states, "The more alcohol consumed, the greater the risk of cancer. For certain cancers, like breast, mouth, and throat cancers, evidence shows that this risk may start to increase around one or fewer drinks per day."

So, red wine consumed in modest amounts was once considered healthy, but that is no longer the case.  With that in mind, I recently read a study published in JAMA (the Journal of the American Medical Association) that found that greater consumption of coffee and tea was associated with a lower risk of dementia (see "Coffee and Tea Intake, Dementia Risk, and Cognitive Function").  The study followed over 130,000 individuals for up to 43 years.  Detailed dietary records were collected about every 2 to 4 years, and the primary outcome of dementia was identified via death records or medical records.  After adjusting for a number of other lifestyle and health-related factors, higher caffeinated coffee intake was significantly associated with lower dementia risk and less subjective cognitive decline.  Similarly, higher tea intake was also associated with lower dementia risk.  Decaffeinated coffee was not associated with a lower risk of dementia or cognitive decline.  The most prominent differences in dementia risk were observed with an intake of approximately 2 to 3 cups per day of caffeinated coffee or 1 to 2 cups per day of tea.

Well, that's great news for me, as I drink about 2-3 cups of regular coffee (no sugar, no cream) every morning!  But it's hard for me to get too excited about the results of this study, as I've read similar studies in the past about red wine consumption!  There's a good chance (better than average, I'd say) that some future study will show that coffee consumption is bad for your health.

I guess I like Dr. Emanuel's philosophy that "wellness shouldn't be so hard".  I think I've done a pretty good job of following his six rules (read the book!) so far in my life.  And I will likely continue to eat ice cream, drink my red wine, and enjoy my morning coffee ritual - all in moderation, of course!

Thursday, March 19, 2026

Too much talent or not enough?

Several years ago, I read a great book by Geoff Colvin, Talent is Overrated.  I was skeptical when I first picked up the book, but I am now convinced beyond a doubt that, when it comes to teams, there is such a thing as "too much talent".  In keeping with the theme of two of my recent posts ("It takes 10 hands to score a basket..." and "Champs or Chumps?"), I want to discuss a LinkedIn post published by Adam Grant on May 1, 2018 called "The Problem with All-Stars".  The post was actually a transcript of an episode of Grant's WorkLife podcast, in which he interviewed former NBA player Shane Battier and author Michael Lewis.  

Lewis had written an article about Shane Battier for The New York Times Magazine entitled "The No-Stats All-Star".  The episode tells the story of the Miami Heat superteam that played together from 2010 to 2014.  Basically, superstars LeBron James, Dwyane Wade, and Chris Bosh decided that they wanted to play together on the same team.  Wade had already won an NBA Championship with the Miami Heat in 2006, and both James and Bosh were free agents during the 2010 offseason.

LeBron James famously announced his decision to play for the Heat (and leave his hometown team, the Cleveland Cavaliers) on live television on July 8, 2010 (see "The Decision" on ESPN).  During a team press conference later that summer, in which all three superstars appeared, James was asked how many NBA championships the team would win.  He famously answered, "Not one, not two, not three, not four, not five, not six, not seven...".  Suffice it to say that expectations, both within the Miami Heat organization and among the fans, were very high.  The reality was very different.

During the 2010-2011 season, the Miami Heat initially struggled to play together as a team.  In fact, they lost a number of close games.  As Adam Grant said during the podcast, "Lots of stars means lots of egos—and lots of egos means infighting. To overcome that problem, you need humility. Humility is having the self-awareness to know what you're good at and what you're not good at. Studies show that when you have humility in a team, people are more likely to play to their strengths. Instead of going for the spotlight, they take on the roles where they can help the team win."

The Miami Heat would finish the season with a 0.707 winning percentage, but they failed to win the championship, losing to the Dallas Mavericks in the 2011 NBA Finals.  During the offseason, the Heat brought in yet another free agent, but this time one that led to a lot of head-scratching.  They signed Shane Battier, a talented player, but one who throughout his career never scored a lot of points or grabbed a lot of rebounds (during his 13-year career, Battier averaged 8.6 points, 4.2 rebounds, and 1.8 assists per game).  But Battier's presence on the Heat the following season made a huge difference!

Whenever Battier was on the court, everyone on the Miami Heat played better, both offensively and defensively.   Michael Lewis told Adam Grant, "Shane had broadly two big effects. On his own teammates, he made everybody more efficient. When he was on the court, the shot the team took tended to be a better shot than it was when he wasn't on the court. And on the defensive end, he made the other team slightly less efficient."  Suddenly, everyone was playing better AS A TEAM.  

The Heat would go on to win back-to-back NBA Championships following the 2011-2012 and 2012-2013 seasons before losing in the NBA Finals again in 2013-2014.  Shane Battier was a leader on the court and helped to build the kind of team chemistry that turned into championships.  I've mentioned this in a couple of previous posts ("He's the glue..." and "In search of David Ross") - Shane Battier was the glue that helped make everyone work together and win (see also an article published this past fall in The Wall Street Journal "The underrated power of 'glue employees' who hold everything together"). 

Adam Grant also said something that resonates with me, "When it's time to put together a team, most people look for the best talent. I hear it in every industry. “We don't take B players, only A players.” But what actually happens when you have a whole team of stars?  The evidence is pretty clear: no matter where you work, having an entire team of superstars can be a total disaster. It turns out that if you have a team of 10 people, you're better off with six stars than eight."

He references two important studies, one from the world of sports and the other from the world of Wall Street investment banking.  The first describes what is known as the "too much talent effect", in which investigators were able to show that greater individual talent is associated with winning in soccer and basketball, but only up to a certain point.  Past that point, more talent leads to worse performance, similar to what occurred with the Miami Heat (who, even with Shane Battier, failed to win the seven championships that LeBron James talked about at the press conference).  The second showed that the "too much talent effect" isn't unique to sports.  In other words, when it comes to making key investment decisions, "too many cooks spoil the broth" - having a team composed of too many experts actually leads to worse financial decisions!
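If you like to think in code, the "too much talent effect" is just an inverted-U curve: performance rises with talent up to a peak, then declines.  Here's a tiny toy model - the coefficients below are completely made up for illustration and are not estimates from either study:

# Toy inverted-U: performance rises with the share of "stars" on a team,
# then falls once egos and coordination costs dominate.
# The coefficients are illustrative only, not taken from the research.
def team_performance(star_share):
    return 2.0 * star_share - 1.7 * star_share ** 2

# Scan star shares from 0% to 100% and find where performance peaks.
shares = [i / 100 for i in range(101)]
best = max(shares, key=team_performance)
print(f"Performance peaks at about {best:.0%} stars, then declines")

With these made-up numbers, the curve happens to peak at roughly six stars out of ten before heading downhill - which, by construction, echoes Grant's "six stars than eight" observation above.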

Hopefully, my last three posts have convinced you that in the so-called "war for talent" in today's work environment, leaders should focus on building diverse teams with different levels of skill.  Bringing in too many experts is actually counterproductive.  So I ask, is it better to have too much talent or not enough talent?  My answer - it's better to have not enough talent...

Monday, March 16, 2026

"Thanks a million..."

I have to be completely honest here.  I never imagined that my "Leadership Reverie" blog would ever come close to 1 million views.  Perhaps I never believed that I would continue posting for as many years as I have (over ten years now).  And I certainly never thought that there would be enough interest in what I was writing about to come close to 1 million views.  Officially, as of this morning, my blog has surpassed the 1 million view threshold!

According to Wikipedia, the word million is often used as a metaphor for any very large number.  Have you ever heard or used the phrases, "Not in a million years..." or "You are one in a million..." or even "Let's ask the million dollar question"?  Here are a few fun facts that I discovered about the word million.  Apparently, one million seconds equals about 11.57 days.  One million average-sized honeybees would weigh about the same as an 80 kg (180 lb) human.  And there are approximately one million characters in a typical 600-page paperback book.  In ancient Egypt, the symbol for one million was the Egyptian god Heh, the personification of infinity or eternity.  Even the ancient Egyptians used the word to describe a very large - maybe even impossibly large - number.
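For the skeptics, the seconds-to-days figure is easy to verify yourself - a quick back-of-the-envelope check in Python:

# One million seconds converted to days: there are 60 * 60 * 24 = 86,400 seconds in a day.
seconds = 1_000_000
days = seconds / (60 * 60 * 24)
print(f"{seconds:,} seconds = {days:.2f} days")  # prints: 1,000,000 seconds = 11.57 days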

I'm not sure whether I will ever get to two million views, but I will keep writing as long as people are interested in reading!  I want to take this opportunity to thank all of my readers, whose leadership journeys continue to inspire me to learn and grow as a leader as well!  And, as the saying goes, "Thanks a million!"