Friday, August 15, 2025

The 40% rule

Last time (see "What makes elite individuals elite?"), I posted about Diana Nyad and the studies performed by Martin Paulus and his team suggesting that perhaps there is a little bit of Diana Nyad in all of us.  While we may never swim across the Straits of Florida like Nyad did on her fifth attempt in 2013, the work by Paulus and his team strongly suggests that we can train our minds to push beyond our own self-perceived limits.  

I am reminded of a story that I heard about the American entrepreneur, author, ultramarathoner, and former rapper (under the name "Jesse Jaymes") Jesse Itzler.  Incidentally, Itzler also happens to be married to another entrepreneur, Sara Blakely, the founder of Spanx.  Itzler was running a 100-mile ultramarathon as part of a relay team with five other runners when he "met" former Navy SEAL David Goggins, who was running the entire 100-mile ultramarathon by himself!  I've posted about Goggins in the past (see my post "GoRuck").  Goggins is a freak of physical fitness - he apparently joined the U.S. Air Force in the 1990's, gained a lot of weight after he left the Air Force, and then lost 106 pounds before joining the U.S. Navy and signing up for the U.S. Navy SEAL entry program, Basic Underwater Demolition/SEAL training ("BUD/S").  After completing training and qualifying as a Navy SEAL, Goggins completed U.S. Army Ranger School.  He would serve in the military for 20 years, and he has since become an ultramarathoner, triathlete, ultra-distance cyclist, motivational speaker (of course), and author.  And oh, did I mention that he once held the world pull-up record?  He once did 4,030 pull-ups in a 17-hour period.  He has since written two excellent memoirs, Can't Hurt Me: Master Your Mind and Defy the Odds and Never Finished: Unshackle Your Mind and Win the War Within.

Back to Itzler.  After meeting Goggins, Itzler did what only a billionaire could do - he hired Goggins to live with him for the next 30 days and teach his family the art of mental toughness.  Itzler's experience is the subject of his book, Living with a SEAL.  Itzler writes, "The first day that “SEAL” came to live with me he asked me to do — he said how many pull-ups can you do?  I did about eight.  And he said all right. Take 30 seconds and do it again. So 30 seconds later I got up on the bar and I did six, struggling. And he said all right, one more time. We waited 30 seconds and I barely got three or four and I was done. I mean couldn’t move my arms done. And he said all right. We’re not leaving here until you do 100 more. And I thought there’s no — well we’re going to be here for quite a long time because there’s no way that I could do 100. But I ended up doing it one at a time and he showed me, proved to me right there that there was so much more, we’re all capable of so much more than we think we are. And it was just a great lesson."

Goggins refers back to a statement that the American psychologist William James made in his essay, "The Energies of Men."  James wrote, "Beyond the very extreme of fatigue and distress, we may find amounts of ease and power we never dreamed ourselves to own; sources of strength never taxed at all because we never push through the obstruction."  Goggins calls it "the 40% rule". Itzler explains further that the "40% rule" simply means that when your brain is telling you that you can't go on anymore, you are really only 40% done.  Deep down, your body can handle more stress and you can face even greater challenges.

I'd love to see what Goggins' right insula looks like on fMRI!

Wednesday, August 13, 2025

What makes elite individuals elite?

I posted last year (see "It's a team") about the 2023 movie Nyad starring Annette Bening and Jodie Foster.  The film is based upon open water swimmer Diana Nyad's 2015 memoir, Find a Way (which I've since had a chance to read) and tells the story of Nyad's multiple attempts in the early 2010's to swim from Cuba to Florida across the treacherous Straits of Florida.  Nyad was successful on her fifth attempt at the age of 64 years, which is absolutely incredible!

When she finally made it all the way to Key West, Nyad proudly told the crowd that had gathered at the beach to welcome her and cheer her on, "I got three messages.  One is we should never, ever give up.  Two is you are never too old to chase your dreams.  Three, it looks like a solitary sport, but it's a team."

I am not going to dispute Nyad's claim that her accomplishment was a team effort!  While my personal and professional accomplishments will never equate to swimming across the Straits of Florida, I will readily admit that everything that I've achieved in life has required a great deal of help from my wife, my family and friends, and my colleagues.  However, today I wanted to focus more on Diana Nyad the person.  There is no question that her own personal perseverance and resilience played a major role in her success.  Elite athletes are just wired differently.  The question I have is whether that is nature, nurture, or a combination of both.

The writer Elizabeth Svoboda wrote an article for Discover magazine in 2014, "The Brain Basis of Extraordinary Feats of Will" in which she wrote, "It’s easy to assume that Nyad and other champions of endurance — Olympic medalists, Navy SEALs, marathon dancers — are freaks of nature, capable of feats of will the rest of us could never accomplish. But according to University of California, San Diego, psychiatrist Martin Paulus, who studies how the brain responds to stress, perseverance isn’t just an inborn trait. His work suggests that toughing it out Nyad-style is a specialized skill that’s potentially accessible to all of us — with a little training."

Now THAT caught my attention!  One of the reasons that I find science so cool is that every new finding builds upon the last one.  Martin Paulus (now head of the Laureate Institute for Brain Research in Tulsa, Oklahoma) and his team conducted a number of studies, each building upon the last one in an iterative fashion, before Svoboda was able to make that statement.  They first used functional magnetic resonance imaging (fMRI) to show that elite military personnel in the U.S. Navy (Sea, Air, and Land forces - SEALs) showed greater activation in the right insular region of the brain and attenuated activation in the left insular region of the brain when shown angry and/or happy emotion faces compared to normal subjects (see "Differential brain activation to angry faces by elite warfighters: Neural processing evidence for enhanced threat detection").  In a follow-up study, when compared to normal subjects, Navy SEALs showed an attenuated activation in the right insular region when exposed to angry emotion signals (see "Altered insula activation in anticipation of changing emotional states: neural mechanisms underlying cognitive flexibility in special operations forces personnel").

The next study (see "Subjecting elite athletes to inspiratory breathing load reveals behavioral and neural signatures of optimal performers in extreme environments") compared the interoceptive response of elite adventure racers with that of normal subjects.  Interoception is the body's ability to sense and perceive its own internal state or physiological condition, and it is heavily involved in how we perceive effort and exertion.  Elite athletes, for example, could have an attenuated interoceptive response allowing them to push past what others "feel" as maximum effort.  Again, Paulus' team performed fMRI while subjects performed a cognitive task requiring focus and attention and while wearing a specially fitted mask that significantly increases the difficulty of breathing (without changing blood carbon dioxide or oxygen levels).  There were three key findings: (1) breathing through a straw (basically) was difficult and very unpleasant and resulted in "profound" activation of the bilateral insula, dorsolateral prefrontal cortex, and anterior cingulate; (2) adventure racers, compared to normal subjects, made fewer mistakes on the cognitive task in spite of having to breathe through a straw; (3) adventure racers showed decreased activation of the right insular cortex during the breathing load.  In other words, as predicted, elite athletes actually do show an attenuated interoceptive response, allowing them to push past the pain and discomfort of vigorous exercise or activity that would cause the rest of us to stop.

All of these findings are very interesting and make sense.  Maybe elite athletes, like Diana Nyad, and special forces military personnel, like the U.S. Navy SEALs, are just wired differently from the rest of us.  But here's where things get even more interesting.  Paulus' team took things one step further and enrolled more than 200 U.S. Marines in an eight-week Mindfulness-Based Mind Fitness Training (MMFT) course.  Those Marines who participated in MMFT showed an attenuated right insular and anterior cingulate response after a stressful combat training session (see "Modifying resilience mechanisms in at-risk individuals: A controlled study of mindfulness training in Marines preparing for deployment").  Similarly, in yet another follow-up study, using the same inspiratory breathing load used in the Navy SEALs experiments above, Paulus' team was able to show that Marines who had participated in the MMFT course had an attenuated right insular response compared to those who did not participate in the course (see "Mindfulness-based training attenuates insula response to an aversive interoceptive challenge").  In other words, we can be trained to ignore the signals that our bodies generate to tell us that we are exerting maximal effort!

Please don't get too excited.  I'm not going to say that with extensive training, any one of us could swim across the Straits of Florida.  Diana Nyad probably does have some physiologic tools that she inherited from her parents and that allowed her to be successful at long-distance swimming.  However, the suggestion that we can all train ourselves to push beyond our self-perceived limits is an important one.  Perhaps there is a little bit of Diana Nyad in all of us.

Monday, August 11, 2025

The finest things in life...

My wife and I recently took a trip to the Willamette Valley in Oregon for some hiking and wine tasting.  Neither one of us had ever been to Oregon, so we were able to cross the state off our bucket list.  More importantly, we had a great time!  Of course, I also learned a few things about wine that I didn't know.  I've never been a huge fan of sparkling wine, but after tasting some really good sparkling wines in Oregon, perhaps I will reconsider.

I will admit that until a few years ago, I never knew that there was a difference between champagne and sparkling wine.  I thought that there was "Champagne" with a capital C (made in France), as well as "champagne" with a lower-case C (which was made everywhere else).  Contrary to popular belief, there's a difference between champagne (regardless of whether it is spelled with a capital or lower-case letter) and sparkling wine.  "Champagne" refers to a sparkling wine that is specifically made in the Champagne wine region in northeastern France.  All other varieties should just be called sparkling wine and not champagne, even though most of us call them champagne anyway.  

Sparkling wine is usually white (wine produced by the fermentation of the grape pulp minus the skins) or rosé (an intermediate between red and white wine), but there are also examples of red (wine produced by the fermentation of the grape pulp with the skins) sparkling wines, including the Italian sparkling red wines, Brachetto and Lambrusco.  Sparkling wine can range from dry (which is actually a technical term for wine that contains very little sugar, so it's not sweet) - also known as brut (French for "raw") - to sweet (which of course is a wine containing a lot of sugar) - also known as doux (French for "soft").    

The sparkling (fizzy) nature of sparkling wine is due to its higher content of dissolved carbon dioxide, which is produced during secondary fermentation, either in the bottle (the traditional method) or in a large stainless steel tank (the more commonly used method today).  Apparently, the effervescence or "fizziness" of certain wines was noted as far back as Ancient Greece, but the cause was often misunderstood.  Ancient winemakers thought that the presence of bubbles was due to phases of the moon or to the influence of good versus evil spirits.  

The French Benedictine monk Dom Pierre Pérignon actually did not invent the French sparkling white wine that we now call "champagne".  As a matter of fact, his superiors at the Abbey of Hautvillers once tasked him with trying to remove the fizziness from the wine, because the bottles had an unfortunate tendency to burst in the cellar.  Even though he didn't invent champagne, Dom Pérignon did a lot to perfect how champagne was made, which is why even today the champagne that bears his name is considered one of the finest brands out there.

I also recently learned another bit of trivia about champagne.  Apparently, a group of divers found a crate of 70 bottles of champagne from a 19th century shipwreck in the Baltic Sea in 2010.  The champagne was 172 years old and thought to be worth at least $4 million.  The bottles were still corked, and the champagne was perfectly preserved due to the cold temperatures in the Baltic Sea.  Imagine drinking a bottle of "shipwreck champagne" from the 19th century!

So, what is the take-home message from all of this talk about sparkling wine?  Great question!  Both the history of champagne and Dom Pérignon and the story about "shipwreck champagne" prove to me once again that some of the finest things in life were once thought to be flaws.  Sometimes, when we see a flaw, we should change our perspective.  The effervescence that makes champagne so appealing to many of us was originally thought to be a flaw.  When viewed in a different way, it became the defining characteristic of a whole family of fine wines.  I am also reminded that some of our most important advances came about as a result of mistakes and accidents.  It is true that we can learn a lot by taking risks and making mistakes.  There is no better way of learning and growing.  And every once in a while, we may find ourselves with 200-year-old champagne worth millions!

Saturday, August 9, 2025

"The clothes really do make the person..."

I was working late last Friday night and caught one of the last commuter trains back to the suburbs.  I happened to be wearing a suit that day, and one of my fellow commuters noticed and acknowledged that the two of us were "probably the only two people on the train wearing a suit that night".  He was correct - everyone else was dressed for going out for a fun Friday night.  My fellow commuter told me that whenever he worked downtown, he usually wore a suit.  I don't remember his exact words, but he said something to the effect that we should always dress for the occasion and how we dress makes an impact on how we feel and how we are perceived.

His comments reminded me of the old adage that "the clothes make the man", which is often attributed to the American author, Mark Twain.  It is true that Twain wrote the following passage in his essay "The Czar’s Soliloquy" in 1905:

[One] realizes that without his clothes a man would be nothing at all; that the clothes do not merely make the man, the clothes are the man; that without them he is a cipher, a vacancy, a nobody, a nothing… There is no power without clothes.
  
Regardless of its origin, there is now scientific evidence to suggest that how we dress truly impacts how we feel, and how we feel has an impact on how we show up, how we portray ourselves, and how we are perceived by others.  It's called "enclothed cognition", a term first used by the psychologists Hajo Adam and Adam Galinsky in a study published in the Journal of Experimental Social Psychology.  Adam and Galinsky cite other examples to support their findings, including the popular book, Dress for Success by John T. Molloy, and the television series, What Not to Wear.  They write, "...the clothes we wear have power not only over others, but also over ourselves."

In the first experiment, Adam and Galinsky randomly assigned college undergraduate students to one of two conditions - wearing a lab coat versus not wearing a lab coat.  Subjects were next asked to perform a series of selective attention tasks (known as the Stroop test), in which they had to focus on relevant stimuli while ignoring irrelevant stimuli.  Those students wearing a lab coat made about half as many errors as those who weren't wearing a lab coat.  It was almost as if wearing a lab coat (a status symbol of knowledge, authority, and expertise) increased the students' level of confidence, allowing them to successfully perform their task.
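
For readers who haven't seen a Stroop task before, here is a minimal sketch in Python of what a single trial looks like.  The color names and trial structure are illustrative assumptions on my part, not Adam and Galinsky's actual materials - the point is simply that the ink color is the relevant stimulus and the printed word is the irrelevant one.

    import random

    # A minimal, hypothetical sketch of a Stroop-style trial generator
    # (for illustration only; not the actual stimuli used in the study).
    COLORS = ["red", "green", "blue", "yellow"]

    def make_trial(congruent):
        # The ink color is the relevant stimulus; the printed word is irrelevant.
        word = random.choice(COLORS)
        ink = word if congruent else random.choice([c for c in COLORS if c != word])
        return {"word": word, "ink": ink, "correct_response": ink}

    # Half congruent (word matches ink), half incongruent (word conflicts with ink).
    trials = [make_trial(congruent=(i % 2 == 0)) for i in range(20)]
    random.shuffle(trials)

    # An error is naming the printed word instead of the ink color on an
    # incongruent trial; fewer errors means better selective attention.
    print(trials[0])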

In the second experiment, Adam and Galinsky again randomly assigned college undergraduate students, this time to one of three conditions - wearing a lab coat versus wearing a painter's coat versus seeing a lab coat.  The students were told that local officials were thinking about making certain clothes mandatory for certain professions in their area, and that one of the purposes of the study was to determine what people think about those clothes.  The interesting part about this experiment was that students in the lab coat and painter's coat groups actually wore the same coat; it was simply described as a doctor's coat to the first group and as a painter's coat to the second.  Students in the third group merely saw a lab coat displayed on a table across the room.  The students were next asked to perform a sustained attention task.  Again, students in the lab coat group were more successful at the task compared to the other two groups, consistent with Adam and Galinsky's concept of "enclothed cognition".

The results of the second experiment demonstrated that wearing a lab coat led to greater success in the sustained attention task and that this effect depended on (1) whether the clothes were worn and (2) the symbolic meaning of those clothes.  Of interest, there was no difference between the painter's coat group and the group who merely saw the doctor's coat.  In the last experiment, college undergraduate students were randomized to one of three conditions - wearing a doctor's coat versus wearing a painter's coat versus identifying with a doctor's coat.  The experimental set-up was very similar to the second experiment, except that in the "identifying with a doctor's coat" condition, students saw the doctor's coat during the entire experiment and were asked to write an essay about how the coat represented them and held personal meaning (this was to "prime" the students to closely identify with the lab coat).  Students who wore the doctor's coat still performed better on the sustained attention task; however, this time, the students who identified with the doctor's coat performed better than the students who wore the painter's coat.

I remember a fellow resident asking me, during my residency at the Naval Medical Center in San Diego several years ago, why I was wearing a lab coat over my Navy uniform in clinic.  I responded, "Because I am a doctor."  I do think that there is something to this concept known as "enclothed cognition".  I do think that we should all be "dressing for success".  Regardless of our own opinions, the clothes that we wear do have an impact, not only on how we feel, but also on how we are perceived by others.  I can't help but wonder how the recent trends towards a "casual workplace" have adversely impacted how different professions are perceived.  

Thursday, August 7, 2025

"What does not kill me makes me stronger..."

Long before American pop singer Kelly Clarkson said it, the 19th century German philosopher Friedrich Nietzsche said "What does not kill me makes me stronger."  Whenever I hear this quote, my mind goes back to my high school Physical Education teacher, who first taught me "the principle of progressive overload".  Simply stated, if you want to get better at anything, you have to push yourself past your zone of comfort.  For example, if you want to build up your muscle strength so that you can increase your maximum bench press, add 5-10 pounds to your bench press work-out every few days.  Alternatively, if your goal is to run a marathon, start out with running one long run every Saturday and add 1 mile every week.  Slowly but surely, over time, you will build up your muscle strength (so you can bench press a couple of hundred pounds) or increase your stamina to the point where you can run that marathon.
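
If it helps to see the arithmetic of progressive overload laid out, here is a tiny sketch in Python of the marathon example above.  The starting distance and target are arbitrary assumptions for illustration, not a training prescription.

    # A minimal sketch of progressive overload applied to marathon training:
    # start with a modest Saturday long run and add one mile each week.
    start_miles = 6    # assumed starting long run
    target_miles = 20  # assumed peak long run before race day

    week = 1
    miles = start_miles
    while miles <= target_miles:
        print(f"Week {week}: Saturday long run of {miles} miles")
        miles += 1
        week += 1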

Not surprisingly, "the principle of progressive overload" applies to more than just sports and exercise.  If you are afraid to speak in public, you have to challenge yourself by actually starting to speak in public.  You should start out with something relatively short, maybe giving a toast at a dinner with friends.  As you gain confidence and with further practice and experience, you can eventually challenge yourself with a speech in front of a small crowd.  

The same principle applies to leadership.  Sarah Horn starts off her recent article for Forbes magazine, "Why Discomfort Builds Better Leaders" by stating, "In today’s hyper-optimized world, comfort and convenience are often prized. But in doing so, we may evade the very experiences that enable deep leadership growth...Uncomfortable or challenging experiences teach leaders to perform under pressure, nurture teamwork in adversity, and recover quickly after failure.  This creates a virtuous cycle: mastering setbacks builds confidence and resilience, which enables faster progression and greater impact, which in turn attracts more growth opportunities."

Horn references a study by Kaitlin Woolley and Ayelet Fishbach, published in the journal Psychological Science ("Motivating personal growth by seeking discomfort").  More than two thousand participants took part in five studies in which they intentionally and actively sought out personal discomfort - by taking improvisation classes, engaging in creative writing, or even exploring alternative political viewpoints.  These study participants consistently reported greater perceived goal achievement, engagement, and personal long-term growth.  

Horn writes, "Leaders who cognitively engage with discomfort learn to understand their limits, recognize their triggers, and manage their responses when stakes are high."  When they push and challenge themselves beyond their personal zone of comfort, they learn how to deal better with uncertainty, anxiety, and fear.  She continues, "The key is intentionality. Hardship does not automatically create better leaders. However, deliberately chosen challenges – whether physical, emotional, or intellectual – can strengthen neural pathways that serve leaders in high-stakes situations."

Rather than being afraid to challenge ourselves, we should embrace the opportunity to push ourselves and learn, grow, and develop.  When we take risks and move out of our own personal comfort zone, we will likely fail.  But when we fail, we learn, grow, and develop into better leaders.  Even if she didn't say it first, Kelly Clarkson maybe said it best, "What doesn't kill you makes you stronger!"

Tuesday, August 5, 2025

The Siren's Call

Earlier this year, my wife and I attended a lecture by Chris Hayes, Emmy Award-winning host of "All In with Chris Hayes" on MSNBC.  Hayes was touring in support of his new book, The Siren's Call: How Attention Became the World's Most Endangered Resource.  I've never actually watched Chris Hayes, but I thought he was a good speaker.  I ended up reading the book, which I also enjoyed.  He wrote an article for The Atlantic based, in part, on his book, "You're Being Alienated From Your Own Attention".  Hayes claims that "Attention is a kind of resource: It has value, and if you can seize it, you seize that value."  He goes on to suggest that "Every single aspect of human life across the broadest categories of human organization is being reoriented around the pursuit of attention."

The Canadian-American journalist Robert MacNeil was perhaps best known for co-founding (with fellow journalist Jim Lehrer) the public television news program, the MacNeil/Lehrer NewsHour, which aired from 1975 to 1995 (the show has since been renamed the PBS NewsHour).  MacNeil wrote an essay in 1993 entitled "The Trouble with Television" (you can find it relatively easily on the Internet).  He raises many of the same issues that Neil Postman wrote about in his book Amusing Ourselves to Death: Public Discourse in the Age of Show Business, which I discussed in a recent post (see "Amusing Ourselves to Death...").  

MacNeil wrote, "The trouble with television is that it discourages concentration.  Almost anything interesting and rewarding in life requires some constructive, consistently applied effort...but television encourages us to apply no effort.  It sells us instant gratification.  It diverts us only to divert, to make the time pass without pain...In short, a lot of television usurps one of the most precious of all human gifts, the ability to focus your attention yourself, rather than just passively surrender it."

There is a fight for our attention.  And we are losing.  Hayes writes, "Those who successfully extract it [attention] command fortunes, win elections, and topple regimes.  The battle to control what we pay attention to at any given instant structures our inner life - who and what we listen to, how and when we are present to those we love - and our collective public lives: which pressing matters of social concern are debated and legislated, which are neglected..."  

I've caught myself in the past "doom-scrolling" through various social media sites and wasting precious time that could have been better spent on a more productive activity.  I started to find that a lot of what I was reading was garbage, which prompted me to quit both X and Facebook a few months ago (see my post "Liberation").  

I think I agree with most of the arguments that Chris Hayes makes in his book.  He admits that his job is to capture our attention, and certainly most (if not all) media today is all about capturing attention.  Like Hayes, I'm not sure that there is a straightforward and easy fix to this dilemma.  We've been traveling down this road for quite some time (hence the essay by Robert MacNeil that appeared over 30 years ago).  I think the first step is to recognize and clearly state the problem, if any, that we need to solve.  Once the problem is recognized, the next step is to begin a frank dialogue about the problem itself.  Once there, we can start talking about potential solutions.

Sunday, August 3, 2025

Has Gen X lost out when it comes to the C-suite?

The Wall Street Journal columnist Callum Borchers wrote an interesting article a few days ago entitled, "The Gen Xers Who Waited Their Turn to Be CEO Are Getting Passed Over".  It's well worth a read on your own, but the very first sentence in the article summarizes Borchers' point perfectly.  He writes, "When it comes to the C-suite, Gen X might be doomed to live up to its "forgotten generation" moniker."

Apparently, there are two trends happening simultaneously in the corporate world.  First, baby boomers are working past the traditional retirement age and staying on in their current leadership roles in the C-suite.  For example, 41.5% of chief executives of companies in the Russell 3000 are 60 years of age or older, which represents an increase from 35.1% in 2017.  As Borchers explains, many organizations have played it safe in recent years, particularly during and immediately after the COVID-19 pandemic, by either keeping their current CEOs in place or hiring experienced and/or well-established (read "older") CEOs.

Second, given the rapidity of technological change, especially with advances in computing and, in particular, artificial intelligence, companies are beginning to hire younger CEOs in their 30's and 40's.  Again, by way of example, the share of CEOs in the Russell 3000 in their 30's and 40's increased from 13.8% in 2017 to 15.1% more recently (the WSJ article includes a figure illustrating this trend). 

As Matteo Tonello from the Conference Board said, "We're starting to see a barbell phenomenon in the CEO role where Gen X is being squeezed in the middle."  Gen X is typically defined as those individuals born between 1965 and 1980.  They are starting to reach their late 50's, an age at which, at least historically, many first-time CEOs have been hired.  What's happening instead is that companies are skipping a generation and hiring younger first-time CEOs.  Borchers further notes that Gen Xers are looked upon as skilled tacticians rather than visionary leaders.  They are just not being viewed by boards as transformational leaders or rising stars with big ideas about what the future could look like.

I've previously commented on the so-called "youth movement" when it comes to head coaches in the National Football League (see my post "Youth Movement").  At that time, I also commented on the growing trend for companies outside of football to hire younger CEOs.  Similarly, Becker's Hospital Review reported last year that the average age of hospital CEOs has decreased slightly in recent years, though it still remains higher than it was in 2014.  Health care organizations are subject to the same challenges and trends that companies in the Russell 3000 encounter, so it wouldn't surprise me at all to see a growing "youth movement" with respect to hospital CEOs.  Whether this is the right or wrong approach is a decision that most hospital boards will have to make in the best interests of their organization.  

Friday, August 1, 2025

Give trust to build trust...

A few weeks ago, I wrote a post entitled "Deference to expertise builds trust..."  What's interesting is that, at least in the way that it is used in the High Reliability Organization (HRO) literature, the word deference has almost the same meaning as the word trust.  Please allow me to explain.

The Merriam-Webster Online Dictionary defines deference as a readiness or willingness to yield to the wishes of others.  By comparison, the word trust is defined in three ways as a verb - first, to give a task, duty, or responsibility to (as to "entrust"); second, to put (something) into the possession or safekeeping of another (as in "to hand"); and third, to regard as right or true (as in "to believe").  However, the word trust may also be used as a noun, as in a firm belief in the integrity, ability, effectiveness, or genuineness of someone or something (as in "confidence") or alternatively, responsibility for the safety and well-being of someone or something (as in "custody").

So, by deference then, we mean that we are placing our belief, our confidence, and our trust in someone to make the right decisions for their team(s) and organization.  We are entrusting and empowering them with taking responsibility not only for their own actions but also for the actions of their teams.  We are giving them responsibility, and with responsibility comes accountability.  It follows, then, that by entrusting (empowering) others, we are establishing an interdependence that is based on mutual respect and trust.  When we show others that they have our confidence, we in turn increase the likelihood that they will share that confidence by trusting us in return.

If you want an example that perfectly illustrates the concept of "giving trust to build trust", look no further than the "Open Prison" concept in India.  An "open prison" is one in which prisoners serve their sentences with minimal supervision and security.  Think of a prison without walls, towers, and barbed wire.  Prisoners are not even locked up in cells.  They are essentially free to come and go as they please, often leaving the prison to go to a job outside the prison during the day, only to return at night.  In some cases, their families are allowed to stay with them.  

The "open prison" concept started in the late 1950's and early 1960's in the Indian state of Rajasthan, where it remains a popular model today.  As Kavitha Yarlagadda writes (see "India's 'Open Prisons' Are a Marvel of Trust-based Incarceration"), "Designed to foster reform as opposed to punishment, the system is based on the premise that trust is contagious. It assumes — and encourages — self-discipline on the part of the prisoners. On a practical level, letting incarcerated folks go to work also allows them to earn money for themselves and their families, build skills, and maintain contacts in the outside world that can help them once they’re released."  In other words, "trust begets trust".  

Now, what does an open prison in India have to do with HROs?  I think they illustrate a key principle that is foundational to the concept of deference to expertise.  Deference to expertise is built upon mutual trust.  By giving trust, we build further trust.  Just like what happens with the open prisons in India.  "Trust begets trust, which then begets even more trust."  It's a virtuous cycle that leads to high performance teams and high reliability organizations.

Wednesday, July 30, 2025

Another alternative to VUCA...

Last December, I posted about the concept of BANI (see "Welcome to the age of chaos..."), which was proposed by the author and futurist Jamais Cascio in a blog post from April 29, 2020, "Facing the age of chaos".  Cascio wrote, "The concept of VUCA is clear, evocative, and increasingly obsolete.  We have become so thoroughly surrounded by a world of VUCA that it seems less a way to distinguish important differences than simply a depiction of our current default condition."  He then suggested that perhaps BANI was a more apt description of the constant chaos that is characteristic of the world we live in today.  Here, B=Brittle, A=Anxious, N=Non-linear, and I=Incomprehensible. 

David Magellan Horth, writing for the Center for Creative Leadership, proposed yet another VUCA alternative - RUPT (see his post, "Navigating disruption with RUPT: An alternative to VUCA").  While RUPT is also an acronym, Horth suggests that the acronym was developed with the Latin word rumpere, meaning to break or to burst, in mind.  The English words rupture and disruption are derived from the Latin rumpere.  The acronym itself stands for the following:

R = Rapid

U = Unpredictable

P = Paradoxical

T = Tangled

The acronym suggests that our world is characterized by change that is rapid (in Horth's words, overlapping like "waves emerging from different sources crashing in mid-ocean"), unpredictable (unexpected and defying prediction), paradoxical (challenging our view of the world), and tangled (as Horth describes, "everything is connected to everything else").

Perhaps we don't really need another acronym to describe the state of our world.  What's more important is Horth's suggestion about how we as leaders can navigate today's RUPT environment by:

1. Nurturing and practicing learning agility.  The CCL defines learning agility as the ability and willingness to learn from experience and subsequently apply that learning to perform successfully under new and challenging conditions.

2. Developing leadership across divides.  Here, the CCL suggests that cross-collaboration between different disciplines is incredibly important.  Diverse teams with diverse backgrounds and experiences will bring different frameworks and paradigms about the world to the table.  However, in order for these diverse teams to work effectively, leaders have to establish mutual trust, respect, and psychological safety.

3. Leveraging polarities inherent in complex challenges.  A leader's natural tendency when confronted with a new challenge is to go back to what has worked well in the past.  Here, the CCL sees new challenges not as problems to be solved, but as polarities to be managed.  They encourage leaders to shift their mindset, thinking, and decision-making from either/or to both/and.  

Monday, July 28, 2025

Stress, Aging, and Psychological Wellbeing

I came across an interesting article that was recently published in the journal Health Psychology ("Cumulative Stress and Epigenetic Aging: Examining the Role of Psychological Moderators").  The study used epigenetics to determine whether life stressors are associated with aging.  Epigenetics is the study of how genes are turned "on" or "off" without any change to the underlying DNA sequence, and how those patterns of gene expression can be passed along as cells divide (and, in some cases, from one generation to the next).  

Gene expression is regulated through slight chemical modifications to the genetic material itself (called deoxyribonucleic acid, or DNA) or to the proteins that are tightly bound to the genetic material (called histones).  For example, a small chemical group called a methyl group can be added to specific sites (usually the cytosine base) on the DNA molecule (a process called DNA methylation).  DNA methylation usually turns genes off or reduces their activity.  Factors like diet, stress, physical activity, and exposure to toxins can influence these epigenetic patterns, potentially across generations.  

Importantly, as we grow older, the pattern of epigenetic modifications to our genetic material changes in predictable ways, such that we each have an epigenetic age, so to speak.  Individuals who are exposed to chronic environmental stress accumulate more of these epigenetic changes, and when their DNA is examined closely, they appear older (from an epigenetic standpoint) than their chronologic age, a phenomenon that is called epigenetic age acceleration (EAA).  
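
To make the idea of EAA concrete, here is a minimal sketch of how it is commonly calculated: an "epigenetic clock" estimates an age from DNA methylation, that estimate is regressed on chronological age, and the leftover (the residual) is the acceleration.  The numbers below are made up purely for illustration and are not from the Health Psychology study.

    import numpy as np

    # Illustrative, made-up data: clock-estimated ("epigenetic") age vs. chronological age.
    chronological_age = np.array([34.0, 45.0, 52.0, 61.0, 70.0])
    epigenetic_age = np.array([36.0, 44.0, 58.0, 60.0, 78.0])

    # Fit epigenetic_age ~ chronological_age by ordinary least squares.
    slope, intercept = np.polyfit(chronological_age, epigenetic_age, 1)
    predicted = intercept + slope * chronological_age

    # Epigenetic age acceleration (EAA) is the residual: positive values mean a
    # person looks "older" epigenetically than expected for their chronological age.
    eaa = epigenetic_age - predicted
    print(np.round(eaa, 2))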

The present study involved over 2,000 subjects from whom sociodemographic data, cumulative life stressors, and measures of psychological wellbeing were collected, along with blood samples to measure the levels of DNA methylation and determine the epigenetic age.  As expected, higher levels of cumulative life stressors were associated with EAA.  In other words, lifelong stress appears to make us age faster.  However, this was only true for individuals with lower levels of psychological wellbeing.  In other words, individuals who scored higher on validated measures of purpose in life, environmental mastery, self-acceptance, autonomy, positive relations with others, and personal growth did not age faster (as shown by EAA), even when they had significant and cumulative life stressors.  

Psychological wellbeing refers to that state of mental health where an individual experiences positive emotions, life satisfaction, and a sense of purpose. It involves feeling good emotionally and functioning effectively in daily life.  I've posted about psychological wellbeing in the past - see in particular "Languishing and Flourishing", "The Three Dimensions of a Complete Life", and "The Five Pillars of Happiness".  As it turns out, having a positive attitude, being satisfied and content with life, and having a sense of purpose can be incredibly powerful when it comes to our physical, mental, and spiritual health.

Saturday, July 26, 2025

Today's Phaedrus moment

The ancient Greek philosopher Plato questioned whether people who used the new invention of writing would ever develop wisdom in his book Phaedrus.  The book is a dialogue between Socrates and the Athenian aristocrat Phaedrus.  While they discuss the topic of love, they eventually turn to the nature of rhetoric and, in particular, the subject of writing.  Socrates tells a brief legend of the Egyptian god Theuth, who offered the gift of writing to King Thamus, who was in turn supposed to give writing to the people of Egypt.  Here is the conversation between Theuth and King Thamus (in the words of Socrates, of course):

Theuth: This will make the Egyptians wiser and give them better memories; it is a specific both for the memory and for the wit. 

Thamus: O most ingenious Theuth, the parent or inventor of an art is not always the best judge of the utility or inutility of his own inventions to the users of them. And in this instance, you who are the father of letters, from a paternal love of your own children have been led to attribute to them a quality which they cannot have; for this discovery of yours will create forgetfulness in the learners' souls, because they will not use their memories; they will trust to the external written characters and not remember of themselves. The specific which you have discovered is an aid not to memory, but to reminiscence, and you give your disciples not truth, but only the semblance of truth; they will be hearers of many things and will have learned nothing; they will appear to be omniscient and will generally know nothing; they will be tiresome company, having the show of wisdom without the reality.

In other words, Thamus believes that the gift of writing will have the opposite effects to what Theuth intended.  Rather than helping them to remember, the ability to write their thoughts down will teach them how to forget.  They will lose the ability to remember things unless they write them down.  Writing will become a crutch.  And the people will suffer for it.  

One could certainly argue that Socrates makes a legitimate point here.  Before calculators, students would have to memorize their "math facts" in school.  I remember having to recite my multiplication tables during elementary school.  Later, I had to master multiplication of two- and three-digit numbers, as well as long division.  Now that calculators are so widely available, there is a concern that students aren't learning their "math facts" as well as they did in the past and that this may impact their ability to do more complex math problems later on (these concerns may be unfounded).  

I've already posted about how the writers Nicholas Carr and Jonathan Haidt think that the Internet and our ability to access information on the Internet via our smart devices have made us all dumb or even "uniquely stupid" (see my posts, "Are smart phones making us dumb?" and "Why the past 10 years of American life have been uniquely stupid...").  More recently (see "AI is the elevator..."), I've posted about how the blogger Arshitha S. Ashok thinks that AI is making us dumb.  The principles here are the same.  If you write something down in order to remember it, you lose the ability to memorize things.  If you use a calculator all the time, you forget your "math facts".  If you are always searching for answers on the Internet, you again lose the ability to remember things.  And finally, if you are using AI to do your work for you, your skills at completing a particular task will deteriorate.

These are legitimate concerns, even if they haven't necessarily been proven true, at least not yet.  Given these concerns, perhaps the better question to ask is whether individuals who use AI will somehow pay a penalty for doing so.  In other words, will individuals who use AI at work be perceived as lazy, unmotivated, or even unintelligent?  Jessica Reif, Richard Larrick, and Jack Soll asked this exact question in a study that was published last year in the Proceedings of the National Academy of Sciences ("Evidence of a social evaluation penalty for using AI").  They conducted a series of four small studies and found that (1) people who use AI believe that they will be evaluated as lazier, less competent, and less diligent than those who don't use AI; (2) observers do, in fact, perceive people who use AI as lazier, less competent, and less diligent; (3) even managers who use AI themselves are less likely to hire job applicants who use AI; because (4) they perceive these workers as lazier and less competent.

Admittedly, a lot has happened since this study was first published.  Most notable is the release of ChatGPT by OpenAI and the seemingly overnight explosion of ChatGPT use by just about everyone for just about anything.  I wonder if the results would be similar if the study were repeated today.  More importantly, the study did not determine whether people who use AI tools are indeed lazier, less competent, or less diligent.  It merely showed that they are perceived as such.  Future studies will hopefully answer these questions and more.  For now, we are left with concern and speculation about the impact of technological progress that goes as far back as antiquity.  AI would appear to be today's Phaedrus moment...

Thursday, July 24, 2025

Amusing ourselves to death...

As I've mentioned a few times in the past (see "Hell keeps freezing over..." and "I can't tell you why..."), I am a huge fan of the rock-n-roll band, The Eagles.  After the band first broke up (some thought for good) in 1980, lead singer, co-founder, and drummer Don Henley embarked on a solo career, releasing his first album "I Can't Stand Still" in 1982.  The second hit single from the album was "Dirty Laundry", which peaked at number 3 on the Billboard Hot 100 that same year.  It was a great song about sensationalism in the media:

We got the bubble-headed bleached-blonde, comes on at five.  
She can tell you 'bout the plane crash with a gleam in her eye
It's interesting when people die
Give us dirty laundry.

Well, it was exactly that lyric that kept popping into my mind when I read Amusing Ourselves to Death: Public Discourse in the Age of Show Business by the culture critic, author, and educator Neil Postman.  Postman died in 2003, so the book is a little old.  Surprisingly though, it is not outdated!  He focuses upon how television, the most important form of mass media at the time, has fundamentally changed how we view the world.  News has become entertainment.  What I found interesting was how he said our contemporary world (and I think his comments are just as true today as they were when the book first came out in 1985) was better reflected by Aldous Huxley's novel Brave New World, where the public is oppressed by their addiction to entertainment and pleasure, as opposed to George Orwell's novel 1984, in which the public is oppressed by the state.  Television has become our soma, the pleasure drug that Huxley's characters use to keep themselves placated.  

As Terence Moran wrote in his 1984 essay, "Politics 1984: That's Entertainment", "Orwell was wrong...The dominant metaphor for our own 1984 is not Orwell's image of a boot stamping down on the race of humanity but the magical and instantaneous solutions to all our problems through technology...In this technological society, we have replaced freedom with license, dignity with position, truth with credibility, love with gratification, justice with legality, and ideas with images."  

Postman builds upon Moran's essay and particularly criticizes the news media and what he calls the "Now...this" culture that it has created.  Echoing Don Henley's "Dirty Laundry", Postman writes that "...many newscasters do not appear to grasp the meaning of what they are saying, and some hold to a fixed and ingratiating enthusiasm as they report on earthquakes, mass killings, and other disasters...the viewers also know that no matter how grave any fragment of news may appear...it will shortly be followed by a series of commercials that will, in an instant, defuse the import of the news."

Postman also talks about the breakdown of trust in society, again largely placing the blame on television as the principal source of information in society, at least back then.  He writes, "The credibility of the teller is the ultimate test of the truth of a proposition.  'Credibility' here does not refer to the past record of the teller for making statements that have survived the rigors of reality-testing. It refers only to the impression of sincerity, authenticity, vulnerability, or attractiveness (choose one or more) conveyed by the actor/reporter...This is a matter of considerable importance, for it goes beyond the question of how truth is perceived on television news shows.  If on television, credibility replaces reality as the decisive test of truth-telling, political leaders need not trouble themselves very much with reality provided that their performances consistently generate a sense of verisimilitude."  

What is true of the television news reporter is unfortunately even more true of the politician.  Postman laments the fact that politics has focused upon the appearance of sincerity and authenticity (read here "attractiveness") as opposed to actually telling the truth.  He goes on to describe, in words that are eerily reminiscent of today's Internet, television as "...altering the meaning of 'being informed' by creating a species of information that might properly be called disinformation...Disinformation does not mean false information.  It means misleading information - misplaced, irrelevant, fragmented, or superficial information - information that creates the illusion of knowing something but which in fact leads one away from knowing it."

As he goes on to compare and contrast today's society with the dystopian novels of both Aldous Huxley and George Orwell (both of which I had to read in high school), he writes, "Censorship, after all, is the tribute tyrants pay to the assumption that a public knows the difference between serious discourse and entertainment - and cares."  In the Orwellian universe, the public falls victim to state oppression through censorship.  However, in order for censorship to be meaningfully effective, the public has to (1) know the difference between serious discourse and entertainment and (2) more importantly, care that there is a difference.  In Huxley's universe, the public neither knows the difference nor cares about it.  Postman suggests that Huxley's world is the world in which we live today.

I can only imagine what Neil Postman would think about what is happening in our world today.  Social media has taken over as the source of information for most Americans - certainly those in the younger generations.  Disinformation no longer just seems to be the norm; it is the norm.  We have become what Postman perhaps most feared.  Our world has become more like Huxley's than Postman could have ever known.

Tuesday, July 22, 2025

QWERTY

As I have shared previously, I tend to buy more books than I can read (see my two posts "Today's word is...Tsundoku" and "Anti-Library").  My wife is of course supportive, but she once asked why I just didn't check out books from our local public library instead of buying them on Amazon.  Now I have a stack of library books on my nightstand!  

I finished a book a few months ago that I am almost 100% sure that I first purchased during the COVID-19 pandemic - Jared Diamond's Pulitzer Prize-winning book, Guns, Germs, and Steel.  I really enjoyed it, and now I am ready to read his next one (which, of course, is also sitting on my bookshelf).  The theme of the book can be summarized with one simple question - "Why did history take a different course on different continents?"  Diamond begins his detailed answer and explanation with a simple story about the invention of the typewriter.  He claims that the original keyboard that is widely used today (called the "QWERTY" keyboard, because the first keys on the top left are the letters Q, W, E, R, T, and Y) came about as a result of "anti-engineering" when first designed in 1873.

Diamond writes, "QWERTY...employs a whole series of perverse tricks designed to force typists to type as slowly as possible, such as scatter­ing the commonest letters over all keyboard rows and concentrating them on the left side (where right-handed people have to use their weaker hand). The reason behind all of those seemingly counterproductive features is that the typewriters of 1873 jammed if adjacent keys were struck in quick suc­cession, so that manufacturers had to slow down typists."

The very first commercially successful typewriter was called the Sholes and Glidden typewriter (also known as the Remington No. 1), as it was first designed by the American inventors Christopher Latham Sholes, Samuel W. Soule, James Densmore, and Carlos S. Glidden.  Their design was purchased in 1873 by E. Remington and Sons, ironically enough a firearms manufacturer (perhaps the pen is mightier than the sword).  Whenever a letter key was pressed on this early model (and most models that subsequently followed), the corresponding type-bar (which looked like a hammer with a letter on the end) swung upwards, striking an inked ribbon and pressing the letter onto the paper.  The paper was held on a rotating cylinder that moved incrementally after each keystroke, allowing for sequential typing.  If the typist hit the keys too quickly, the type-bars would get tangled and the typewriter would jam.  The QWERTY arrangement of keys reduced the likelihood that the type-bars would jam, by placing commonly used combinations of letters farther from each other inside the machine.  At least that is how the story supposedly went.

Fast forward to the 1930's, when improvements in the design of the typewriter had eliminated (or at least significantly reduced) the risk of jamming.  Alternative key layouts were proposed that promised a significant increase in typing speed (almost doubling the number of words that could be typed per minute).  For example, August Dvorak patented his Dvorak keyboard, which not only increased typing speed, but also reduced repetitive strain injuries because it was much more comfortable.  
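
As a rough way to see what Diamond is describing, here is a short, hypothetical sketch in Python that tallies how much of an ordinary English sentence falls on the home row and on the left hand under the QWERTY and Dvorak letter placements.  The sample text is arbitrary, and the tally ignores punctuation, capital letters, and finger travel - it is only meant to illustrate how differently the two layouts distribute common letters.

    from collections import Counter

    # Letter placement only (numbers, punctuation, and the shift keys are ignored).
    LAYOUTS = {
        "QWERTY": {"home": set("asdfghjkl"), "left": set("qwertasdfgzxcvb")},
        "Dvorak": {"home": set("aoeuidhtns"), "left": set("pyaoeuiqjkx")},
    }

    SAMPLE = (
        "the quick brown fox jumps over the lazy dog and then settles down "
        "to rest in the tall green grass near the old stone bridge"
    )

    letters = [c for c in SAMPLE if c.isalpha()]
    counts = Counter(letters)
    total = sum(counts.values())

    for name, keys in LAYOUTS.items():
        home = sum(n for c, n in counts.items() if c in keys["home"]) / total
        left = sum(n for c, n in counts.items() if c in keys["left"]) / total
        print(f"{name}: {home:.0%} of keystrokes on the home row, {left:.0%} typed with the left hand")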

Again, Diamond writes, "When improve­ments in typewriters eliminated the problem of jamming, trials in 1932 with an efficiently laid-out keyboard showed that it would let us double our typing speed and reduce our typing effort by 95 percent. But QWERTY keyboards were solidly entrenched by then. The vested interests of hundreds of millions of QWERTY typists, typing teachers, typewriter and computer salespeople, and manufacturers have crushed all moves toward keyboard efficiency for over 60 years." 

Diamond used the QWERTY analogy to illustrate how history is often shaped by serendipity.  In other words, some chance event leads to an eventual outcome that is unexpected, unforeseen, and unplanned.  The economists Paul David (see "Clio and the Economics of QWERTY") and Brian Arthur ("Competing technologies, increasing returns, and lock-in by historical events") have used the QWERTY story to talk about the concepts of path-dependence ("history matters") and increasing returns ("an increase in input results in a proportionally larger increase in output"), respectively.

It's a great story.  Unfortunately, it's a somewhat controversial one.  I would also recommend taking a look at an article by Stan Liebowitz and Stephen Margolis, "The Fable of the Keys" and Peter Lewin's article "The market process and the economics of QWERTY: Two views" for a balanced argument.  

I'm not here to dispel any myths or provide a counterclaim to the QWERTY story.  If I were to be 100% honest, I'd like to believe the story as presented by Jared Diamond (although I don't think he was the first to make the case).  What is not controversial is the fact that almost every keyboard in use today is based upon the original QWERTY lay-out.  It would be hard to change at this point.  Whether you call it "first-mover advantage", "path-dependence", "network effects", or "increasing returns" probably doesn't matter.  I don't see the QWERTY lay-out being replaced anytime soon.

Sunday, July 20, 2025

"AI is the elevator..."

I want to re-visit two posts from this past year.  The first, "Are smart phones making us dumb?" talks about the journalist, Nicholas Carr, who wrote an article for The Atlantic in 2008 entitled, "Is Google Making Us Stupid?"  Carr further explored this theme in his book, The Shallows: What the Internet Is Doing to Our Brains, suggesting that our online reading habits have changed not only how we read, but also how we think.  The second post ("Why the past 10 years of American life have been uniquely stupid...") was based on an essay that the writer Jonathan Haidt (perhaps most famous for his incredibly insightful book, The Anxious Generation) wrote in The Atlantic in 2022, "Why the past 10 years of American life have been uniquely stupid".  Haidt in particular writes about the dangers of social media and the adverse impact that social media has had upon society today.

I think both Carr and Haidt have an important message that should be widely shared.  However, in today's post I want to build upon their theme with a particular focus on artificial intelligence (AI).  You've probably heard a lot about AI lately.  Chances are, you've probably used some form of AI in the last 30 minutes!  Keeping with today's theme, the blogger Arshitha S. Ashok recently wrote an excellent post on Medium that asked the question, "Is AI Making Us Dumb?"  Ashok opens her post by writing, "The human brain has always adapted remarkably well to technology.  But what happens when the technology starts doing the thinking for us?"

It's a great question.  Ashok provides an excellent example with GPS and Google Maps.  When was the last time that you actually used an old-fashioned paper map to find where you were going?  I can't even remember.  It's so easy to type a location, address, or store name into a smart phone app and follow the directions that old-fashioned maps have become all but obsolete.  Unfortunately, the ease of GPS navigation comes at a cost: we have lost the ability to read maps.  If we ever have to go back to the "old days" without GPS navigation, we are going to be in big, big trouble.  Can you imagine what would happen if London's hackney cab drivers switched to GPS navigation?

Apps have become ubiquitous, and they have made our lives easier.  But at what cost?  Have we lost important skills that will be necessary in the future?  Just think about the lost art of cursive writing and how students today can't read anything written in cursive (never mind that just about everything written prior to the 21st century was written in cursive).

But so far, I've only talked about computer applications that are supposed to make our lives easier.  What happens when machines start to think for us?  Well, guess what?  We are already there.  I can't tell you how many people I know who use ChatGPT to write business correspondence, letters of recommendation, PowerPoint presentations, and more.  Many hospitals are now using AI scribes to document patient encounters in the electronic medical record.

Don't get me wrong.  I'm not being a Luddite (see John Cassidy's recent article in The New Yorker, "How to survive the A.I. revolution," for more).  As Andrew Maynard writes in Fast Company (see "The true meaning of the term Luddite"), "...questioning technology doesn't mean rejecting it."  Just because I question whether using AI and technology has long-term adverse effects doesn't necessarily mean that I don't support using technology.

The problem is that there is now evidence to suggest that using AI comes with a cost.  Michael Gerlich ("AI tools in society: Impacts on cognitive offloading and the future of critical thinking") found a negative correlation between frequent AI tool usage and critical thinking abilities, mediated by increased cognitive offloading.  Just as we have lost the ability to read an old-fashioned map because we rely on Google Maps, our brains have grown accustomed to letting AI tools analyze, evaluate, and synthesize information rather than doing that work ourselves.  As the saying goes, "Use it or lose it!"  It's as if the brain were a muscle - the less we use it, the weaker it gets.

Similarly, a group of MIT researchers ("Your Brain on ChatGPT: Accumulation of cognitive debt when using an AI assistant for essay writing task") used brain mapping technology to show that individuals who use ChatGPT to write essays have lower brain activity!  The study divided 54 subjects between the ages of 18 and 39 years into three groups and asked them to write several essays using OpenAI's ChatGPT, Google's search engine, or their own intellect alone.  The ChatGPT users had the lowest brain engagement and "consistently underperformed at neural, linguistic, and behavioral levels" compared to the other two groups.  Not surprisingly, over the course of the study, which lasted several months, the ChatGPT users got lazier with each subsequent essay, often resorting to copy-and-paste by the end.  Now, it's important to realize that this was a small study that hasn't yet gone through peer review (in other words, it hasn't been published in a scientific journal).  Regardless, it will be important to see further research in this area.

Whether frequent cognitive offloading to AI technology will result in lasting changes in brain activity remains to be seen.  However, the evidence so far is fairly concerning.  The college physics professor Rhett Allain put it best: "AI is the elevator, thinking is taking the stairs."  If you take the elevator all the time, you won't be in good enough shape to ever take the stairs again...

Friday, July 18, 2025

Fourteen wolves

I recently came across one of those social media posts that I thought was worth sharing (mostly because the story is actually true this time).  The post used the 1995 reintroduction of wolves to Yellowstone National Park to emphasize how we, as leaders, can fix broken systems and broken organizations.  Yellowstone was the world's first national park.  As an aside, contrary to popular belief, the law that created Yellowstone National Park was signed by President Ulysses S. Grant, not President Theodore Roosevelt!  Gray wolves were an important part of the Yellowstone ecosystem, though that was unfortunately not recognized until much, much later.

The state of Montana instituted a wolf bounty in 1884, in which trappers would receive one dollar (a lot of money at that time) per wolf killed.  Wolves were considered a menace to the herds of elk, deer, mountain sheep, and antelope, and over the next 25-50 years, there was a concerted effort to exterminate wolves in Yellowstone National Park and the surrounding area.  By the 1940's and 1950's, wolf sightings at Yellowstone were quite rare.  The extermination efforts had been successful.

Unfortunately, once the wolves disappeared, conditions at Yellowstone National Park drastically changed - for the worse.  In the absence of a predator, the elk population exploded.  Overgrazing led to a dramatic die-off of grasses and tree species such as aspen and cottonwood, as well as to soil erosion.  The National Park Service responded by trying to limit the elk population with hunting, trapping, and other methods.  Over the next several years, the elk population plummeted.  Hunters began to complain to their representatives in Congress, and the Park Service stopped trying to control the elk population.

Once the elk population rebounded, the same overgrazing issues returned.  Other animal populations in the park were adversely impacted as well.  Coyote populations increased, which in turn hurt the antelope population.  If this sounds a lot like my posts "For want of a nail..." and "Butterfly Wings and Stone Heads", there's a good reason.  The entire history of the Yellowstone gray wolf is a great example of complexity theory and complex adaptive systems.  I am also reminded of the famous "law of unintended consequences".

Fast forward to 1974, when the gray wolf was listed under the Endangered Species Act.  Gray wolves became a protected species, which subsequently led to attempts at re-introducing them into the wild.  A project to re-introduce the gray wolf to Yellowstone and the surrounding region was first proposed in 1991, and a more definitive plan was developed and made available for public comment in 1994.  In January 1995, fourteen wolves arrived from Canada in two shipments and were transferred to Yellowstone National Park.  After a period of acclimation, the wolves were released into the wild.  Seventeen more gray wolves were brought to Yellowstone in January 1996.  The population of wolves in Yellowstone National Park recovered, and importantly, as of April 26, 2017, gray wolves were removed from the list of endangered species in Montana, Idaho, and Wyoming.

The most recent estimates suggest that the population of gray wolves at Yellowstone has increased to between 90 and 110 wolves in the park (with a total of about 500 wolves in the surrounding region).  Just as important, the local elk population has stabilized, and as a result, the native flora and fauna of Yellowstone National Park have returned.  The population of coyotes has fallen to "sustainable levels," with similarly beneficial effects on the rest of the ecosystem.  The story of the Yellowstone wolves is a remarkable one.

Aside from being yet another great example of complex adaptive systems, the wolf story is a powerful metaphor for organizational health.  As Olaf Boettger says in his LinkedIn post "What 14 wolves taught me about fixing broken systems...", "Everything connects to everything else as a system."  Just as important, "Sometimes the thing that's missing is simple."  Find the gray wolf in your organization to fix the entire ecosystem.

Wednesday, July 16, 2025

The Quiet Commute

My wife and I took the Red Line "L" train to a Chicago White Sox game this past weekend.  It took us almost an hour to get there, so we definitely had time to "people watch."  Both of us noticed two college athletes (they were wearing T-shirts with their college's name, and I could read the name tags on their backpacks) who were obviously together and headed someplace fun.  Both were wearing headphones, and both spent the entire ride staring intently at their smart phones.  I don't think they said one word to each other.

I've been using public transportation a lot lately for my work commute.  Just as in the experience above, I've noticed that most people stare down at their smart phones and rarely converse with their fellow commuters.  In full disclosure, I don't engage in conversation with my fellow commuters either.  I usually bring a book to read, and I often sit on the upper train level, because it is quiet and the single seats allow me to keep to myself.

Now, based on a few of my more recent posts blaming everything that is wrong in our world on social media ("Liberation"), smart phones ("Are smart phones making us dumb?"), or the Internet ("Why the past 10 years of American life have been uniquely stupid..."), you're probably thinking this is going to be another anti-technology rant!  Not so!  I am going to let you come to your own conclusions this time.  I just want to point out that this issue of self-imposed isolation isn't new.

As it turns out, back in 1946, the American filmmaker and photographer Stanley Kubrick (who directed or produced such hits as Spartacus, Lolita, Dr. Strangelove, 2001: A Space Odyssey, A Clockwork Orange, The Shining, and Full Metal Jacket) was a staff photographer for Look magazine and set out to photograph New York City's subway commuters.  His photographs were later published in a pictorial series entitled "Life and Love on the New York City Subway".  As you can see in the photo below, times haven't really changed much in the last 79 years.  Instead of reading a magazine or newspaper, commuters now read their iPads and smart phones, listen to music, or work on their laptop computers.
[Photo: Stanley Kubrick's subway commuters, from "Life and Love on the New York City Subway"]

I'm not going to say whether it's right or wrong that people spend most of their time looking at their smart phones instead of interacting with each other.  I will let you be the judge of that (though I think I've made my own opinion clear in previous posts).  However, to say that our tendency to ignore what is going on around us is a new phenomenon, or even a generational difference, is completely false.  If you wish to argue that smartphones have made these tendencies worse, then I completely agree!  The so-called "quiet commute" is not new, but it's definitely worse.