Thursday, July 24, 2025

Amusing ourselves to death...

As I've mentioned a few times in the past (see "Hell keeps freezing over..." and "I can't tell you why..."), I am a huge fan of the rock-and-roll band the Eagles.  After the band first broke up (some thought for good) in 1980, co-founder, drummer, and lead singer Don Henley embarked on a solo career, releasing his first album, "I Can't Stand Still", in 1982.  The second hit single from the album was "Dirty Laundry", which peaked at number 3 on the Billboard Hot 100.  It was a great song about sensationalism in the media:

We got the bubble-headed bleached-blonde, comes on at five.  
She can tell you 'bout the plane crash with a gleam in her eye
It's interesting when people die
Give us dirty laundry.

Well, it was exactly that lyric that kept popping into my mind when I read Amusing Ourselves to Death: Public Discourse in the Age of Show Business by the culture critic, author, and educator Neil Postman.  Postman died in 2003, so the book is a little old.  Surprisingly, though, it is not outdated!  He focuses upon how television, the most important form of mass media at the time, has fundamentally changed how we view the world.  News has become entertainment.  What I found interesting was how he argued that our contemporary world (and I think his comments are just as true today as they were when the book first came out in 1985) was better reflected by Aldous Huxley's novel Brave New World, where the public is oppressed by its addiction to entertainment and pleasure, as opposed to George Orwell's novel 1984, in which the public is oppressed by the state.  Television has become our soma, Huxley's "opiate of the masses".

As Terence Moran wrote in his 1984 essay, "Politics 1984: That's Entertainment", "Orwell was wrong...The dominant metaphor for our own 1984 is not Orwell's image of a boot stamping down on the race of humanity but the magical and instantaneous solutions to all our problems through technology...In this technological society, we have replaced freedom with license, dignity with position, truth with credibility, love with gratification, justice with legality, and ideas with images."

Postman builds upon Moran's essay and particularly criticizes the news media and what he calls the "Now...this" culture that it has created.  Echoing Don Henley's "Dirty Laundry", Postman writes that "...many newscasters do not appear to grasp the meaning of what they are saying, and some hold to a fixed and ingratiating enthusiasm as they report on earthquakes, mass killings, and other disasters...the viewers also know that no matter how grave any fragment of news may appear...it will shortly be followed by a series of commercials that will, in an instant, defuse the import of the news."

Postman also talks about the breakdown of trust in society, again largely placing the blame on television as the principal source of information in society, at least back then.  He writes, "The credibility of the teller is the ultimate test of the truth of a proposition.  'Credibility' here does not refer to the past record of the teller for making statements that have survived the rigors of reality-testing. It refers only to the impression of sincerity, authenticity, vulnerability, or attractiveness (choose one or more) conveyed by the actor/reporter...This is a matter of considerable importance, for it goes beyond the question of how truth is perceived on television news shows.  If on television, credibility replaces reality as the decisive test of truth-telling, political leaders need not trouble themselves very much with reality provided that their performances consistently generate a sense of verisimilitude."  

What is true of the television news reporter is unfortunately even more true of the politician.  Postman laments the fact that politics has focused upon the appearance of sincerity and authenticity (read here "attractiveness") as opposed to actually telling the truth.  He goes on to describe, in words that are eerily reminiscent of today's Internet, television as "...altering the meaning of 'being informed' by creating a species of information that might properly be called disinformation...Disinformation does not mean false information.  It means misleading information - misplaced, irrelevant, fragmented, or superficial information - information that creates the illusion of knowing something but which in fact leads one away from knowing it."

As he goes on to compare and contrast today's society with the dystopian novels of both Aldous Huxley and George Orwell (both of which I had to read in high school), he writes, "Censorship, after all, is the tribute tyrants pay to the assumption that a public knows the difference between serious discourse and entertainment - and cares."  In the Orwellian universe, the public falls victim to state oppression through censorship.  However, in order for censorship to be meaningfully effective, the public has to (1) know the difference between serious discourse and entertainment and (2) more importantly, care that there is a difference.  In Huxley's universe, the public neither knows the difference nor cares about it.  Postman suggests that Huxley's world is the world in which we live today.

I can only imagine what Neil Postman would think about what is happening in our world today.  Social media has taken over as the source of information for most Americans - certainly those in the younger generations.  Disinformation no longer just seems to be the norm; it is the norm.  We have become what Postman perhaps most feared.  Our world has become more like Huxley's than Postman could have ever known.

Tuesday, July 22, 2025

QWERTY

As I have shared previously, I tend to buy more books than I can read (see my two posts "Today's word is...Tsundoku" and "Anti-Library").  My wife is of course supportive, but she once asked why I didn't just check out books from our local public library instead of buying them on Amazon.  Now I have a stack of library books on my nightstand!

I finished a book a few months ago that I am almost 100% sure that I first purchased during the COVID-19 pandemic - Jared Diamond's Pulitzer Prize-winning book, Guns, Germs, and Steel.  I really enjoyed it, and now I am ready to read his next one (which, of course, is also sitting on my bookshelf).  The theme of the book can be summarized with one simple question - "Why did history take a different course on different continents?"  Diamond begins his detailed answer and explanation with a simple story about the invention of the typewriter.  He claims that the original keyboard layout that is still widely used today (called the "QWERTY" keyboard, because the first six keys on the top left are the letters Q, W, E, R, T, and Y) came about as a result of "anti-engineering" when it was first designed in 1873.

Diamond writes, "QWERTY...employs a whole series of perverse tricks designed to force typists to type as slowly as possible, such as scattering the commonest letters over all keyboard rows and concentrating them on the left side (where right-handed people have to use their weaker hand). The reason behind all of those seemingly counterproductive features is that the typewriters of 1873 jammed if adjacent keys were struck in quick succession, so that manufacturers had to slow down typists."

The very first commercially successful typewriter was called the Sholes and Glidden typewriter (also known as the Remington No. 1), as it was first designed by the American inventors Christopher Latham Sholes, Samuel W. Soule, James Densmore, and Carlos S. Glidden.  Their design was purchased in 1873 by E. Remington and Sons, ironically enough a firearms manufacturer (perhaps the pen is mightier than the sword).  Whenever a letter key was pressed on this early model (and most models that subsequently followed), the corresponding type-bar (which looked like a hammer with a letter on the end) swung upwards, striking an inked ribbon and pressing the letter onto the paper.  The paper was held on a rotating cylinder that moved incrementally after each keystroke, allowing for sequential typing.  If the typist struck the keys too quickly, the type-bars would get tangled and the typewriter would jam.  The QWERTY arrangement of keys reduced the likelihood that the type-bars would jam by placing commonly used combinations of letters farther from each other inside the machine.  At least that is how the story supposedly goes.

Fast forward to the 1930's, when improvements in the design of the typewriter eliminated the risk of jamming (or at least significantly reduced the risk).  The layout of the keys was changed, resulting in a significant increase in typing speed (almost doubling the number of words that could be typed per minute).  For example, August Dvorak patented his Dvorak keyboard, which not only increased the typing speed, but also reduced repetitive strain injuries because it was much more comfortable.  

Again, Diamond writes, "When improve­ments in typewriters eliminated the problem of jamming, trials in 1932 with an efficiently laid-out keyboard showed that it would let us double our typing speed and reduce our typing effort by 95 percent. But QWERTY keyboards were solidly entrenched by then. The vested interests of hundreds of millions of QWERTY typists, typing teachers, typewriter and computer salespeople, and manufacturers have crushed all moves toward keyboard efficiency for over 60 years." 

Diamond used the QWERTY analogy to explain how history may often be explained by serendipity.  In other words, some chance event leads to an eventual outcome that is unexpected, unforeseen, and unplanned.  The economists Paul David (see "Clio and the Economics of QWERTY") and Brian Arthur ("Competing technologies, increasing returns, and lock-in by historical events") have used the QWERTY story to illustrate the concepts of path-dependence ("history matters") and increasing returns (the more a technology is adopted, the more attractive adopting it becomes), respectively.
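To make the ideas of path-dependence and increasing returns a little more concrete, here is a small, purely illustrative simulation (my own sketch, not anything taken from Diamond, David, or Arthur).  Each new "typist" chooses between two hypothetical keyboard layouts in proportion to how many people already use each one, so whichever layout happens to get ahead early tends to stay ahead, even when the other layout has a modest built-in advantage:

```python
import random

def simulate_adoption(n_adopters=10_000, advantage=0.01, seed=None):
    """Toy model of increasing returns to adoption (a Polya-urn-like process).

    Each new adopter picks layout A or B with probability proportional to the
    current share of existing adopters, so early chance adoptions are amplified
    until one layout effectively locks in.  The `advantage` parameter gives B a
    small intrinsic edge (think of a more efficient layout) that can still be
    swamped by the feedback loop.  All names and numbers here are illustrative.
    """
    rng = random.Random(seed)
    counts = {"A": 1, "B": 1}  # seed each layout with a single adopter
    for _ in range(n_adopters):
        share_b = counts["B"] / (counts["A"] + counts["B"])
        p_b = min(1.0, share_b + advantage)  # current shares dominate the choice
        counts["B" if rng.random() < p_b else "A"] += 1
    return counts

if __name__ == "__main__":
    for seed in range(5):
        result = simulate_adoption(seed=seed)
        winner = max(result, key=result.get)
        print(f"run {seed}: {result} -> layout '{winner}' locks in")
```

The point of the sketch is simply that small, early accidents of history get amplified by the adoption feedback loop, which is the "lock-in by historical events" that Arthur describes.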

It's a great story.  Unfortunately, it's a somewhat controversial one.  I would also recommend taking a look at an article by Stan Liebowitz and Stephen Margolis, "The Fable of the Keys", and Peter Lewin's article "The market process and the economics of QWERTY: Two views" for a more balanced view.

I'm not here to dispel any myths or provide a counterclaim to the QWERTY story.  If I were to be 100% honest, I'd like to believe the story as presented by Jared Diamond (although I don't think he was the first to make the case).  What is not controversial is the fact that almost every keyboard in use today is based upon the original QWERTY layout.  It would be hard to change at this point.  Whether you call it "first-mover advantage", "path-dependence", "network effects", or "increasing returns" probably doesn't matter.  I don't see the QWERTY layout being replaced anytime soon.

Sunday, July 20, 2025

"AI is the elevator..."

I want to revisit two posts from this past year.  The first, "Are smart phones making us dumb?", talks about the journalist Nicholas Carr, who wrote an article for The Atlantic in 2008 entitled "Is Google Making Us Stupid?"  Carr further explored this theme in his book, The Shallows: What the Internet Is Doing to Our Brains, suggesting that our online reading habits have changed not only how we read, but also how we think.  The second post ("Why the past 10 years of American life have been uniquely stupid...") was based on an essay that the writer Jonathan Haidt (perhaps most famous for his incredibly insightful book, The Anxious Generation) wrote in The Atlantic in 2022, "Why the past 10 years of American life have been uniquely stupid".  Haidt in particular writes about the dangers of social media and the adverse impact that it has had upon society today.

I think both Carr and Haidt have an important message that should be widely shared.  However, in today's post I want to build upon their theme with a particular focus on artificial intelligence (AI).  You've probably heard a lot about AI lately.  Chances are, you've used some form of AI in the last 30 minutes!  Keeping with today's theme, the blogger Arshitha S. Ashok recently wrote an excellent post on Medium that asked the question, "Is AI Making Us Dumb?"  Ashok opens her post by writing, "The human brain has always adapted remarkably well to technology.  But what happens when the technology starts doing the thinking for us?"

It's a great question.  Ashok provides an excellent example with GPS and Google Maps.  When was the last time that you actually used an old-fashioned map to find where you were going?  I can't even remember the last time.  It's so easy these days to just type a location, address, or name of a store into a smart phone app and follow the directions that old-fashioned maps have become all but obsolete.  Unfortunately, the ease of GPS navigation comes at a cost.  We have lost the ability to read maps.  If we ever have to go back to the "old days" without GPS navigation, we are going to be in big, big trouble.  Can you imagine what would happen if London's hackney cab drivers, who famously have to memorize "the Knowledge", switched to GPS navigation?

Apps have become ubiquitous, and they have made our lives easier.  But at what cost?  Have we lost important skills that will be necessary in the future?  Just think about the lost art of cursive writing and how students today can't read anything in cursive (never mind that just about everything written prior to the 21st century was written in cursive).

But so far, I've just talked about computer applications that are supposed to make our lives easier.  What happens when machines start to think for us?  Well, guess what?  We are there.  I can't tell you how many people I know who use ChatGPT to write business correspondence, letters of recommendation, PowerPoint presentations, etc.  Many hospitals are now using AI as scribes to document patient encounters in the electronic medical record.

Don't get me wrong.  I'm not being a Luddite (see John Cassidy's recent article in The New Yorker, "How to survive the A.I. revolution", for more).  As Andrew Maynard writes in Fast Company (see "The true meaning of the term Luddite"), "...questioning technology doesn't mean rejecting it."  Just because I question whether using AI and technology has long-term adverse effects doesn't necessarily mean that I don't support using technology.

The problem is that there is now evidence to suggest that using AI comes with a cost.  Michael Gerlich ("AI tools in society: Impacts on cognitive offloading and the future of critical thinking") found a negative correlation between frequent AI tool usage and critical thinking abilities, mediated by increased cognitive offloading.  Just as we have lost the ability to read an old-fashioned map because we use Google Maps instead, our brains have grown accustomed to letting AI tools analyze, evaluate, and synthesize information for us rather than making informed decisions on our own.  As the saying goes, "Use it or lose it!"  It's as if the brain were a muscle - the less we use it, the weaker it gets.

Similarly, a group of MIT researchers ("Your Brain on ChatGPT: Accumulation of cognitive debt when using an AI assistant for essay writing task") used brain mapping technology to show that individuals who use ChatGPT to write essays have lower brain activity!  The study divided 54 subjects between the ages of 18 and 39 years into three groups and asked them to write several essays using OpenAI's ChatGPT, Google's search engine, and their own intellect, respectively.  ChatGPT users had the lowest brain engagement and "consistently underperformed at neural, linguistic, and behavioral levels" compared to the other two groups.  Not surprisingly, over the course of the study, which lasted several months, ChatGPT users got lazier with each subsequent essay, often resorting to copy-and-paste by the end of the study.  These individuals had the lowest brain activity.  Now, it's important to realize that this was a small study that hasn't gone through peer review (in other words, it hasn't been published in a scientific journal).  Regardless, it will be important to see further research in this area.

Whether frequent cognitive offloading with AI technology will result in true changes in brain activity remains to be seen.  However, the evidence so far is fairly concerning.  A college physics professor named Rhett Allain put it best when he said, "AI is the elevator, thinking is taking the stairs."  If you use the elevator all the time, you aren't going to be in good enough shape to take the stairs ever again...

Friday, July 18, 2025

Fourteen wolves

I recently came across one of those social media posts that I thought was worth sharing (mostly because the story is actually true this time).  The post used the 1995 reintroduction of wolves to Yellowstone National Park to emphasize how we, as leaders, can fix broken systems and broken organizations.  Yellowstone was the world's first national park.  As an aside, contrary to popular belief, the law that created Yellowstone National Park was signed by President Ulysses S. Grant, not President Theodore Roosevelt!  Gray wolves were an important part of the Yellowstone ecosystem, though that was unfortunately not recognized until much, much later.

The state of Montana instituted a wolf bounty in 1884, in which trappers would receive one dollar (a lot of money at that time) per wolf killed.  Wolves were considered a menace to the herds of elk, deer, mountain sheep, and antelope, and over the next 25-50 years, there was a concerted effort to exterminate wolves in Yellowstone National Park and the surrounding area.  By the 1940's to 1950's, wolf sightings at Yellowstone were quite rare.  The efforts at extermination had been successful.

Unfortunately, once the wolves disappeared, conditions at Yellowstone National Park drastically changed - for the worse.  In the absence of a predator, the elk population exploded.  Overgrazing led to a dramatic die-off of grass and tree species such as aspen and cottonwood, as well as soil erosion.  The National Park Service responded by trying to limit the elk population with hunting, trapping, and other methods.  Over the next several years, the elk population plummeted.  Hunters began to complain to their representatives in Congress, and the park service stopped trying to control the elk population.

Once the elk population rebounded, the same overgrazing issues returned.  Other local animal populations were adversely impacted.  Coyote populations increased, which adversely affected the antelope population.  If this sounds a lot like my posts "For want of a nail..." and "Butterfly Wings and Stone Heads", there's a good reason.  The entire history of the Yellowstone gray wolf is a great example of complexity theory and complex adaptive systems.  I am also reminded of the famous "law of unintended consequences".

Fast forward to 1974, when the gray wolf was listed under the Endangered Species Act.  Gray wolves became a protected species, which subsequently led to attempts at re-introducing them into the wild.  A project to re-introduce the gray wolf to Yellowstone and the surrounding region was first proposed in 1991, and a more definitive plan was developed and made available for public comment in 1994.  In January 1995, fourteen wolves arrived from Canada in two shipments and were transferred to Yellowstone Park.  After a period of acclimation, the wolves were released into the wild.  Seventeen more gray wolves were brought to Yellowstone in January 1996.  The population of wolves in Yellowstone National Park recovered, and importantly, as of April 26, 2017, gray wolves were removed from the list of endangered species in Montana, Idaho, and Wyoming.

The most recent estimates suggest that the population of gray wolves at Yellowstone has increased to between 90 and 110 wolves in the park (with a total of about 500 wolves in the surrounding region).  Just as important, the local elk population has stabilized, and as a result, the native flora and fauna of Yellowstone National Park have returned.  The population of coyotes has fallen to "sustainable levels" with similar impact.  The story of the Yellowstone wolves is a remarkable one.

Aside from being yet another great example of complex adaptive systems, the wolf story is a great metaphor for organizational health.  As Olaf Boettger says in his LinkedIn post "What 14 wolves taught me about fixing broken systems...", "Everything connects to everything else as a system."  Just as important, "Sometimes the thing that's missing is simple."  Find the gray wolf in your organization to fix the entire ecosystem.

Wednesday, July 16, 2025

The Quiet Commute

My wife and I took the Red Line "L" train to go see a Chicago White Sox game this past weekend.  It took us almost an hour to get there, so we definitely had time to "people watch".  Both of us noticed two college athletes (they were wearing T-shirts with their college name and I could read their nametags on their backpacks) who were obviously together and going someplace fun.  Both individuals were wearing headphones, and both of them spent the entire duration of their ride staring intently at their smart phones.  I don't think they said one word to each other.

I've been using public transportation a lot lately for my work commute.  Just like our experience above, I've often noticed that most people stare down at their smart phones and rarely converse with their fellow commuters.  In full disclosure, I don't engage in conversation with my fellow commuters either.  I usually bring a book to read, and I often sit alone on the upper train level, because it is quiet and the single seats allow me to remain alone.

Now, based on a few of my more recent posts blaming everything that is wrong in our world on social media ("Liberation"), smart phones ("Are smart phones making us dumb?" ), or the Internet ("Why the past 10 years of American life have been uniquely stupid..."), you're probably thinking this is going to be another anti-technology rant!  Not so!  I am going to let you come to your own conclusions this time.  I just want to point out that this issue of self-imposed isolation isn't so new.

As it turns out, back in 1946, the American filmmaker and photographer Stanley Kubrick (Kubrick directed or produced such hits as Spartacus, Lolita, Dr. Strangelove, 2001: A Space Odyssey, A Clockwork Orange, The Shining, and Full Metal Jacket) was a staff photographer for Look magazine and set out to photograph New York City's subway commuters.  His photographs were later published in a pictorial series entitled "Life and Love on the New York City Subway".  As you can see in the photo below, times haven't really changed much in the last 79 years.  Instead of reading a magazine or newspaper, commuters now read their iPads and smart phones, listen to music, or work on their laptop computers.


I'm not going to say whether it's right or wrong that people spend most of their time looking at their smart phones instead of interacting.  I will let you be the judge of that, and I do believe that I've been very clear on my opinion in previous posts.  However, to say that our tendency to ignore what is going on around us is a new phenomenon or is even a generational difference is completely false.  If you wish to argue that smartphones have made these tendencies worse, then I completely agree!  The so-called "quiet commute" is not new, but it's definitely worse.

Monday, July 14, 2025

Personal Bookshelf

When we put up our house in Cincinnati for sale about five years or so ago, our real estate agent came through and "staged" our house for showing.  One of the most peculiar things that she did was to turn every book in our home office backwards, so that the spines (and titles) of the books didn't show.  We never really asked her why she did that, but as I recently learned (thank you, Google AI), the practice is fairly common and is mostly done for aesthetic reasons.  The practice creates a neutral, uniform, and minimalist look and feel (you don't see all the different colors of the books on the shelf).  It also prevents distraction and de-personalizes the owners, whose personal tastes and/or political views could turn off potential buyers.  Lastly (and perhaps least important), it avoids copyright issues if the agent wants to take photographs and post them online.

While I don't think that our bookshelf is particularly controversial (we own a lot of history books and presidential biographies), I have to admit that the books that my wife and I own reveal a lot about who we are and what we value.  I guess I have to agree with CNN contributor David G. Allan (who writes "The Wisdom Project" column online) and his article "Why shelfies not selfies are a better snapshot of who you are".  Like Allan, whenever I walk into someone's house (or even someone's office at work), I often catch myself looking at their bookshelf to see what kinds of books they've read.  Allan actually met his wife this way!  He says, "Seeing someone's books offers a glimpse of who they are and what they value."

I really enjoy looking over various "book lists" of recommended reading, ranging from the Rory Gilmore Reading Challenge (from the television show "The Gilmore Girls") to former President Barack Obama's Summer Reading List.  I have looked over the Chief of Naval Operations' Professional Reading List and Boston College's Father Deenan Reading List with great interest.  I have enjoyed the series of books by Admiral (retired) James Stavridis - The Leader's Bookshelf, The Sailor's Bookshelf, and The Admiral's Bookshelf.  Call me a bibliophile for sure.

David Allan writes, "You may not have a biography written about your life, but you have a personal bibliography.  And many of the books you read influence your thoughts and life...Books, and stories in particular, are probably the greatest source of wisdom after experience."  As the English writer and politician Joseph Addison once said, "Reading is to the mind what exercise is to the body."  In other words, your personal bookshelf (or, as David Allan calls it, your "shelfie") says a lot about who you are, because what you have read in your lifetime has a great deal of influence on who you are and what you value.

Allan goes on to say that for the past 20 years, he has kept a notebook filled with drawings of his own personal bookshelf that contains the books that he has read, even if he doesn't actually own the books.

He goes on to mention the artist Jane Mount, who started a company called The Ideal Bookshelf in 2008.  Mount writes, "I believe books make us better, allowing us to visit other people's lives and understand them.  And books connect us, to characters, to authors, and most importantly, to each other."  

What books would you place on your own personal, ideal bookshelf?

Saturday, July 12, 2025

Will we get replaced by AI?

Perhaps this flew below the radar, but back in 2017, AlphaZero, a computer program developed by the artificial intelligence (AI) company DeepMind (which was purchased by Google in 2014), taught itself how to play chess in just under 4 hours and then proceeded to defeat Stockfish, the world's (previously) best computer chess program.  In a mind-boggling 1,000-game match, AlphaZero won 155 games, lost 6 games, and played the remaining 839 games to a draw.  What's impressive about the feat is not that AlphaZero won 155 out of 1,000 games (which doesn't seem like an impressive win/loss percentage), but rather that the AI program taught itself how to play the game on its own (check out the video on how it all happened).  Former world chess champion and grandmaster Garry Kasparov, who famously played against IBM's computer chess program Deep Blue in the late 1990's (winning one match but losing the rematch), said, "It's a remarkable achievement...We have always assumed that chess required too much empirical knowledge for a machine to play so well from scratch, with no human knowledge added at all."

Just a few years ago, back in September 2023, an AI-controlled DARPA (Defense Advanced Research Projects Agency) fighter jet, the X-62 Variable In-Flight Simulator Test Aircraft (VISTA), defeated a human pilot flying an Air Force F-16 in a series of dogfights at Edwards Air Force Base, 5-0.  When I first read about AlphaZero and the X-62 VISTA in two books co-written by Henry Kissinger and Eric Schmidt (The Age of AI: And Our Human Future and Genesis: Artificial Intelligence, Hope, and the Human Spirit, which appeared on my 2025 Leadership Reverie Reading List), I guess I was surprised at just how far AI has come.

You may forgive my ignorance and naivete when I point out that I am old enough to remember a world before color television, cable TV, calculators, personal computers, and cell phones.  I will also admit that when it comes to technology, I am a bit of a laggard on the adoption curve.  Unfortunately, I no longer have the luxury to be a laggard.  Technology is advancing rapidly, and those who ignore the advances in AI, in particular, will be left behind.  As the saying goes (which was also the title of a Harvard Business Review article), AI may not replace all humans, but humans who use AI will replace humans who do not.  Or is that even true anymore?

The Wall Street Journal published an article on July 3, 2025 with the headline, "CEOs start saying the quiet part out loud: AI will wipe out jobs".  As Chip Cutter and Haley Zimmerman write, "CEOs are no longer dodging the question of whether AI takes jobs.  Now they are giving predictions of how deep those cuts could go."  Jim Farley, CEO of Ford Motor, said, "Artificial intelligence is going to replace literally half of all white-collar workers in the U.S."  Farley told author Walter Isaacson at the Aspen Ideas Festival that "AI will leave a lot of white-collar people behind."

Cutter and Zimmerman go on to write, "Corporate advisers say executives' views on AI are changing almost weekly as leaders gain a better sense of what the technology can do..."  There are still those who say that the fears of AI replacing so many jobs are overblown.  Pascal Deroches, chief financial officer at AT&T said, "It's hard to say unequivocally 'Oh, we're going to have less employees who are going to be more productive.'  We just don't know."

Forbes magazine also reported on Farley's comments ("CEO said you're replaceable: Prepare for the white-collar gig economy").  Steven Wolfe Pereira, who wrote the article for Forbes, emphasized that CEOs are no longer saying that AI will replace jobs and new jobs will emerge.  They are simply stating that AI will replace jobs.  Period.  He writes, "Here's what your CEO sees that you don't: A junior analyst costs $85,000 plus benefits, PTO, and office space.  A gig analyst with AI tools costs $500 per project, no strings attached.  One requires management, training, and retention effort.  The other delivers results and disappears."
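To put the Forbes comparison in rough numbers, here is a quick back-of-envelope calculation using only the two figures quoted above; it deliberately ignores benefits, PTO, office space, and management overhead, which the article argues make the full-time analyst even more expensive:

```python
# Back-of-envelope comparison of the figures quoted in the Forbes article.
# Benefits, PTO, office space, and management overhead are ignored here, so
# this understates the true cost of the full-time analyst.

junior_analyst_salary = 85_000   # dollars per year (base salary only)
gig_cost_per_project = 500       # dollars per project

# Number of gig projects per year at which gig spending equals the salary alone
break_even_projects = junior_analyst_salary / gig_cost_per_project
print(f"Break-even: {break_even_projects:.0f} projects per year")
# -> roughly 170 projects, or about 3 per week, before gig spending even
#    matches the base salary of a junior analyst
```

Whether that arithmetic holds up in practice is another question, but it does show why the comparison is so tempting to executives.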

Pereira goes on to write that the transformation is already here, citing statistics from McKinsey's American Opportunity Survey that suggest that 36% of respondents (equivalent to 58 million Americans) identify as independent workers.  The gig economy is growing three times as fast as the rest of the U.S. workforce, and AI will only accelerate this trend.  We are in what Pereira calls the first phase, when companies freeze hiring for any jobs that AI can at least partially do.  The second phase (the next 6 months) will occur when companies undergo mass restructuring and eliminate entire departments.  The third phase (the next 18 months) will complete the transformation to a full gig economy.  The fourth and final phase (the next 3 years) will occur when the surviving companies have 20% of their previous full-time head count and 500% more gig relationships.  At this point, companies will have transformed from a hierarchical organizational structure to a hub-and-spoke model, with the hub being the permanent workers and the spokes being the gig workers.

I know that AI will be one of the important drivers of cost reduction and improved efficiency for health care organizations.  Not a day goes by without AI becoming a topic of conversation in my organization.  Whether the job cuts will be as deep as some executives fear is an important question, and one to which I don't pretend to know the answer.  I don't necessarily agree with Stephen Hawking, who said, "The development of full artificial intelligence could spell the end of the human race."  Nor do I fully agree with Sundar Pichai, CEO of Google, who said, "AI is likely to be either the best or worst thing to happen to humanity."  Perhaps the answer is somewhere in the middle of the extremes.  Rest assured, I will be reading (and posting) on this topic in the future.

Thursday, July 10, 2025

Health care has an accountability problem...

Several years ago, two reports from the Institute of Medicine (To Err is Human and Crossing the Quality Chasm) ushered in the quality improvement and patient safety movement.  The first report, To Err is Human, was published in 1999 and summarized evidence from primarily two large studies, which provided the now commonly cited estimate that approximately 98,000 Americans died every year as the result of medical errors.  These two large studies, conducted in New York ("Incidence of adverse events and negligence in hospitalized patients. Results of the Harvard Medical Practice Study I") and Colorado and Utah ("Incidence and types of adverse events and negligent care in Utah and Colorado"), reported that adverse events occurred in 2.9 to 3.7 percent of hospitalizations.  Between 6.6 and 13.6 percent of these adverse events led to death, over half of which resulted from preventable medical errors.  When extrapolated to the over 33.6 million total admissions to U.S. hospitals occurring at the time of the study, these results suggested that at least 44,000 (based directly on the Colorado and Utah study) to as high as 98,000 Americans (based on the New York study) die each year due to preventable medical errors.
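As a rough illustration of how those percentages extrapolate to a national estimate, here is a back-of-envelope sketch.  The adverse-event and death rates are the ones quoted above; the 50% preventability fraction is my own simplifying assumption (the IOM report relied on each study's own preventability estimates, which is why its published range of 44,000 to 98,000 differs somewhat from this crude calculation):

```python
# Rough reconstruction of the IOM-style extrapolation described above.
# The adverse-event and fatality rates come from the two studies as quoted;
# the preventable fraction below is an illustrative assumption, not the
# studies' exact figure.

admissions = 33_600_000  # approximate annual U.S. hospital admissions at the time

def estimated_deaths(adverse_event_rate, fatal_fraction, preventable_fraction):
    """Estimated deaths per year attributable to preventable medical error."""
    return admissions * adverse_event_rate * fatal_fraction * preventable_fraction

# Utah/Colorado-style inputs (lower bound)
low = estimated_deaths(0.029, 0.066, 0.5)   # roughly 32,000 with a 50% assumption
# New York (Harvard Medical Practice Study)-style inputs (upper bound)
high = estimated_deaths(0.037, 0.136, 0.5)  # roughly 85,000 with the same assumption

print(f"~{low:,.0f} to ~{high:,.0f} preventable deaths per year")
```

The exact numbers depend heavily on the preventability fraction assumed, but the structure of the calculation is the same one that produced the famous 44,000 to 98,000 estimate.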

The lay press immediately latched on to these statistics, particularly after the late Lucian Leape (who died earlier this month), one of the authors of the Harvard Medical Practice Study and a leading voice for patient safety, suggested that the number of deaths from medical errors was equivalent to a 747 commercial airliner crashing every day for a year.  Dr. Leape's point was that we wouldn't tolerate that many accidents in aviation, so why would we tolerate that many in health care?

Importantly, neither study (see also "The nature of adverse events in hospitalized patients. Results of the Harvard Medical Practice Study II") included catheter-associated bloodstream infections (commonly known as central line infections), which are the most common hospital-acquired infections and arguably one of the most important preventable causes of death in the hospital setting.  In other words, the "44,000-98,000 deaths" estimate was likely understating the issue.

Unfortunately, despite all the attention to patient safety, progress has been slow.  Martin Makary and Michael Daniel analyzed more recent data (which estimated that preventable errors resulted in as many as 250,000 deaths per year) in a 2016 British Medical Journal article, calling medical errors the 3rd leading cause of death.  That's more like two jumbo jets crashing every day for a year and killing everyone on board!

The most recent studies (see the report from the Office of the Inspector General and a study published in the New England Journal of Medicine, "The Safety of Inpatient Health Care") suggest that as many as one in four hospitalized patients in the U.S. is harmed.  Peter Pronovost, one of the foremost authorities on patient safety, recently published a perspective piece in the American Journal of Medical Quality, "To Err is Human: Failing to Reduce Overall Harm is Inhumane".  Dr. Pronovost and his two co-authors cited a number of potential reasons why health care has not made significant progress on improving patient safety.  However, they then make a profound observation, "While other high-risk industries have faced many of these challenges, they have seen exponential reductions in harm.  The difference is they have accountability rather than excuses."  Boom!

Dr. Pronovost and his two co-authors once again compare (and more importantly, contrast) the health care industry to commercial aviation.  They suggest four potential solutions (and I suspect all four will be necessary):

1. Federal accountability for health care safety: Whereas the U.S. Secretary of Transportation has clear accountability for aviation safety, it's less clear who is responsible at the federal level for patient safety.  Apparently, neither the Secretary of Health and Human Services nor any other agency head has clear accountability for patient safety.  That probably needs to change.

2. Timely, transparent reporting of top causes of harm: The most common causes of harm reported in the OIG report above were medication errors and surgery, accounting for nearly 70% of all harm.  Unfortunately, neither type of harm is routinely measured or publicly reported.  We need better metrics for the most common types of harm, and they need to be reported more broadly.

3. Sector-wide collaboration for harm analysis and safety correction: Commercial aviation routinely reports major causes of harm, and the industry as a whole works together to eliminate or reduce them.  By comparison, with only a few major exceptions (see the children's hospitals' Solutions for Patient Safety network), hospitals remain reluctant to share their data either publicly or with other hospitals.  Dr. Pronovost writes that "instead, every hospital, often every floor within a hospital, implements efforts with the most common intervention being the re-education of staff."  That's been my experience - I can't tell you how many times I've encountered different safety bundles on different floors of the same hospital that are purportedly addressing the same problem.

4. Establish a robust shared accountability system: Here, Dr. Pronovost and colleagues suggest that accreditation agencies such as the Centers for Medicare and Medicaid Services (CMS) and the Joint Commission, among others (and including, ultimately, oversight by the Secretary of Health and Human Services as alluded to above) should bear the responsibility to hold hospitals accountable for safety performance.

We have a lot of work to do.  What's clear is that any improvements that have been made since To Err is Human are small and incremental.  We need to do better.  Our patients deserve more.  It's time that we as an entire industry work together collaboratively with each other and with important stakeholders and partners such as the federal government, accreditation agencies, and insurers, to address this national problem once and for all.

Tuesday, July 8, 2025

"What if this isn't the storm?"

Cheryl Strauss Einhorn recently wrote an online article for the Harvard Business Review, entitled "In uncertain times, ask these questions before you make a decision".  The article recommends that leadership teams change their approach to decision-making during times of uncertainty.  There is no question that we are living in a time of great uncertainty, and leaders (and their organizations) have to learn to rapidly pivot.  

The one question that Einhorn recommended that resonated with me the most was "What if this isn't the storm - what if it's the climate?"  In other words, what if what leaders and organizations are experiencing today isn't some blip on the proverbial radar screen?  What if we are experiencing the new normal for the future?

It's a humbling (and daunting) question.  But as Einhorn suggests, this shift in thinking is "more than semantic - it's strategic."  If we as leaders believe that what we are experiencing is just temporary, we likely won't ask ourselves the hard questions.  We won't push ourselves or our organizations to change.  Instead, as Einhorn writes, we will "delay, defer, or design for an imagined return to stability..."  

I don't know what the future holds.  However, what I can be sure of is that the volatility, uncertainty, complexity, and ambiguity (VUCA) that confronts us today likely won't subside anytime soon.  I think that most worldwide transformations start with a period of turbulence and chaos.  Today's world is turbulent and chaotic, and with that in mind, it seems more appropriate to say that this is a climatic change as opposed to a temporary storm.  The leaders who will be most successful in the future are the ones who will be able to adapt to organizational life in this new climate.

Sunday, July 6, 2025

"Putting leadership on the map"

Our CEO recently forwarded a blog post written by Gary Burnison, CEO of Korn Ferry, entitled "Putting leadership on the map" to our leadership team.  It was both thought-provoking and interesting.  Burnison listed a number of leadership fundamentals that I thought were worth sharing here:

1. Leadership is inspiring others to believe and enabling that belief to become reality.  Burnison believes that this is our primary role as leaders, which is consistent with most of the descriptions of leadership that I've read.

2. It's not about you, but it starts with you.  Burnison writes, "Humility and self-awareness go hand in hand.  If you don't improve yourself, you'll never improve an organization."  In other words, improvement starts with us!

3. Coaches don't win games, players do.  I mostly agree with Burnison here.  Leaders need to surround themselves with talented individuals for sure, but I do think that coaching still matters.

4. Establish the left and right guardrails.  I love this point!  Burnison writes, "Leaders define the mission and values of the organization - then others take it from there."  In other words, leaders should provide the rules of engagement and then get out of the way!  That sounds a lot like the HRO principle of "deference to expertise" to me.

5.  Listen to what you don't want to hear.  I know that I can be a better listener at times.  Active listening is such an important skill for leaders.  Burnison says that "the difference between hearing and listening is comprehending."  While that is certainly true, I think listening to what you don't want to hear means something different, at least to me.  I think what Burnison is saying is that as leaders, we should be open and willing to hear negative feedback and see it as an important growth opportunity.  

6. Learn - always.  As leaders, we should never stop learning.  We should strive towards perfection, even if we may never achieve it.  Burnison writes, "Knowledge is what we know; wisdom is acknowledging what we don't know.  Learning is the bridge between the two."

7.  Communicate to constantly connect with others.  I've been in leadership long enough to recognize that communication is key.  Even if you think that you are overcommunicating, you're probably not communicating enough.

Burnison finishes his post by adding one more important characteristic for leadership - vulnerability.  He talks about the fact that as leaders, we need to find a balance between self-confidence and vulnerability.  As our CEO often says, leaders need to be "hungry, humble, and smart".

Friday, July 4, 2025

Happy Fourth of July!

It's Independence Day in the United States of America!  I haven't posted on the Fourth of July for a couple of years now.  I'm not really sure why, so today I wanted to revisit my post from July 4, 2018.  I think the words are just as relevant today as they were seven years ago.  I am sharing them almost verbatim, with a few modifications and updates.

Today is the day we celebrate the founding of our great country.  Independence Day has always been one of my favorite holidays.  Over the years, our family has celebrated the Fourth of July in a number of ways - watching parades in places such as Coronado Island (California), Jacksonville (North Carolina), Cincinnati (Ohio), and most recently, Highland Park (Illinois).  We have watched fireworks while lying down on a beach in Guam, from the hood of our car in downtown Indianapolis, and while sitting in a park in downtown Loveland, Ohio.  We have had cook-outs, family reunions, family baseball games, and water balloon fights.  On our first Independence Day in Highland Park, we watched and listened to the local news with sadness and fear after learning that one man with a gun shot and killed seven innocent victims and wounded dozens more during our new home's annual Fourth of July parade.  Regardless of where we have been, what we have done, or how we have celebrated the many Fourths over the years (even the one on July 4, 2022), one thing has stayed consistent - our love for this country.

We have challenges in America today.  There are those who would say that America is going through one of the most difficult periods in all of our history.  There are those who say that we are no longer great.  As I shared in a post a month or so ago, there's good evidence that America has some long-standing challenges that need to be addressed.  There are those who claim that America's best days are behind us and not ahead of us.  Many of our citizens have been embarrassed or downright shamed by things that our current leaders have done or have said (or have posted on social media).

To all of us who wish for more stability and hope for better days ahead, I would say one thing.  Our country - our nation - is so much more than our leaders.  We, all of us, are America.  And if we hold together, if we stay true to the ideals of our founders and the patriots of the past, we will continue to be America.  Perhaps that is why the Fourth remains one of my favorite holidays.  The Fourth of July is symbolic of these ideals.  Justice.  Duty.  Selflessness. Honor.  We are America because together, we choose to be something better and greater than we can be alone.  We are America because together, we choose to be united in these ideals. 

I love this country.  I am still proud to be an American.  I still believe that our best days lie in front of us, not in back of us.  Today, I ask God to bless each and every one of us, as Americans.  Happy Fourth of July!

Thursday, July 3, 2025

"Who knows what's good or bad?"

While I was writing my last post ("Benjamin Franklin's 13 necessary virtues..."), I came across an online article and TEDx talk by the CNN contributor David G. Allan, who writes "The Wisdom Project", what he calls "a thinking person's life hacking column in which we examine behavior modification, self-help, found wisdom, and applied philosophy."  The online article ("Good and bad, it's the same: A Taoist parable to live by") caught my attention, which then led me to Allan's 2023 TEDx talk, "Who knows what's good or bad".

Allan started his TEDx talk by stating, "For 200,000 years humans have been accumulating wisdom.  It's even in our name: homo sapiens sapiens.  The word sapiens comes from the Latin sapient meaning to be wise."  Allan then goes on to say that we accumulate wisdom primarily via experience, i.e., the good and bad things that happen during our lives.  He emphasizes that we often learn from the experiences of others through storytelling.  As former First Lady Eleanor Roosevelt once said, "Learn from the mistakes of others.  You can't live long enough to make them all yourself."

Allan next proceeds by telling a story about a farmer who lost his horse.  The story is more than 2,000 years old and comes from the Taoist tradition.  The story goes something like this (there are several different versions):

Good luck and bad luck create each other and it is difficult to foresee their change.
A righteous man - a farmer - lived near the border.
For no reason, his horse ran off into barbarian territory.
Everyone felt sorry for him.  His neighbor apologized and said, "I'm so sorry about your horse."
The farmer replied, "Who knows if that's good or bad?"

Several months later, the farmer's horse returned with 12 barbarian horses.
Everyone congratulated him.  His neighbor came back to celebrate, telling the farmer, "Congratulations on your great fortune!"
Once again, the farmer replied, "Who knows if that's good or bad?"

Now his house was rich in horses, and the farmer's son loved riding them.
He fell and broke his leg.
Everyone felt sorry again for the farmer.
His neighbor said, "I'm so sorry about your son!"
To which the farmer replied, once again, "Who knows if that's good or bad?"

A little while later, the barbarians invaded the farmer's country, looking for their lost horses.
The army came to the farmer's village to conscript all able-bodied men to go and fight in the coming battle.
All the able-bodied men strung up their bows and went into battle.
Nine out of ten border residents were killed,
but the son was spared because of his broken leg.
The farmer and the son both survived.  

The moral of the story: Bad luck brings good luck and good luck brings bad luck.  This happens without end and nobody can estimate it.

It's difficult to label good experiences versus bad ones.  It's probably a false dichotomy.  The Taoists have a way of symbolizing the farmer's "Who knows if that's good or bad?"  It's commonly known as the yin and yang:

Here, the black area represents yin, while the white area represents yang.  The small dot within each represents the presence of one within the other.  In other words, there is no clear distinction between the two.  They are complementary, interconnected, and interdependent.  In fact, they give rise to each other.  Indeed, it is impossible to talk about one without mentioning the other.  They are two parts of a greater whole.

The same is true of good experiences and bad experiences.  Allan says, "Good can come from bad, and bad can come from good.  Once you move past good and bad, you become less concerned about the outcome and more accepting to how things evolve naturally."  

The yin and yang is all about balance.  We should seek to achieve balance in our lives.  And one of the best ways that we can do that is to focus on what we can directly control in our own lives.  I am reminded again of the Stoic philosopher Epictetus, who said, "There is only one way to happiness and that is to cease worrying about things which are beyond the power of our will."  He went on to also say, "Happiness and freedom begin with a clear understanding of one principle.  Some things are within your control.  And some things are not."  "Who indeed knows what's good or bad?"  Things tend to work themselves out in the end.  Balance.

Tuesday, July 1, 2025

Benjamin Franklin's thirteen necessary virtues...

According to The Autobiography of Benjamin Franklin, when he was 20 years of age, Benjamin Franklin set out to make himself morally perfect.  Franklin had received a classical education and had studied the ancient Greek and Roman philosophers and their concept of the so-called ideal man.  His journey to moral perfection started, as most such journeys do, with self-reflection on his own behavior.  As the Dalai Lama said, "To be aware of a single shortcoming within oneself is more useful than to be aware of a thousand in someone else."  Franklin soon found that he often fell short of the ideal man - he ate and drank too much, he spent more money than he should have, and he talked too much (especially about himself) while not listening enough.

Next, he listed and defined thirteen virtues that he felt were desirable and necessary for his pursuit of moral perfection.  He originally started with twelve virtues, but he expanded the list to thirteen when a friend suggested that he add "humility".  Thirteen virtues fit nicely (not because there were thirteen original American colonies - that would be too ironic) into a calendar, which suited his methods.  Multiply 13 by 4 and you get 52, and there are 52 weeks in a year.  Franklin would work on each virtue for a week before moving on to the next virtue on the list the following week, and so on for a period of 13 weeks.  He would track his progress on a chart and share it with colleagues (see the figure below).  At the end of a 13-week period, he would go back to the start of the list and repeat.

Franklin wrote, "My intention being to acquire the habitude of all these virtues, I judg’d it would be well not to distract my attention by attempting the whole at once, but to fix it on one of them at a time; and, when I should be master of that, then to proceed to another, and so on, till I should have gone thro’ the thirteen; and, as the previous acquisition of some might facilitate the acquisition of certain others, I arrang’d them with that view..."

Here are Franklin's 13 virtues that he felt necessary for moral perfection, as he defined them in The Autobiography of Benjamin Franklin:

1.  Temperance: Eat not to dullness; drink not to elevation.

2.  Silence: Speak not but what may benefit others or yourself; avoid trifling conversation.

3.  Order: Let all your things have their places; let each part of your business have its time.

4.  Resolution: Resolve to perform what you ought; perform without fail what you resolve.

5.  Frugality: Make no expense but to do good to others or yourself; i.e., waste nothing.

6.  Industry: Lose no time; be always employ’d in something useful; cut off all unnecessary actions.

7.  Sincerity: Use no hurtful deceit; think innocently and justly, and, if you speak, speak accordingly.

8.  Justice: Wrong none by doing injuries or omitting the benefits that are your duty.

9.  Moderation: Avoid extremes; forbear resenting injuries so much as you think they deserve.

10. Cleanliness: Tolerate no uncleanliness in body, cloaths, or habitation.

11. Tranquility: Be not disturbed at trifles, or at accidents common or unavoidable.

12. Chastity: Rarely use venery but for health or offspring, never to dulness, weakness, or the injury of your own or another’s peace or reputation.

13. Humility: Imitate Jesus and Socrates.

Franklin admitted that his pursuit of moral perfection was a lifelong journey.  He wrote, "...on the whole, tho’ I never arrived at the perfection I had been so ambitious of obtaining, but fell far short of it, yet I was, by the endeavor, a better and a happier man than I otherwise should have been if I had not attempted it; as those who aim at perfect writing by imitating the engraved copies, tho’ they never reach the wished-for excellence of those copies, their hand is mended by the endeavor, and tolerable, while it continues fair and legible."

David G. Allan wrote an online article for CNN.com a few years ago ("Benjamin Franklin's '13 virtues' path to personal perfection"), describing his own personal journey to moral perfection by following Franklin's "life hack" (see also several other articles by Allan on "The Wisdom Project").  Importantly, while Allan started with Franklin's list of 13 virtues, he further refined and modified the list over time in order to meet his own needs.  Like Franklin, Allan too found that the journey was never-ending.  Allan has continued to chart his progress on his list of ideal virtues (Morality, Industry, Friendliness, Erudition, Frugality, Flexibility, Civic Duty, Introspection, Patience, Spirituality, Creativity, Mindfulness and Healthfulness) for over a decade.

Franklin's list of virtues (and Allan's as well, for that matter) is important for leaders too.  His method of self-reflection followed by a quest towards personal improvement is one that we all could easily adopt.  What virtues would you add to or subtract from the list?  I encourage you to try Franklin's method on your own journey of self-discovery and self-improvement.

Sunday, June 29, 2025

Just-in-time training

I came across a very interesting study published in December, 2024 in the British Medical Journal ("Coaching inexperienced clinicians before a high stakes medical procedure: Randomised clinical trial").  The study was conducted in the operating rooms at Boston Children's Hospital by investigators in the Departments of Pediatrics and Anesthesia at Harvard Medical School and involved what is commonly referred to as "just-in-time training" (JIT).  Specifically, investigators wanted to test whether JIT would increase the first-attempt success rate of intraoperative tracheal intubation.  

Tracheal intubation involves placing a breathing tube into the trachea, and while it is one of the most commonly performed procedures in the operating room, it does require a certain degree of knowledge and technical skill.  Tracheal intubation can be a potentially life-saving procedure for patients who are having difficulty breathing.  In the operating room, however, tracheal intubation is performed to protect the airway (the normal protective airway reflexes are suppressed under general anesthesia) and to maintain breathing during the administration of general anesthesia.  If not performed correctly, attempts at tracheal intubation can result in cardiac arrest.

The team of investigators in this study recognized that, like physicians and nurses, professionals in many fields outside of health care (e.g., musicians, athletes, and pilots) receive many hours of highly structured coaching and practice in order to develop expertise.  Unlike physicians and nurses, however, the individuals in these professions almost universally rehearse those skills right before a performance or sporting event - in essence, "just-in-time training".  Given that tracheal intubation is a technical procedure that requires practice and expertise, why wouldn't physicians (tracheal intubation is usually performed by physicians or by specialized nurses called nurse anesthetists) rehearse or practice right before being called upon to perform the procedure?

Just over 150 physicians in training participated in the study.  All study participants performed intraoperative tracheal intubation in infants less than 12 months of age.  They were divided into two different groups.  The first group received "just-in-time training" via a coaching session and simulated tracheal intubation (on a manikin) prior to stepping into the operating room.  The second group did not receive "just-in-time training" and instead followed routine practice, which included unstructured coaching in the operating room while being supervised by an attending anesthesiologist.

Overall, the physicians performed over 500 intubations on children less than 12 months of age during the study period, which lasted approximately 2 years.  The first-attempt success rate for tracheal intubation was significantly higher in the group that received "just-in-time training" (91.4% versus 81.6%), regardless of the physicians' previous level of training.  Other outcomes favored the "just-in-time training" group as well, including a shorter time to intubation, fewer tube advancement maneuvers, and fewer technical difficulties.  
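To put those two success rates in perspective, here is a quick back-of-envelope calculation of my own (the percentages come from the study, but the arithmetic below is mine, not the authors'):

```python
# Back-of-envelope arithmetic using only the first-attempt success rates reported above.
jit_success = 0.914      # first-attempt success rate with "just-in-time training"
routine_success = 0.816  # first-attempt success rate with routine practice

risk_difference = jit_success - routine_success      # absolute improvement (~0.098)
intubations_per_extra_success = 1 / risk_difference  # intubations per one additional first-attempt success

print(f"Absolute improvement: {risk_difference:.1%}")
print(f"About 1 additional first-attempt success for every {intubations_per_extra_success:.0f} intubations")
```

In other words, a roughly 10 percentage-point absolute improvement works out to about one additional first-attempt success for every ten intubations - a meaningful difference for a procedure performed this often.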

The key to "just-in-time training" is shortening the interval between training for a procedure and actually performing it.  This elegant study shows that "just-in-time training" can significantly increase the success rate of one of the most commonly performed procedures in the operating room.  Based on these results, one is tempted to ask whether we should be using "just-in-time training" more widely in medicine.  I think so.

Friday, June 27, 2025

"Beyond a reasonable doubt..."

Just call me Juror Number 67!  I was recently "invited" to serve as a juror at our local judicial district circuit court.  The minimum age to serve as a juror is 18 years, so surprisingly this was only the third time in my 40 years or so of eligibility that I've had the opportunity to be a juror.  I was excused the first time because I was serving overseas on active duty in the United States Navy.  I actually had to show up at the courthouse the second time, but there was only one jury case that was scheduled for that week and the defendant ended up changing to a guilty plea (to a lesser charge) right before they started jury selection.  I guess the third time is a charm, because on this most recent opportunity I actually ended up serving on a jury for a criminal case.

Overall, my experience serving on a jury was very interesting, and I would like to share some of my observations and perspectives here today.  First, serving on a jury is one of the core duties and responsibilities we have as citizens of the United States.  Citizenship is something that many of us unfortunately take for granted.  Being a citizen is not so much a right as it is a responsibility, something that we have to work towards and earn.  Our judicial system depends upon having citizens who are willing to serve on a jury when asked to do so.  Unfortunately, I learned from my experience that some of my fellow citizens view serving on a jury as optional (similar to voting, I guess).  There were more than a few prospective jurors who were willing to say almost anything to avoid having to serve.  The judge at our trial made every attempt to convince them otherwise, but in the end they persisted and were excused.  In my opinion, if you are not willing to vote, serve on a jury, or fulfill any of the other duties and responsibilities of citizenship, then you forfeit the right to complain about our government!

Second, it's clear that some of my fellow U.S. citizens lack a fundamental understanding of how the judicial system is supposed to work.  I wonder if this stems from an even greater misunderstanding of how our government is supposed to function.  When I was in high school, all seniors had to take a full-semester class on U.S. government in order to graduate.  I don't think that is the case anymore, at least in certain states.  It's interesting to me that individuals who want to become U.S. citizens through naturalization have to pass a proficiency test on U.S. history and government.  Unfortunately, a recent survey found that just 1 in 3 Americans who were granted citizenship at birth would be able to pass this test!  Again, just my opinion here, but if we are mandating that naturalized citizens demonstrate a minimum proficiency on matters that are deemed important to citizenship, shouldn't we expect the same of natural-born citizens?  

Third, contrary to popular belief, there are some really, really good public defenders out there!  I don't think it ultimately affected the outcome of our case, but the defendant had an excellent attorney (who happened to be a public defender) who was clearly better than the prosecuting attorney.  After the trial was over, the judge met with all of us in the jury deliberation room to personally thank us (a nice gesture), answer questions, and listen to feedback.  He also said that he wanted to dispel the notion that public defenders aren't good attorneys; on the contrary, there are some really great ones out there!  I couldn't agree more.

All in all, serving on a jury was interesting, highly educational, and surprisingly enjoyable.  It is an important duty and responsibility that we all share as U.S. citizens.  If you are ever called to serve, please answer the call!

Wednesday, June 25, 2025

The next greatest generation...

I wanted to follow up on a post from earlier this year, "The first step is to clearly state the problem...", which I started with a video clip from the premiere episode of the HBO television series The Newsroom, which aired on June 24, 2012.  The scene begins when a fictional television news anchor named Will McAvoy (played by the actor Jeff Daniels) is asked the question, "Can you say why America is the greatest country in the world?"  McAvoy replies, "America is not the greatest country in the world."  When asked to elaborate, he launches into a diatribe about all of the statistics that prove that America is not the greatest country in the world, many of which I reviewed in my post.  

I failed to mention that in the scene, McAvoy was speaking to a group of college students at Northwestern University, and the individual who originally asked him the question was a student there.  McAvoy unfortunately was mean to the student, calling her a "sorority girl" and saying that she was clueless and a member of the "Worst. Period. Generation. Period. Ever. Period."  

At some point in the second season of The Newsroom, the college student, named Jenna Johnson, applies for an internship, which sets McAvoy off once again.  Here's a video clip of the scene, appropriately entitled "Sorority Girl No More".  When McAvoy asks Jenna why she is applying for an internship, she tells him that she read an article about him that talked about the "greater fool".  She tells McAvoy, "I want to be one."

Remember that I have never actually watched this television show, so I didn't understand the context or the reference to the "greater fool".  The "greater fool" is actually an economic theory suggesting that one can sometimes make money by speculating on overvalued assets - items with a purchase price drastically exceeding their intrinsic value - if those assets can later be resold at an even higher price.  The hope is that if you are foolish enough to purchase the overvalued asset, you can find an even "greater fool" to sell it to at a higher price, making yourself a nice profit.

The tenth and final episode of the first season is called "The Greater Fool", which is also the title of a cover story written about McAvoy in a fictional edition of New York Magazine (the article the character Jenna claims to have read in the scene from season 2 above).  In this episode, McAvoy's fellow news anchor, Sloan Sabbith, tries to console him about the article, explaining, "The greater fool is actually an economic term: it’s a patsy...For the rest of us to profit, we need a greater fool, someone who will buy long and sell short. Most people spend their lives trying not to be the greater fool: we toss him the hot potato, we dive for his seat when the music stops. The greater fool is someone with the perfect blend of self-delusion and ego to think that he can succeed where others have failed. This whole country [the United States] was made by greater fools."

I have to mention one last pop culture reference to fully set up the scene, "Sorority Girl No More".  After Jenna says that she wants to be a "greater fool", McAvoy points to his colleague and says, "Camelot, she's the kid at the end of Camelot."  Here, he's referring to the musical Camelot, about King Arthur and the Knights of the Round Table.  At the end of the musical, the Round Table has disbanded and the kingdom is falling apart.  Arthur becomes disillusioned - all of his lofty ideals for a Golden Age of moralistic men have come to ruin.  When all seems lost, Arthur encounters a boy named Tom of Warwick, who has come to join the Round Table.  Tom declares his fealty to King Arthur and all of the ideals that he once stood for.  Once again inspired, Arthur tells Tom to run and tell everyone about Camelot.  His hope that Camelot will live on in the hearts and lives of all is restored.  In other words, McAvoy's hope for America is once again revived.

McAvoy asks Jenna to ask him the question ("What makes America the greatest country in the world?").  When she reluctantly asks him, McAvoy answers, "You do."  He then proceeds to hire her on the spot.  

There's a lot of talk and concern about what we are leaving for future generations in this country.  There's just as much talk and perhaps even greater concern about whether the next generation will be able to respond.  From what I've seen, we are in good hands.  I was once again reminded of this fact when I saw this scene.  America may not be the greatest country in the world, at least based on all of the statistics.  But what makes us great is the "can do" attitude of the next generation.  

I want to finish today's post with a passage from Sea Stories: My Life in Special Operations by Admiral (retired) William McRaven.  Admiral McRaven describes a time when he visited the wounded soldiers and sailors who were in a military hospital in Germany.  He talks about how these soldiers and sailors, some of whom had lost eyes, arms, or legs, always asked when they would be able to go back to their units.  They never once complained about their situation, and they always told Admiral McRaven that they would be "just fine."  In response, Admiral McRaven wrote the following passage in his memoir:

If a nation is to survive and thrive it must pass on the ideals that made it great and imbue in its citizens an indomitable spirit, a will to continue on regardless of how difficult the path, how long the journey, or how uncertain the outcome.  People must have a true belief that tomorrow will be a better day - if only they fight for it and never give up.  I saw this indomitable spirit in my parents and those who lived through the Great Depression and World War II - and I saw it again in the soldiers, sailors, airmen, and Marines whom I served with in Iraq and Afghanistan.  And later when I was the chancellor of the University of Texas system, I saw it in equal amounts in the young students who sat in school-houses across Texas.  

From the battlefields to the classrooms, I have seen the young men and women of this generation, the oft-maligned millennials.  They are supposed to be pampered, entitled, and soft.  I found them anything but.  They are as courageous, heroic, and patriotic as their parents and grandparents before them.  Those who fought and died or were wounded in Iraq and Afghanistan are the same young Americans who are building our bridges, finding the cures, and teaching our youth.  They are the men and women who are volunteering to wear the uniform, fight the fires, and protect the people.  They are not like my generation.  They are better.  They are more inclusive.  They don't see color, or ethnicity, or orientation.  They value people for their friendship and their talents.  They are more engaged.  They will not stand by and watch bad things happen to good people.  They are more questioning.  They want to know why.  Why are we going to war, why are we increasing our debt, why can't we do something new and different?  They are risk takers, entrepreneurs, givers of their time and energy.  Above all, they are optimists - and as challenging as the times may seem right now, this generation believes that tomorrow will be a better day.  

I am convinced that history will someday record that these young Americans were the greatest generation of this century, and I know, beyond the shadow of a doubt, that we will all be just fine.

To the next greatest generation: when I am asked, "What makes America the greatest country in the world?" I answer, "You do."