Wednesday, July 31, 2019

Word choice matters

My last post talked about how failure to address our basic physiological and psychological needs can lead us (all of us) to less than ideal behavior.  At times, this failure can lead us to do things that we normally wouldn't do.  I used a few examples and provided a few studies (there are many more) that support this assertion (see an additional one that even has a name - "the Valjean Effect," named after the main character in Victor Hugo's powerful story, Les Misérables).  I cited a recent study showing that residents who were experiencing burnout were more likely to demonstrate signs of implicit bias.  I probably need to define "implicit bias" in more detail, so here I go.


I talked about this topic a few times in previous posts (see, for example, "Do we need a 'National Women Physician's Day' - one year later" and "A life of privilege - part III"), but one of the most important topics in medicine today is the issue of implicit bias, defined by the Ohio State University's Kirwan Institute for the Study of Race and Ethnicity as the attitudes or stereotypes that affect our understanding, actions, and decisions in an unconscious manner.  The definition goes on - these biases are activated involuntarily and without an individual's awareness or intentional control.  The term "implicit bias" is generally used in the context of gender, race, ethnicity, and sexual orientation. 


One of the best examples of implicit bias with regard to gender issues in medicine comes from a study published in the journal Critical Care Medicine ("Influence of Gender on the Performance of Cardiopulmonary Rescue Teams: A Randomized, Prospective Simulator Study").  In this case, I would argue that the study's conclusions were an example of explicit bias, but you can judge for yourself:


We found important gender differences, with female rescuers showing inferior cardiopulmonary resuscitation performance, which can partially be explained by fewer unsolicited cardiopulmonary resuscitation measures and inferior female leadership. Future education of rescuers should take gender differences into account.


The study has generated a number of editorials and at least one follow-up study (with, I would say, a much, much better design) ("Female Physician Leadership During Cardiopulmonary Resuscitation Is Associated With Improved Patient Outcomes") with the exact opposite conclusion:


In contrast to data derived from a simulated setting with medical students, real life female physician leadership of cardiopulmonary resuscitation is not associated with inferior outcomes. Appropriately trained physicians can lead high-quality cardiopulmonary resuscitation irrespective of gender.


However, as well-intentioned as the authors of the second study were in showing that female physicians can lead resuscitation teams and do a great job, the conclusion as written above (copied verbatim from the abstract) is a great example of implicit bias (credit some of my colleagues in pediatric critical care for pointing this out in a recent editorial).  Take a second look at the conclusion ("female physician leadership...is not associated with inferior outcomes") and contrast it with the conclusion from the first study ("...with female rescuers showing inferior cardiopulmonary resuscitation performance, which can partially be explained by...inferior female leadership").  Can you see the difference? 


The first study emphasizes that outcomes from cardiopulmonary resuscitation were worse when females led the team, and it attributes this to females not being as good at leading the team.  Conversely, the second study suggests that females might be just as good as males when leading resuscitation teams - at least they are not inferior.  Some experts would justify this wording on the grounds that this is how study results have to be interpreted with respect to study design and principles of evidence-based medicine.  I disagree - there is a clear difference in word choice, and implicit bias is clearly playing a major role here.


Diversity and inclusion are important - I would even say the most important - issues for leaders to address today, both inside and outside health care.  Diversity and inclusion don't happen on their own - they take significant (and dare I say, explicit) effort.  However, study after study after study shows that diversity and inclusion pay significant dividends in the end.  Teams with inclusive leaders function much better than those without (see a great article in the Harvard Business Review here - credit my daughter for sending me this one).  So, how do you become an inclusive leader?  The authors of the aforementioned Harvard Business Review article conducted a survey of over 4,100 employees and asked them specifically about the behaviors and traits that separate inclusive leaders from those who are not.  Here are the six behaviors that inclusive leaders consistently demonstrated:


1. Visible commitment to diversity and inclusion (These leaders were authentic about their commitment to diversity and inclusion and clearly made it a major priority).


2. Humility (There is nothing wrong with being confident, but it is also important for leaders to recognize that they don't know everything.  Part of being an inclusive leader means being able to admit your mistakes, recognize that everyone has something to contribute, and learn from different perspectives).


3. Awareness of bias (This is a big one!  Inclusive leaders are aware of implicit biases and take steps to eliminate them.  The important link between addressing basic needs - there's Maslow's Hierarchy again - and implicit bias is clear here).


4. Curiosity about others (Inclusive leaders seek out opportunities to learn about and grow from different cultures, different perspectives, and different ways of approaching life in general).


5. Cultural intelligence (A natural consequence of the curiosity about others described above).


6. Effective collaboration (Inclusive leaders empower others, delegate often, and provide opportunities for professional growth and development).


Implicit bias is so prevalent in our world today.  It's just as important to address implicit bias as it is to address explicit bias.  Word choice matters.  Inclusive leadership requires it.

Sunday, July 28, 2019

The Curious Case of Richard Parker

I have written more than a few blog posts on topics such as "impulse control" and "ego depletion" - remember, for example, Roy Baumeister's radish experiments ("What do I know of man's destiny? I could tell you more about radishes."), the Stanford marshmallow experiments ("Life is a marshmallow - easy to chew, but hard to swallow."), or the study showing that married couples are more likely to fight when they are hungry ("You're not you when you're hungry")?  How about the one about me, one of my personal favorites ("Dad is hangry again!")?  There's a consistent theme here - perhaps a review of Maslow's Hierarchy of Needs is in order (or just look at my previous posts, "What can we learn about leadership from a movie?" or "What do you really want?").  The bottom line, "take-home" message of all of these posts, especially with Maslow's Hierarchy of Needs in mind, is that we, as individuals, need to make sure that we are meeting the basic physiological needs (food, water, shelter) and security needs (safety) before trying to fulfill the higher-level needs of a sense of belonging or self-fulfillment.  More importantly, these studies tell us that if we fail to meet our basic physiological needs, then we are much more likely to do things that we wouldn't normally do (e.g., eat that marshmallow, argue with our spouse, get angry with our kids, or take shortcuts at work).  In some cases, as I most recently discussed in "Be honest, are we honest?", failing to adequately satisfy our basic physiological needs at the bottom of Maslow's Hierarchy leads us to be dishonest or even unethical.

Which brings us to the case of Richard Parker.  Here is a case that is so coincidental that it's almost hard to believe.  It really is kind of freaky.  The American writer Edgar Allan Poe, in his novel The Narrative of Arthur Gordon Pym of Nantucket, wrote about a fictional crew of sailors on a ship called the Grampus.  After a shipwreck, the sailors find themselves lost at sea with limited supplies of food and water.  They catch a tortoise and eat it, but eventually that meat runs out too.  They soon draw lots to determine which one of them will be sacrificed to provide meat for everyone else (there are several well-known examples of cannibalism over the course of history, so this is certainly an unusual story but not at all the freaky part I am referring to).  The sailor Richard Parker draws the short straw and is promptly stabbed to death, and the rest of the crew survives long enough to be saved.

Here's the freaky part.  Poe wrote his novel in 1838.  Less than 50 years later, in 1884, a British yacht named the Mignonette sinks in a storm in the South Atlantic, en route from England to Australia.  The four-man crew barely escapes to a lifeboat, only to find that they have no supplies of food or fresh water.  Guess what happens next?  They catch a turtle and survive for a few days on turtle meat.  But after several more days, they find themselves starving to death.  The youngest, a cabin boy named - you guessed it - Richard Parker (WOW!), becomes ill after drinking sea water.  The other three sailors propose to draw lots to see who will be sacrificed to save the others, but then figure out that if Parker is going to die anyway, why not just kill him?  The captain, Tom Dudley, stabs Parker in the neck with a penknife, killing him for food.  Dudley later said, "I can assure you I shall never forget the sight of my two unfortunate companions over that ghastly meal.  We all was like mad wolfs who should get the most, and for men - fathers of children - to commit such a deed, we could not have our right reason."  

In other words, desperate times lead to desperate measures.  Deep in the throes of starvation, the three men considered something that they would never have imagined before, ultimately committing an act of murder to satisfy their hunger.  Once they were rescued, they were all open and candid about the events that followed the Mignonette's shipwreck.  Two of the sailors ended up going to court to be tried for murder (see the case, R v Dudley and Stephens).  By the time of their trial, public opinion was strongly in their favor - it was almost as if the general public asked the question, "Can we truly say that we would have done differently in the same situation?" 

We see similar issues today in a leadership context.  Failure to satisfy our basic needs leads many of us to lie or cheat - these facts have been demonstrated over and over in both laboratory and real-world settings.  I just came across a research article the other day ("Association of Racial Bias with Burnout Among Resident Physicians") that suggested that residents experiencing signs and symptoms of professional burnout (as determined by the Maslach Burnout Inventory) were more likely to show signs of implicit racial bias.  The study included over 3,300 second-year residents - burnout didn't lead to racist behavior, but there was a strong association with something known as implicit bias, attitudes or stereotypes that affect our understanding, actions, and behavior in an unconscious manner (think: "All people who wear glasses are intelligent.").  In other words, implicit bias leads us to act in ways that we wouldn't ordinarily act.  Burnout, in this case, represents a failure to address some of our most basic needs on Maslow's Hierarchy.

I am not saying that we should excuse implicit racial bias just because someone is "burned out."  I am also not suggesting that we should allow individuals to eat their friends just because they were starving.  What I am saying is that in order for us to be our best selves, we absolutely have to address our basic needs.

If you've read the novel or seen the movie "Life of Pi," you will no doubt recognize the name Richard Parker.  It's the name of the tiger on the lifeboat with the main character.  Or was he really a tiger? 

Pi:  "So tell me, since it makes no factual difference to you and you can't prove the question either way, which story do you prefer?  Which is the better story, the story with animals or the story without animals?"

Mr. Chiba: "The story with animals."

Pi: "Thank you.  And so it goes with God."

Tuesday, July 23, 2019

First pitch

Last Friday night, I got to do something that I had never dreamed of doing.  The Cincinnati Reds hosted our hospital's Employee Appreciation Weekend on Friday night, Saturday night, and Sunday afternoon.  About a month ago, one of the organizers of the event asked me if I wanted to throw out the first pitch at the first game on Friday night (our CEO threw out the pitch on Saturday night).  It was an easy decision, and one that I made in about a millisecond.  Of course, after saying yes, I started thinking about how well I would do throwing a baseball from a pitcher's mound to one of our employees (selected by random drawing) standing 60 feet and 6 inches away at home plate.  Oh, and did I mention that there would probably be at least 15,000 people in the stands waiting for me to screw up? 

I remembered watching a video from a few years back when our city's mayor (at the time) threw out the first pitch on Opening Day.  It wasn't pretty (check out the video here).  To borrow a quote from Bob Uecker's character in the 1989 movie, "Major League", it was "Just a bit outside."  I didn't want to make the same mistake.  So, I did practice in our front yard with my wife catching. 

I've been to enough baseball games to know that the ceremonial "first pitch" happens at least two or three times (on my night, I was the fourth person to throw out the ceremonial "first pitch", so technically, I guess it was the fourth pitch!).  I underestimated my level of adrenaline.  I threw the ball over home plate, but well outside the strike zone.  In fact, the ball went right over my catcher's outstretched glove (she was jumping).  Oh well, at least I made it over the plate.

So, what did I learn from this experience?  These are all things that I've learned before, but my experience at the ball park was an important reminder of several key points.

1.  Take advantage of your opportunities.  It's not every day that you get the chance to throw out the first pitch - some would argue (and I would agree) that this was a once-in-a-lifetime opportunity.  When the opportunity presents itself, say "yes".

2.  Be ready.  When you get an opportunity of a lifetime, it's important to be ready and prepare ahead of time.  Would I have been okay had I not practiced?  I don't know, but I am glad I won't find out.  I felt a whole lot more comfortable knowing that I had made several successful throws at home in my front yard.

3.  Have fun.  If you are not enjoying yourself, whose fault is it?  It's yours.  Have fun.  Relax.  And don't worry about it.

4.  Never take yourself too seriously.  It's okay to laugh at yourself.  Was it a big deal that I threw it over my catcher's head?  Probably not.  Would it have been a big deal if I had bounced the ball in front of the plate?  Probably not.  We both had fun, and we both had an experience of a lifetime.

It was a really good ending to a good week.  Oh, and if you are interested, here's a video:

Thursday, July 18, 2019

"You promote what you permit..."

I was a shy, quiet, overweight kid when I was growing up (now, I'm just overweight!).  It's funny what events seem to stick in your mind as you grow older.  Just the other day, I remembered a time when I was around 8 or 9 years old.  The local YMCA had started, for the first time ever, a youth soccer league.  My team was called the Cosmos (named after the North American Soccer League's New York Cosmos - some of you may remember that the famous Brazilian soccer star Pele played for the Cosmos in the 1970s).  Several of my teammates and I had played on the YMCA youth football league team together earlier in the fall, and we actually had the same coach.  We only practiced two nights per week, and we played our games on Saturday mornings.  Our coach had pulled me aside to give me a special assignment for the upcoming game.  One of the players on the team that we were playing that week was known to be a bit of a bully.  He always played rough, and he even broke the rules at times.  Coach had shown me how to "push back."  To this day, I have no idea what you can and cannot do in soccer, but basically I remember that he told me as long as I kept my arm straight against my side, I could bump him as hard as I wanted to and wouldn't get a penalty.

Well, Saturday finally came.  About halfway through the first half of the game, "Bully" (I don't remember his name) gave me a push.  I followed coach's instructions exactly and pushed back just as hard.  "Bully" fell down; one of my teammates stole the ball and ended up scoring.  I remember coach giving me a high five (or at least the version of "high five" that existed way back then) and telling me, "Way to go!"  The next time, I didn't wait for him to push me.  I gave him another shove and knocked him to the ground.  The coach and several of my teammates cheered for me.  I was beaming!  We won the game (we would eventually finish the year as league champions, so we were pretty good).

We ended up scrimmaging another team at the next practice.  I was playing defense, and one of my school friends made a break for a shot on our goal.  I was the only one between my friend and the goalkeeper.  So, what did I do?  I gave him a rough push, knocked him to the ground, and kicked the ball back to the other side of the field.  He fell to the ground, writhing in pain and crying.  The coach of the other team blew his whistle, came up to me, and yelled, "You can't do that!"  The boy's father came onto the field and also yelled at me, "What are you doing?"  My coach called to me from the sideline and told me to come off the field.  No more high fives this time.  He pulled me aside and said, "You can't do that, son.  That's a foul."  My friend ended up being okay, and I ended up going home.  I remember my father telling me that I shouldn't do things like that.  I knew I had been wrong, but I was completely confused.  Why was it okay during the game and not now?

As I look back on that episode, I can't help but think of the old adage, "You promote what you permit."  It's really true.  Coach gave me specific instructions to play rough, and he even rewarded me when I did play rough.  What kind of response would you expect from an eight-year-old kid?  The same thing happens in organizations today.

Think about it.  If leaders and managers look the other way when one of their employees stretches the rules a little bit, what do you think happens over and over in the future?  If leaders and managers inadvertently reward bad behavior by not calling out their employees when they behave unprofessionally or unethically, what kind of behavior do you think happens again and again?

One of the most difficult things to do as a leader or manager is to call someone out.  But it goes beyond bad behavior.  If an employee sees someone from the leadership team walking through the hallway and ignoring a piece of trash on the floor, what do you think the employee thinks?  You can bet your paycheck that the employee is thinking, "Well, if it's not important enough for you to pick it up, why should I?"

They say that the lessons we learn in childhood are relevant throughout our lives.  I completely and 100% agree.  I learned my lesson the hard way, but I've never forgotten it.  It is up to us, as leaders and managers, to put a stop to bad behavior.  It is up to us, as leaders and managers, to hold our teams accountable when they are being disrespectful.  And it is up to us, as leaders and managers, to set the example.  You promote what you permit.  Don't forget that.

Monday, July 15, 2019

The 70-20-10 Rule

I recently heard about something called the "70-20-10 Rule", developed by the Center for Creative Leadership in the 1980s.  The "70-20-10 rule" describes a leadership and management training model based on a survey of 200 executives, who were asked to report how they believed they learned most effectively.  Based on these results, 70% of leadership development should come from so-called experiential learning, 20% should come from relationships and feedback, and 10% should come from formalized education and training.

Here is the model in more detail:



The major point to be made is that the bulk of leadership development should focus on exposure to a wide range of different experiences (sounds a little like my recent post "Jack of all trades, Master of none", doesn't it?), both within and outside the organization.  We learn best by doing - we learn best "on the job."  In contrast, a much smaller proportion of leadership development should focus on formalized education and training, such as what one would typically get through an executive education or graduate degree program.  These formal training programs should complement, but not replace, experiential learning. 

One last point should be made - the evidence for the so-called "70-20-10 rule" is weak at best.  The sample size was relatively small (only 200 executives were surveyed).  Moreover, the results may have been biased by the fact that they asked already successful executives to reflect on their own personal experiences.  A more rigorous study design - for example, a randomized, controlled trial - would have been preferable. 

Regardless of the lack of evidence demonstrating that this model is the absolute best way to go, I do think the model provides a good place to start when designing a personal or organizational leadership development program.  Development should emphasize experiential learning (70% on-the-job training), but coaching/mentoring (20%) and formalized curricula (10%) should be included as well.
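To make the split concrete, here's a quick (and purely hypothetical) bit of arithmetic: if you were to budget 200 hours of development time over a year under this model, roughly 140 hours (70%) would go to on-the-job experiences and stretch assignments, about 40 hours (20%) to coaching, mentoring, and feedback conversations, and the remaining 20 hours (10%) to formal courses or seminars.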




Wednesday, July 10, 2019

"Art imitates life, or is it the other way around?"

Searching the Internet can be scary.  You never know what you might find.  Okay, bear with me for a minute.  This post may wander around a bit before I get to the point.  Hopefully it's worth it!  I was sitting in our Pediatric Grand Rounds (PGR) yesterday morning - actually, it was my turn to introduce the speaker, who just happened to be our Department Chair.  The title of her talk was simply, "Pediatrics."  Her presentation was phenomenal - check it out if you want, the link on our website will be active for about six months, as all of our PGR presentations now are.  Our Chair happens to be a specialist in pediatric infectious diseases, so her talk focused mostly on how the treatment of infections such as diphtheria, scarlet fever, and tetanus, previously almost uniformly fatal, has evolved over time.  One of her most profound observations, however, was how novels published in the 19th century helped change attitudes around childhood diseases. 

At one point, infectious diseases were thought to be due to character flaws.  Only those with some underlying character flaw or immoral lifestyle died of infectious disease (again, this was long before the French scientist Louis Pasteur's discovery of the germ theory of disease).  However, the publication of novels such as Emily Brontë's Wuthering Heights, Louisa May Alcott's Little Women, and Charles Dickens' Bleak House drew attention to the poor living conditions in the cities of Victorian England (in the case of Bleak House and some of Dickens' other works) or simply showed that characters of good moral character succumbed to some of the common diseases of the time (tuberculosis, commonly known as consumption, in Wuthering Heights and scarlet fever in Little Women).  It probably helped that our Chair was an English Literature major in college.  Regardless, she made the point that art often drives understanding, inquiry, and eventually improvement.

I thought her observations were incredibly interesting.  As a matter of fact, I started trying to remember a common phrase that I had once heard.  Was it "Art imitates Life"?  Or maybe it was the other way around?  I wasn't sure.  So, during my free time at lunch, I looked it up on Google.  Big mistake!  As it turns out, there are two schools of thought.  The ancient Greeks, most prominently Aristotle and Plato, believed in a concept that they called "Mimesis" - the representation or imitation of the real world in art and literature (according to Merriam Webster's online dictionary).  In other words, according to the ancient Greeks, "art imitates life."

Okay, sounds good and makes sense, right?  Well, apparently there is another school of thought, expressed perhaps most famously by the author Oscar Wilde (see his 1889 essay, The Decay of Lying).  Wilde states rather emphatically, "Life imitates Art far more than Art imitates Life."  Okay, now I am confused!  Which is correct - mimesis (Aristotle and Plato) or anti-mimesis (Oscar Wilde, among others)?  It probably doesn't matter, and after reading a little about both, my brain started to hurt.

So, back to the whole point of this post!  What I thought was most profound about our Chair's talk was the fact that she drew attention to something that I feel is very important - the Arts.  We need the Arts.  Recall how the Arts helped change the opinions and attitudes of people living in Victorian England.  The Arts, in this case, led to an appreciation that, counter to what was widely believed at the time, even those of good moral character can die of infectious disease.  One could even argue that this changing paradigm indirectly led to the discovery of the germ theory of disease!  Art is important, because it makes us think.  We can learn to appreciate different perspectives about the world around us.  We can learn about the great world that exists outside of our own immediate environment. 

Our Chair ended her talk with a challenge.  With the growing emphasis on the STEM disciplines in high school and college, coupled with the decreased emphasis (and funding) of the Arts, who will write the next novel or paint the next painting that leads us to shift our paradigm and see the world differently?  It's a good question.  Who indeed?

Sunday, July 7, 2019

"Jack of all trades, Master of None"

In my last post, I admitted that my career has meandered at times.  I would even go further and say that I have reinvented my career at least once, if not twice, as my professional interests have changed over the years.  I have never been 100% comfortable recommending this particular path to younger physicians.  Instead, I have often led these mentor/mentee conversations by asking, "Where do you want to be in your career five years from now?"  It's as if I am recommending that younger physicians map out their careers more rigorously than I did. 

I recently read a great book by the author David Epstein called Range: Why Generalists Triumph in a Specialized World.  It definitely changed my mind on the kind of career planning advice that I will give in the future.  The book focuses on a particularly germane question - is it better to be a generalist or a specialist?  You've probably heard that old saying that labels anyone claiming to be a generalist as a "Jack of all trades, Master of none."  I've even used it myself to describe my own career at times.  Most of the time that this particular phrase is used, it is not meant to be a compliment.  However, did you know that there's a second part to this old proverb?  The complete saying is actually, "A jack of all trades is a master of none, but oftentimes better than a master of one."  In its full form, the saying suggests that being a generalist is not such a bad thing after all.

Epstein provides all kinds of evidence suggesting that being good at many things is better than being great at just one thing.  He starts out his book contrasting the careers of two professional athletes - the golfer Tiger Woods and the tennis player Roger Federer.  Tiger will likely finish his career as one of the greatest golfers ever to play the game - the same is true for Roger in tennis.  Importantly, Tiger started playing golf at a very early age (around age 2) and played nothing else.  Roger played a number of different sports before focusing on tennis relatively late (at least by professional tennis standards).  Tiger was a specialist from the beginning, while Roger, a generalist, developed his athleticism playing a variety of different sports.  Both ultimately found incredible success in their respective sports, but they clearly didn't get there in the same way. 

I know what you are thinking.  It sounds like Epstein doesn't necessarily believe in the infamous 10,000-hour rule, popularized by Malcolm Gladwell (I've posted a few times on Gladwell's 10,000-hour rule - see "Practice makes better, but does Practice make perfect?" and "Premeditated, purposeful, intentionally-focused training").  As a matter of fact, Epstein and Gladwell debated this very topic following Epstein's first book, The Sports Gene.  I wouldn't necessarily throw out the "10,000-hour rule" just yet, but I do think that Epstein makes a compelling case that perhaps we (as most of us are not elite athletes) should focus on being the best person that we can be - almost everything that we learn makes us better, both as individuals and in whatever profession we happen to be working.

Epstein mentions something known as The Dark Horse Project, which I found both interesting and reassuring.  The Dark Horse Project is a research study being conducted by a group of investigators at Harvard whose sole objective is to determine how men and women develop expertise in their chosen fields.  So far, what they have found supports Epstein's argument.  In other words, there are many, many individuals who, like both Roger Federer and me (this is the one and only time that I get to compare myself to Roger Federer), followed a different, at times even meandering, path before finally settling in one particular field or area of expertise. 

The computer scientist, entrepreneur, author, and venture capitalist Paul Graham once wrote a high school commencement speech that he never ultimately gave - it can be found on the Internet (see "What You'll Wish You'd Known").  In the speech, Graham provides the following advice, which I believe brings this post to a fitting close:

Instead of working back from a goal, work forward from promising situations.  This is what most successful people actually do anyway. 

In the graduation-speech approach, you decide where you want to be in twenty years, and then ask: what should I do now to get there?  I propose instead that you don't commit to anything in the future, but just look at the options available now, and choose those that will give you the most promising range of options afterward.

It's not so important what you work on, so long as you're not wasting your time.  Work on things that interest you and increase your options, and worry later about which you'll take.

It's really great advice, when you think about it in the context of everything that David Epstein has said.  Being a "Jack of all trades" isn't such a bad thing after all.  In fact, it may be the best way to get where you are going.

Wednesday, July 3, 2019

You never can tell...

I often meet with younger physicians who are looking for career guidance and mentorship - it's really one of my favorite aspects of my current job.  Our typical conversation starts with my own personal story, not because I think anyone should necessarily model the path I traveled, but rather to give these physicians the idea that you really never know where you will end up in life.  Here's what I mean.


I remember having multiple conversations with classmates and colleagues throughout medical school, residency, and the beginning of fellowship training about laboratory-based research.  I distinctly remember saying on several occasions something to the effect of "Why should I do research?  I want to take care of patients - that's why I went to medical school and not graduate school."  There were other times when I even said, "Why would I want to do basic science research?  I am not a PhD."  Ironically, I was fortunate to be exposed to several excellent physician-scientists during my fellowship training, the so-called "triple threat" physicians who were excellent at patient care, teaching, and research.  At some point, I got hooked on the idea that I too could become a physician-scientist and ultimately chose to pursue a career combining both patient care and laboratory-based research.  I spent the first 10 years or so of my career working at the bedside and the bench.  I was lucky enough to be moderately successful at it too, as at least some of my research was funded by the National Institutes of Health (a common benchmark for physician-scientists).  So, I found myself doing what I never imagined I would be doing.


After my first year of medical school, I was fortunate to be awarded a military scholarship to pay for the rest of medical school.  I completed my residency training while on active duty in the United States Navy, after which I served three additional years as a general pediatrician before starting fellowship training.  I really enjoyed my time in the Navy, and there was certainly an opportunity to remain on active duty, even as a pediatric subspecialist.  As you can imagine, however, the opportunities for career advancement as a pediatrician in the Navy are such that if you want to be promoted through the ranks, sooner or later you will have to go into health care administration.  I remember talking with my wife about the decision on whether to stay in the Navy or become a civilian again.  I don't remember the exact words, but I think I said something along the lines of, "I wouldn't be caught dead in health care administration!"  So, in other words, I chose to forego a career in the Navy - something that I really enjoyed and miss, even to this day - largely because I did not want to go into health care administration.  Again, rather ironically, I spend most of my time these days in health care administration.


I use these stories to illustrate an important point - you never can tell where you will end up.  You can certainly plan what your career path might look like, but there are absolutely no guarantees - your plans are never set in stone.  Plans can change - indeed, plans should change as you learn more.  The important points are:


1.  Always, always, always keep your options open. 
2.  Never exclude yourself from opportunities just because you think you might not be interested.
3.  Be open to change.


My career path has meandered a bit - and that's okay.  I might be the best example of a "Jack of All Trades, Master of None."  My next post will delve into this point further.  But for now, just remember, you never can tell...