Thursday, September 15, 2022

Sim City

Do you remember the computer game "SimCity"? There were several different versions, but the object of the game was essentially to design and build your own city.  The player, acting as the city's mayor, starts off with a blank geographic map (in later versions of the game, you could even change the geography) and can then designate certain areas as industrial, commercial, or residential, after which it's time to build away!  Players frequently have to contend with natural disasters, such as tornadoes, fires, earthquakes, and floods (there was even a giant monster that could attack the city).  They can build amusement parks or industrial parks, marinas or golf courses, and apartment complexes or residential neighborhoods.  It's a really fun game - see the screenshot below of a newly constructed city:

[Screenshot: a newly constructed city in SimCity]

Don't worry - this post is not just about the SimCity computer game.  In my last post, I mentioned that I recently finished the book The Logic of Failure by Dietrich Dörner.  The book's main focus is on why we make mistakes, at times with catastrophic consequences, and Dörner supports his theoretical concepts primarily with the results of two groups of experiments.  I was reminded of SimCity while reading the book, because both groups of experiments used a simulation, much like the SimCity game, in which subjects became the leaders of a fictional country (in the first group, whose results were also published in the journal Philosophical Transactions of the Royal Society) or a fictional town (in the second group).

The first group of simulations took place in the fictional African country of Tanaland.  Leaders (i.e., the study participants) were tasked with promoting the health and well-being of Tanaland's inhabitants and the surrounding region.  For example, they could improve the fertilization of the fields and orchards, install irrigation systems, or build dams.  They could introduce measures to improve access to medical care or build infrastructure such as power plants and roads.  Leaders were given free rein to introduce as many measures as they wanted at six "planning sessions" over the course of a simulated ten-year period.  In this way, they could evaluate the success or failure of each measure at regular intervals and cancel or modify earlier decisions.

Several metrics were followed over the course of the simulation, including crop yield, population, and birth rate.  The individual results of these simulations are illustrative.  For example, one leader introduced measures to improve medical care and the food supply in the region.  Initially, both the birth rate and life expectancy increased.  However, once the population of Tanaland hit a certain threshold, there was no longer enough food to support the growing population, and a famine occurred.  As Dörner himself explained, "Catastrophe was inevitable because a linear increase in the food supply was accompanied by an exponential increase in the population."
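To see why that crossover is unavoidable, here is a minimal back-of-the-envelope sketch (the numbers are invented purely for illustration and are not from Dörner's actual simulation): the food supply is increased by a fixed amount each year, while the population grows by a fixed percentage each year.

```python
# A minimal, made-up sketch (not Dörner's actual model) of the dynamic he
# describes: the food supply grows linearly while the population grows
# exponentially, so the population eventually outstrips the food supply.

population = 1000      # initial inhabitants (hypothetical number)
food_supply = 1400     # units of food; assume 1 unit feeds 1 person per year
growth_rate = 0.10     # 10% annual population growth (hypothetical)
food_added = 60        # extra units of food produced each year (hypothetical)

for year in range(1, 11):   # a ten-year period, like the simulation
    population = round(population * (1 + growth_rate))  # exponential growth
    food_supply += food_added                            # linear growth
    if population > food_supply:
        print(f"Year {year}: population {population} exceeds food supply {food_supply} -- famine")
        break
    print(f"Year {year}: population {population}, food supply {food_supply}")
```

With these particular numbers the famine arrives in year 6; changing the numbers only shifts the timing, not the outcome, because a quantity growing by a percentage will always eventually overtake one growing by a fixed amount.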

Similarly, the second group of simulations took place in the fictional English town of Greenvale.  Once again (and unrealistically, but that's the nature of the simulation), leaders (study participants) could exercise near-dictatorial powers for ten years.  The town's single biggest employer was a municipally owned watch factory.  Leaders could adjust local tax rates, change the hiring practices at the watch factory, introduce measures to improve medical care, or build more houses.  Again, just as in the Tanaland simulation, leaders frequently succumbed to the law of unintended consequences.

The key lesson from these simulations is that systems are prone to human failure when they are complex (see again my last post), dynamic (i.e., evolving over time), and intransparent.  The word "intransparency" means "lack of transparency," which in this context refers to the fact that in complex systems in particular, information is often incomplete or hidden from view.  Economists refer to something similar as "information asymmetry," in which two individuals, groups, or teams have unequal access to information.  Collectively, these characteristics of complex systems contribute to what Dörner called "the logic of failure," which he defined as the tendencies and patterns of thought (a natural result of our evolution), such as dealing with one thing at a time, simple cause and effect, and linear thinking, that were probably appropriate in an older, simpler world but can prove disastrous in our complex world today.  For example, the study participant who increased access to medical care and improved irrigation in Tanaland didn't anticipate the effect of both initiatives on what is now called the Malthusian Trap (named after the 18th-century economist Thomas Malthus, who first described it): the population of Tanaland grew exponentially, while the food supply continued to grow only linearly.

Fortunately, there are some tricks and tools of the trade that leaders can use to help them make decisions effectively, even when the situation is complex.  Dr. Dörner had some suggestions of his own, and we will talk about his recommendations, as well as those from other experts, in my next post.
