Guest Post: The Decision Command Fallacy (Part 1)
How to avoid unintended consequences of decisions in complex environments

Many years ago, I was hooked on reading crime thrillers, especially the kind that ended with an unexpected twist. One of these in particular has stuck in my mind to this day because the idea behind it resonated with so much of what I was discovering in my academic and professional life, both at the time and since.

The plot line involved a rich man—let’s call him Joe—who was having an affair with his wife’s best friend. The pair had hatched a plan for Joe to leave his wife but without splitting his fortune with her, as would be the outcome of a divorce. Instead, their plot was to have Joe’s wife murdered. Joe arranged it all, emphatically instructing the hit man that there could be absolutely no witnesses linking him to the crime. A few weeks later, the hit man reported to Joe that the job had been done and that there would be no witnesses. But before Joe could celebrate, the hit man added that there had been a slight complication. When the hit man arrived, there was another woman at the house, but Joe needn’t worry. The hit man had done as he was told and finished her off too. Joe’s mistress was dead. His plan, carried out to literal perfection, had ended up costing him the very objective the plan was supposed to achieve.

This anecdote—while fictitious—highlights one of the gnarliest challenges that decision makers dealing with complex systems face. Namely, the “direct” path to the objective may have unintended consequences that annihilate the benefits of achieving the objective (literally, in the case of our crime story). Another frequent pattern occurs when a “direct” decision causes the system to adapt and work around the “direct” intervention, often resulting in an even worse situation than the original one.

Prohibition

The US experiment with prohibition from 1920 to 1933 is such a case. In response to the belief that the consumption of alcohol had detrimental effects on society, the US Federal Government enacted a change to the nation’s constitution making alcohol illegal throughout the country. Far from achieving the desired goal of eliminating alcohol and the social ills it was supposed to be responsible for, the alcohol economy simply moved “underground”. Alcohol was still consumed in large quantities, but now illegally, raising its price to the point that alcohol production and distribution funded the birth of huge new organized crime operations that are still in existence today, generations after prohibition itself was ended.

The cautionary tale in these examples is that, while clearly understanding the objectives we are trying to achieve is a necessary first step in making effective decisions, it is not enough. In all but the most trivial situations, we must also try to understand how the system we are about to change will react to that change. Without this understanding, as the above anecdotes illustrate, making the “obvious” decision may not only fail to achieve the objective it was meant to accomplish, but may actually make things much worse than they were to begin with. This is a fundamental characteristic of complex systems, one we ignore at our peril, and anticipating it often requires a simulation tool to be effective.
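To make the idea of simulating a system’s reaction concrete, here is a deliberately simple sketch (a hypothetical toy model, not any real Decision Intelligence product) of the prohibition dynamic: a “ban” intervention does not erase demand, it moves supply underground at a markup, and the model lets us compare consumption and illicit revenue before and after the decision.

```python
def simulate(ban: bool, base_demand: float = 100.0, legal_price: float = 1.0,
             black_market_markup: float = 4.0, elasticity: float = 0.2) -> dict:
    """Toy market model: returns consumption and illicit revenue under a policy.

    All parameters are illustrative assumptions. `elasticity` is the fraction
    of demand lost per unit of price increase (a crude linear response).
    """
    if not ban:
        # Legal market: demand is fully served at the legal price.
        return {"consumption": base_demand, "illicit_revenue": 0.0}
    # Banning shifts supply to a black market at a higher price.
    price = legal_price * black_market_markup
    # Demand shrinks with price, but does not vanish.
    consumption = max(0.0, base_demand * (1 - elasticity * (price - legal_price)))
    return {"consumption": consumption, "illicit_revenue": consumption * price}

before = simulate(ban=False)
after = simulate(ban=True)
print(before)  # consumption 100, no illicit revenue
print(after)   # consumption only partly reduced; large illicit revenue
```

Under these toy assumptions the ban cuts consumption to roughly 40% of its original level, yet the illicit revenue it creates (about 160) exceeds the value of the entire original legal market (100)—a crude illustration of how a “direct” intervention can fund exactly the outcome it was meant to prevent. Even a back-of-the-envelope model like this surfaces the adaptive response before the decision is made.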

Decision Intelligence software does this with substantially less time and effort than was previously possible. It arms decision makers with foresight that helps them better predict the consequences of their decisions. By approaching decisions at a systemic level, a well-constructed decision model can alert decision makers to outcomes that the actions they are about to take would produce, but that run counter to their objectives.

Continue to Part 2, where I show some important decision failure patterns >>

Mark Zangari
