Data from the future in the presidential race

Looking back on the presidential election of 2012, one view attributes the Obama win to his team’s understanding of a phase shift in electoral dynamics: Democrats looked at historical turnout numbers and perceived a systemic change; in contrast, many believed that Republican certainty in a Romney win was based on a reasonably expected regression to the mean. This is the essential idea behind “data from the future”. We ignore these principles, in this system as in many others, at our peril.

In light of this history, it’s worth asking whether the fundamental dynamics of how elections are won are shifting again this year.

I’m gradually migrating the content from the old World Modeler blog to here, so I thought this was a good time for an “encore” of a post on this topic. It’s good food for thought as we move forward into 2016.

Here’s what Mark Zangari and I wrote back then, including a little video we made to illustrate our thinking.


Today, opinions of what happened in the 2012 presidential election are beginning to settle.  Since history is often written from the perspective of the winners, many commentators have attributed the American right’s pre-election false confidence to a sort of blindness or group-think.  But there’s another truth.  Given what was known ahead of the election, the prediction that Romney would win was more defensible than the prediction that Obama would be re-elected.

The video below shows why, and the text that follows explains the basic idea.

The reason is simple. Take a look at this graph, which reflects data published by the Social Science Data Analysis Network (SSDAN).

As you can see, the percentage of African Americans who voted in 2008 is a clear departure from the historical trend.

It’s reasonable to assume that Republicans concluded the pattern would revert to historical levels, which, given the overwhelming amount of data going back more than 30 years, is a completely rational, “big data” conclusion. (Read Nobel laureate Daniel Kahneman’s “Thinking, Fast and Slow” for a clear and deep analysis of this phenomenon, known as regression toward the mean.)

An alternative explanation, of course, is that something exceptional happened in 2008, and that it continued into 2012. Predicting the future requires an additional ingredient beyond simply projecting historical trends:

The Future = Historical Trends + Unique events and decisions happening today
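To make the contrast concrete, here’s a small sketch, using made-up turnout numbers (not SSDAN’s actual data), of what a pure “historical trends” forecast looks like, and how a 2008-style departure shows up against it:

```python
# Illustrative sketch with hypothetical turnout figures, not SSDAN's data.
# A pure "historical trends" forecast extrapolates the past; a structural
# break like 2008 shows up as a large departure from that trend line.

years = [1980, 1984, 1988, 1992, 1996, 2000, 2004]
turnout = [50.5, 55.8, 51.5, 54.0, 50.6, 53.5, 56.3]  # % turnout, made up

# Ordinary least-squares line through the historical points.
n = len(years)
mean_x = sum(years) / n
mean_y = sum(turnout) / n
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(years, turnout)) \
        / sum((x - mean_x) ** 2 for x in years)
intercept = mean_y - slope * mean_x

def trend(year):
    """The 'historical trends' half of the equation: extrapolate the line."""
    return intercept + slope * year

observed_2008 = 64.7  # hypothetical jump: the 'unique event' half

print(f"Trend forecast for 2008: {trend(2008):.1f}%")
print(f"Observed 2008:           {observed_2008:.1f}%")
print(f"Departure from trend:    {observed_2008 - trend(2008):+.1f} points")

# The 2012 forecasting question is then: does turnout regress to the
# trend line, or does the 2008 departure carry forward?
```

The trend line alone is the “big data” forecast; the last lines show the gap that the second term of the equation has to explain.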

Nassim Nicholas Taleb defines a “black swan” as an event that 1) comes as a surprise, 2) has a big effect, and 3) is rationalized by history in hindsight: “of course the right was ignoring the obvious.” The reason black swans happen: ignoring the second half of the above equation. Taleb describes dozens of historical events like this one, in which misunderstanding exceptions to historical rules has led to disaster.

Simply put, the future is not always like the past.  It is essential to know when it is, and isn’t.

Obama’s win qualifies on all three black swan criteria. Because such a high percentage of African Americans voted for Obama in 2008, the miscalculation that this pattern would revert to previous trends in 2012 misled many Republicans into believing Romney’s election was guaranteed, and perhaps even caused the loss by leading to an inappropriately small focus on the needs of African Americans and other minorities.

What’s missing is the ability to analyze the right-hand side of the equation above: how do we know when something new is unique enough that it changes the course of history?
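One simple heuristic for that question (a hypothetical illustration, not either campaign’s actual method) is to flag a new observation as a candidate unique event when it falls far outside the scatter around the historical trend line, say more than three residual standard deviations:

```python
# Hypothetical heuristic, not the campaigns' actual method: flag a new
# observation as a candidate structural break when it sits far outside
# the scatter of residuals around the historical trend line.

def is_structural_break(history, new_point, threshold=3.0):
    """history: list of (x, y) pairs; new_point: (x, y) to test."""
    n = len(history)
    mean_x = sum(x for x, _ in history) / n
    mean_y = sum(y for _, y in history) / n
    slope = sum((x - mean_x) * (y - mean_y) for x, y in history) \
            / sum((x - mean_x) ** 2 for x, _ in history)
    intercept = mean_y - slope * mean_x
    residuals = [y - (intercept + slope * x) for x, y in history]
    sigma = (sum(r * r for r in residuals) / (n - 2)) ** 0.5  # residual std
    x_new, y_new = new_point
    return abs(y_new - (intercept + slope * x_new)) > threshold * sigma

# Made-up turnout history (percent), for illustration only.
history = [(1980, 50.5), (1984, 55.8), (1988, 51.5), (1992, 54.0),
           (1996, 50.6), (2000, 53.5), (2004, 56.3)]

print(is_structural_break(history, (2008, 64.7)))  # large jump: a candidate
print(is_structural_break(history, (2008, 56.0)))  # within normal scatter
```

A threshold like this only raises the question; deciding whether the departure will persist still requires judgment about the unique events driving it, which is exactly the second half of the equation.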

Democrats appear to have mastered this capability. They built an on-the-ground voter turnout operation to bump the curve upward from its expected value in exactly the places that would affect the Electoral College result.  By being able to focus their resources on exactly where they would contribute most towards achieving victory, the Democrats effectively out-modeled the Republicans.

This “greatest bang for the buck” principle permeates many successful models, including our preliminary systems analysis of Liberia.

Before the book is closed on this election, all who chronicle its history should ensure that the modelers on both sides are appropriately acknowledged. From now on, close elections will be won by campaigns that base their strategic decisions on models that are better than those of their competitors.

Drop me a line if you’d like a copy of this, or any other model.

Lorien Pratt

Pratt has been delivering AI and DI solutions for her clients for over 30 years. These include the Human Genome Project, the Colorado Bureau of Investigation, the US Department of Energy, and the Administrative Office of the US Courts. Formerly a computer science professor, Pratt is a popular international speaker and has given two TEDx talks. Her Quantellia team offers AI, DI, and full-stack software solutions to clients worldwide. Previously a leading technology analyst, Pratt has also authored dozens of academic papers, co-edited the book Learning to Learn, and co-authored the Decision Engineering Primer. Her next book, Link: How Decision Intelligence Makes the Invisible Visible (Emerald Press), is in production. With media appearances on outlets such as TechEmergence and GigaOm, Pratt is also listed on the Women Inventors and Innovator’s Mural. Pratt blogs at www.lorienpratt.com.
