Guest Post: Connecting Decisions to Data: a Case Study (Part 1)

In my last post I looked at the problems in getting the data to support evidence-based decision making. We get that data through decision modeling, so here I’ll begin a deep dive into it, creating a decision “blueprint” that solves Problem 1 from that post: How can decision makers identify the data they need to support their decisions?

Let’s look at a fictitious—but typical—case study that will have elements familiar to anyone who has been involved in making important decisions. 

Before I begin, an important note: while the scenario in the case study considers how to plan a profitable launch for a new product, you don’t need to be a product management expert to follow it. The issues it raises apply to evidence-based decisions of all kinds, whether you’re an HR director creating a new policy at a company, a lawmaker deciding on climate strategy, or a building manager deciding how to improve ventilation to reduce Covid-19 infection. What’s common to all these decisions—including this one—is that they involve actions that play out through time in non-obvious ways, and that you have some data that you think will help. Giving an example like this is always a challenge, because people tend to assume it applies only to that particular problem domain. But if you bear with me, you’ll see that the issues of how we use data for complex decisions illustrated here are universal.

Case Study: Planning a new product’s first production run

Suppose you are responsible for a new product that has reached a milestone in its development lifecycle. You’re ready to build it and sell it. You are faced with three decisions:

  1. How much should I charge for each unit of the product?
  2. How many units should I order for my first run?
  3. How big an investment should I make in marketing the product?
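
Since these three choices will eventually be judged against a single outcome, it can help to make that outcome explicit. As a back-of-the-envelope sketch (every name and number here is hypothetical, not taken from the case study), the three decision variables combine into profit roughly like this:

```python
def profit(price, units_ordered, marketing_spend, unit_cost, units_sold):
    """Toy profit calculation: revenue minus production and marketing costs.

    All parameters are illustrative. Note that you can't sell more units
    than you ordered, but you pay production cost for every unit ordered.
    """
    revenue = price * min(units_sold, units_ordered)
    production_cost = unit_cost * units_ordered
    return revenue - production_cost - marketing_spend

# e.g. price $50, 1,000 units ordered, $10,000 marketing,
# $20 unit cost, 900 units actually sold:
profit(50, 1000, 10_000, 20, 900)   # 45,000 - 20,000 - 10,000 = 15,000
```

Even this trivial sketch exposes the real difficulty: `units_sold` is not a decision you make; it is an uncertain outcome influenced by your price and your marketing spend. That is exactly the gap a decision model has to fill.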

You must submit your choices to your senior management team for approval, and they will require you to convince them with quantitative evidence both that your plan will be profitable, and that your decisions do not expose the company to unacceptable risks. Luckily, your organization has a top-shelf analytics and data science team along with a suite of state-of-the-art Business Intelligence software, and your manager has made these resources available to you to complete this task.

Tool 1: The Decision Model

Let’s assume you have a team of data analysts in your office, eager to help you. What should you ask them to do? You explain the decisions you need to make, and they respond by describing the extensive range of data sources from which the answers you need can be obtained. Unfortunately, none of these contain past examples of the problem you are trying to solve, so they can’t provide you with charts that directly relate the variables in decisions 1–3 to the outcome you are measuring—profit.

You are now facing a situation very familiar to most business leaders:

With a vast collection of data available to you, along with powerful analytics, business intelligence, even AI, how do you determine what data you should use, and how this should be processed to give you the information you need to make the right decision?   

In addition to this, you are being advised by vendors eager to sell data services that “your decision is only as good as your data”, and therefore, before using data of any kind to support your decision, you need to engage in a time-consuming and expensive data cleansing exercise.  Is this true?

Returning to the analogy with software engineering from my first post in the series: had you asked your IT team to write a business application for you, the conversation would most likely have been quite different. They would have asked you to review use cases, mapped out workflows, and walked you through a series of exercises. When all that work was done, they would have shown you a set of diagrams that you could clearly follow, and which they could use to design and build the software that would meet your needs.

Why was the interaction with the data science team so different? The answer is that the software engineers draw on a mature, well-tested methodology for capturing and recording business requirements, and have tools at their disposal for implementing it.  By contrast, the data scientists and analysts have almost none of these. 

How can we equip data science with a systematic approach for both capturing the needs of decision makers, and from these, specifying the deliverables they must create to meet these needs?

Again, we can borrow from software engineering.

At the foundation of modern software engineering is the Object Oriented (OO) paradigm, which is sometimes represented in the diagrammatic Unified Modeling Language (UML). OO is a set of principles for describing the elements (or “objects”) in the domain within which a software application will work, how they behave, and how they relate to one another. In an industry where the shelf-life of an idea is measured by the time it takes to develop the next idea that supersedes it, OO has held its dominant position for more than three decades, and there is no sign of this changing. Its long-lived success is grounded in four key characteristics:

  1. It matches the natural, intuitive way people who are not software engineers think about the needs the software must satisfy;
  2. It contains sufficient detail for technical personnel who need to create the solution in code;
  3. It organizes information in a way that can be translated into algorithms and therefore, implemented on a computer;
  4. It is most naturally represented diagrammatically (in UML) which facilitates clear, transparent communication both between the end-user community and the developers, and also among the developers themselves.

If we apply these characteristics to decision making and data science, there are four objectives a successful approach must achieve, each corresponding to one of the above points:

  1. It defines a general way of representing decisions that corresponds to how decision makers intuitively think about them;
  2. It contains sufficient detail for data scientists and other analysts to be able to efficiently develop the information assets required by the decision maker;
  3. It organizes information in a way that can be translated into algorithms and therefore, implemented on a computer;
  4. It is most naturally represented diagrammatically (in, as we shall see, Decision Diagrams) which facilitates clear, transparent communication both between the end-user community (decision makers) and the analysts, and among the analysts themselves.

The very concept of a “decision” entails that the outcomes being sought are, to some extent, causally related to the chosen actions, even if there is an element of randomness: the actions taken must be able to influence the outcomes being sought. Indeed, when decision makers consider their decisions, they almost always try to piece together the causal relationships between the actions they can take and their respective outcomes. (For more background see Dr. Lorien Pratt’s Link, the original book on Decision Intelligence.) It follows that effective support for decision making boils down to:

  • helping human decision makers to map the causal pathways between actions and their outcomes, and
  • using this mapping to more accurately compute the expected outcomes corresponding to any set of actions.
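
The two bullets above can be sketched in miniature. In the toy model below, every causal assumption and every number is illustrative (not from the case study): price is assumed to reduce demand, marketing spend to lift it, and demand to carry random noise. Monte Carlo sampling then turns that causal mapping into an expected outcome for any set of actions:

```python
import random

def expected_profit(price, units_ordered, marketing_spend,
                    n_trials=10_000, seed=0):
    """Monte Carlo estimate of expected profit under a toy causal model.

    Illustrative causal assumptions:
      - higher price lowers demand,
      - marketing spend raises demand,
      - demand carries Gaussian noise,
      - unsold units are a sunk production cost.
    """
    rng = random.Random(seed)
    unit_cost = 20.0
    total = 0.0
    for _ in range(n_trials):
        base_demand = 2000 - 20 * price           # assumed price sensitivity
        lift = 0.01 * marketing_spend             # assumed marketing effect
        demand = max(0.0, base_demand + lift + rng.gauss(0, 100))
        units_sold = min(demand, units_ordered)   # can't sell more than built
        total += price * units_sold - unit_cost * units_ordered - marketing_spend
    return total / n_trials
```

The point of the sketch is not the numbers, which are invented, but the shape of the exercise: once the causal links are written down explicitly, comparing candidate actions (say, ordering 1,000 units versus 3,000) becomes a computation rather than a guess, and each assumed link tells you precisely which data you need to estimate it.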

This mapping is referred to as a Decision Model and, along with a Decision Model Diagram notation, satisfies requirements 1–4 above.

In the next post, we’ll build the decision diagram for this first production run case study.

Mark Zangari
Co-Founder and CEO at Quantellia, LLC | Website

In addition to his duties as CEO, Zangari leads Quantellia LLC's Scalable Solutions division, where he is responsible for financial technology, telecom, and Covid-19 solutions. Zangari is also the architect of the company's World Modeler solution suite.  Before joining Quantellia, Zangari spent 15 years as CTO for a spatial GIS company, where he specialized in providing solutions to utilities and telecoms.

A physicist by training, Zangari's papers on cosmology are still referenced today.
