How to Blend Data, Experience and Intuition for Better Decision Making

April 21, 2014 5:30 am   |   3 Comments

Pete Eppele, Senior Vice President, Products and Science, Zilliant

Given the value hiding in data, many companies have transformed their decision making by incorporating the insights that data analysis reveals. But others still struggle to make data part of the process, and are slow to adopt, or even resist, the insights data has to offer.

Optimization and analytics company Zilliant has worked with clients to incorporate data insights into their pricing decisions, an area in which many had been reluctant to introduce data analysis. Data Informed sat down with Zilliant Senior Vice President of Products and Science Pete Eppele to discuss why some organizations have been slow to merge analytics with experience and intuition, and how to encourage a more data-driven approach.

Are companies in this era of big data continuing to make decisions based only on experience and intuition? It’s almost like they are ignoring this resource that’s available.

Pete Eppele: Right. I think companies are transitioning, but often slowly, because it’s hard to find the mix. People feel as though you either have to be completely intuition based, maybe with a little data behind it, or you are giving up your decision-making process to a black box. The reality is there’s a middle ground, where you can leverage the scale that data science brings but still visualize, understand, and influence the outcomes of those models. That’s where we find the sweet spot for people, and where we want to get them to: a state where they can leverage all of the value of data science and merge it with their business judgment. There’s an inherent limitation to just looking at historical data and trying to understand what’s going on. That’s valuable, but from our point of view there’s historically so much implicit bias in it that the decision making is sub-optimal.

What kind of bias would you say is part of this process?

Eppele: The bias you see in people’s decisions is that, a lot of times, their recent experiences have a greater impact on their decision-making process. So if I just sold a deal to a large customer who really beat me up on price, I might project that out to everybody: “Everybody’s really sensitive to price,” or, “I really need to be at this certain price level to move this product.” Every situation, every circumstance is different, and one of the nice things about applying analytics is that it can be unemotional, or unattached. It looks at the situation objectively and comes up with a recommendation in that context, without carrying the baggage of what’s recently been put in your mind, but instead equally weighing everything you have observed over a year or two in the recommendation.
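The recency bias Eppele describes can be made concrete with a small sketch. This is a hypothetical illustration (the deal prices and the 0.5 decay factor are invented, not from Zilliant): it contrasts an equal-weighted average of a year of won-deal prices with a recency-weighted average that, like a human, lets the last bruising discount dominate.

```python
# Hypothetical illustration of recency bias in price judgment.
# All numbers are invented for the sketch.

won_prices = [104, 101, 98, 102, 100, 99, 103, 85]  # most recent deal last: a big discount

# The "unemotional" analytical view: every observation weighted equally.
equal_weight = sum(won_prices) / len(won_prices)

# The recency-biased human view: each older deal counts half as much
# as the one after it, so the latest deal dominates the estimate.
weights = [0.5 ** (len(won_prices) - 1 - i) for i in range(len(won_prices))]
biased = sum(w * p for w, p in zip(weights, won_prices)) / sum(weights)

print(f"equal-weight estimate:   {equal_weight:.1f}")
print(f"recency-biased estimate: {biased:.1f}")
```

One heavily discounted recent deal pulls the biased estimate several points below the equal-weighted view of the same history, which is exactly the “everybody’s really sensitive to price” projection.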

Are we talking about trying to look forward as opposed to looking backward?

Eppele: It really is. I think it’s creatively using the data in ways that give you a sense of the future, so you are not just looking at what’s happened in the past, but also at how things are trending and which direction they are moving. The threats to your business aren’t necessarily what happened in the past; it’s the confluence of things that happen in the future. If you are able to see trends in your data, and to connect them with other environmental factors that will have an impact and change things from history, that’s when you can get in control of situations and get out ahead, instead of just waiting and reacting. When you are in a position of reaction, it might take three months, or even six, for that information to filter through to analytical systems. The shorter the time it takes to detect a market change and do something about it, the more value you add, especially in the world of pricing, because small changes in price make huge profitability impacts.
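The early-detection idea above can be sketched with a simple window comparison, a hypothetical illustration rather than Zilliant’s method: instead of waiting for a shift to dominate the full history, compare a short recent window against a longer baseline and flag when they diverge. The window sizes and 5 percent threshold are invented for the example.

```python
# Hypothetical sketch of early market-shift detection.
# Window sizes and threshold are illustrative, not from the article.

def detect_shift(prices, recent_n=4, baseline_n=12, threshold=0.05):
    """Flag a shift when the recent-window average moves more than
    `threshold` (as a fraction) away from the longer baseline average."""
    if len(prices) < recent_n + baseline_n:
        return False  # not enough history to compare
    recent = sum(prices[-recent_n:]) / recent_n
    baseline = sum(prices[-(recent_n + baseline_n):-recent_n]) / baseline_n
    return abs(recent - baseline) / baseline > threshold

# Market price drifts down over the last four periods:
history = [100] * 12 + [97, 95, 93, 91]
print(detect_shift(history))  # the drift is flagged after four periods, not six months
```

The point of the sketch is the trade-off Eppele names: a short detection window reacts in a few periods, while averaging over the whole history would mute the same signal for much longer.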

How do you find that sweet spot between experience and leaning on data?

Eppele: One thing we see fairly often in businesses is that, when you think about all the complexity – all the products, all the customers, everything – humans just don’t have the capacity to think about every decision. What often happens is that people apply their focus to the bigger, more qualitative aspects of the business. They might put more of their personal focus on the highest-velocity products, or on the biggest customers, and allow optimization to handle a lot of the smaller elements of the business. So they are applying analytical rigor across everything, and using the recommendations along with their own intuition in the larger parts of the business. Meanwhile, on the long tail of the business – where there’s tremendous opportunity to be had, and where historically they have spent little to no time – they now get the benefit of the machine doing the analysis and coming up with price recommendations, improving performance without a lot of personal investment on their part. A lot of our customers have shifted their focus in that way. Where they may have tried to cover a broad span but were only scratching the surface, now they can cover a much wider span of their business and focus their time, effort, and energy where it’s most merited – where there might be things outside the scope of the data, or strategic changes they want to make – instead of having to manage the whole complex swath of the business.

I imagine that when you are dealing with a new client and trying to wean them off taking responsibility for all the decision making and trusting data to do some of the heavy lifting for them, there’s a cultural shift, a challenge that exists there. What advice do you offer them in saying, this is how you go about it, how you learn to let go of that control?

Eppele: I think transparency is really key to that because, like you said, it’s almost a fundamental change in how people manage the business, and in some respects it’s threatening to people as well. There’s a massive organizational change associated with it, and it’s for the better. It’s about getting decision makers comfortable with the idea of data being behind the decisions, but also giving them the transparency to see exactly how decisions are being made and exactly how data is being used. We never work with a customer without some situation where we do a deep, deep dive. We will give them a price recommendation. They will say, “That is wrong. That price doesn’t make any sense.” Then we go diving down into the data, literally down to the transaction lines. It takes that kind of digging, a lot of times, to establish trust. And we welcome that. We encourage it, because without the trust you struggle to get adoption of the system. With that transparency, you develop the trust, and eventually you get to the point of transformation where people feel comfortable and confident with the system.

Do you find you get a lot of resistance at first?

Eppele: I would say it varies. People who define their value to the company in terms of the intuition, the feel, and the art they bring are the most threatened and the most resistant in a lot of cases. The moment it clicks is when people realize, “I’m not perfect, this thing isn’t perfect, but me and it together is going to make me more money.” Getting to that moment is what’s key, and that’s where the change management process comes into play. We talk about what thinking goes into the decisions we make and what factors we consider – the customer’s geography, size, and recent purchase habits – so we can help people understand that this thinks like they think. And if they want to see how it thinks specifically, we can drill down into that, and hopefully within weeks and months they can see how they are getting value and benefiting from it being in place.

Do you find that there’s a large learning curve for companies that adopt this kind of decision making?

Eppele: One of the things that we focus on is making it very easy for people to be able to move over to the new world. A lot of times what we will do is fit into an existing business process and an existing structure and framework that they already have. If your expectation is that you are going to have people completely rethink the pricing process or how they go about interacting with their customers, I think you have a steeper learning curve ahead of you.

What’s the best process for the most effective decision making?

Eppele: I think the most effective processes are a mix of man and machine, because there’s always intuition embedded in the process. Our approach is to surface to companies what we believe to be the biggest opportunities for them, and then allow them to pick and choose what they act on and how. That picking and choosing is ultimately where intuition gets integrated into the process. It’s a mix of expertise and data science, but the data science has to be … an evolution toward bringing together different sources of data, making forward predictions, and then matching those predictions against your goals to get recommendations – it’s about creating a prescriptive system. That’s one of the failings of the analytical systems people often use: they are not prescriptive at all. They simply say, “Here’s data. Draw your own conclusion from it.” When you have algorithms draw the conclusions and then couple that with a person’s judgment, that’s what we have found to be the best way to enhance your decision-making process.
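The prediction-plus-goals loop Eppele describes can be sketched in a few lines. This is a toy illustration, not Zilliant’s model: the logistic win-probability curve, the reference price of 100, and the sensitivity parameter are all invented assumptions. The prescriptive step is simply scoring each candidate price against the goal (expected profit) and recommending the best one, rather than handing the user raw data.

```python
# Hypothetical sketch of a prescriptive pricing step:
# prediction (win probability) matched against a goal (expected profit).
import math

def win_probability(price, ref_price=100.0, sensitivity=0.15):
    """Toy logistic demand model (an invented assumption): the further
    above the reference price, the less likely the deal is won."""
    return 1.0 / (1.0 + math.exp(sensitivity * (price - ref_price)))

def recommend_price(cost, candidates):
    """Prescriptive step: score each candidate price by expected profit
    and return the recommendation, instead of just showing the data."""
    return max(candidates, key=lambda p: (p - cost) * win_probability(p))

candidates = [p / 2 for p in range(180, 241)]  # prices 90.0 .. 120.0 in 0.5 steps
best = recommend_price(cost=80.0, candidates=candidates)
print(f"recommended price: {best:.1f}")
```

A merely descriptive system would stop at plotting `win_probability`; the prescriptive version closes the loop by returning an actionable number, which a salesperson can then accept or override with judgment.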

Scott Etkin is the managing editor of Data Informed. Follow him on Twitter: @Scott_WIS.


  1. Jim Vaughn
    Posted April 27, 2014 at 9:45 pm | Permalink

    Recently I wrote about how weather predictions have improved dramatically since World War II. Back then, pulling together a forecast was very laborious because of a lack of reliable weather data. The Air Force Weather Service would launch trial balloons and use a device similar to a telescope to visually track each balloon’s rate of ascent, direction of drift, and speed. And they would have to get data from other people spread out all over Europe before they could try to predict the next 24 hours’ weather just for their small corner of the war. In short, it was very hard to be accurate.

    Since the mid-’70s we’ve had more and more weather satellites, and now we have access to all sorts of weather data. So much so that people joke about how the Weather Channel and others have made news out of weather – even before anything happens! Weather satellites gather data continuously, and stations all over the world record ground temperatures, ocean temperatures, winds, humidity, pressure, and cloud cover. We gather weather data not just 24 hours a day – we gather it at least 1,440 minutes a day – and we hold onto all of that data. By combining access to all that data with powerful new weather prediction models, we can make very accurate forecasts. But we still want a human in the loop to interpret those forecasts. Ultimately a person needs to make a judgment on how to use the competing forecasts to improve decision making – should I take an umbrella, or should I evacuate ahead of the coming hurricane?

    Over the last 20 years, businesses have similarly been putting in place large ERP systems, Business Intelligence-focused data warehouses, and quoting systems. These systems gather transactional data – and not just 24 hours a day or 1,440 minutes a day, but more often than that. That data can be put to use to help the sales team be smarter when negotiating. Forward-leaning B2B companies are starting to do what B2C companies like Amazon, Walmart, and others have been doing for the last decade: leveraging that data and turning it into actionable pricing guidance to help salespeople make smarter price-setting and negotiation decisions – not just deciding whether to carry an umbrella. This requires business decision makers to develop some new skills and move from a gut-driven to a data-driven view of what’s best for business profitability … but many are doing just that!

  2. chen Linchevski
    Posted May 1, 2014 at 12:41 am | Permalink

    We believe that the best practice is to inject the experience into the data before you model it. The results of the analysis are much more accurate and the business acceptance is much better.

  3. Jim Vaughn
    Posted May 8, 2014 at 10:49 pm | Permalink

    Chen, I agree that a well-designed pricing model should incorporate significant information from “experts” – in this case, the sales reps contributing their collective experience. The transactional data captures much of that expertise, but not all. Could someone inform the pricing solution with third-party economic trend data? Sure! Could someone inject factors from other models into the pricing model? Sure! But even after those tasks are done, someone has to interpret the pricing results and make a judgment call, because there are inherently factors – like locality or data lag – that typically can’t be accounted for in the model. As George Box said in his 1987 book Empirical Model-Building and Response Surfaces, “Remember that all models are wrong; the practical question is how wrong do they have to be to not be useful.”

    Let’s talk about my weather analogy again. There are numerous major weather models in use; they all share many of the same core data elements, but each may use some unique data or differing assumptions, based on how the designers weighed the relative importance of certain facets of the weather problem. Despite how sophisticated weather models are, some work better under certain circumstances than others. National and regional weather models probably don’t contain enough detail to capture very local ground features that might shield one neighborhood or valley from the strongest gusts or the largest runoff. But a local forecaster likely has insights from experience that, combined with the results of two or three major weather forecasts, allow him or her to tweak the predictions for the likely effects in the immediate vicinity. That is the man-AND-machine advantage.
