The role of expertise and judgment in a data-driven world

Using data can drive better decision making, but numbers alone don’t paint the entire picture when it comes to forming a cohesive strategy. Civis Analytics is a data-science technology firm that works with organizations to understand the meaning behind big data sets and pinpoint the data that merit “listening to,” empowering leaders to make smarter predictions and take action.

In this interview with McKinsey’s Rik Kirkland, Civis Analytics CEO Dan Wagner discusses the synergy between data interpretation and human intuition. Civis Analytics started out on the campaign trail, where Wagner led the Obama for America analytics team. Civis is based in Chicago with an office in Washington, DC. An edited version of Wagner’s remarks follows.

Interview transcript

Strategy in a data-driven world

Strategy to me, when you boil it down to first principles, is three things. Number one, it’s an assessment of what you think truth is today. Number two, it’s a prediction of what you think truth is tomorrow. And number three, it’s a decision of how you’re going to place your resources amongst any number of different alternatives based on your prediction of truth.

Classically, the approach to strategy, establishing truth, predicting what truth would be, and then identifying the different options for the placement of your limited resources, was a human-mediated process. Over the course of years, a class of people would be promoted within a company, based on their expertise, their history, their command of context, to think about what truth was in the world and what, in their subjective judgment, it would be in five years.

[These people would have to think about] what potential options followed from that interpretation of the future, how the organization should spend its resources accordingly, what markets it should enter, what products it should develop, and whom it should hire to satisfy that mission.

I think data has changed how you do that. Measurement has replaced intuition in establishing truth. Algorithmic prediction, which is essentially the use of available bodies of data to predict the future, has replaced, or maybe not replaced but complemented, inference by experts about what’s going to happen tomorrow.

I think data, in the sense of the broad accessibility of digital representations of events, has broadened our ability to understand what different alternatives are available to us, and the value of those different alternatives, through simulation and estimation.
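As a toy illustration of what valuing alternatives “through simulation and estimation” can look like in practice, here is a minimal Monte Carlo sketch. The two options, their demand distributions, and all numbers are hypothetical, not from the interview:

```python
import random

random.seed(7)

def expected_value(mean_demand: float, volatility: float,
                   margin: float, fixed_cost: float,
                   trials: int = 100_000) -> float:
    """Estimate one option's expected payoff by simulating
    uncertain demand many times (simple Monte Carlo)."""
    total = 0.0
    for _ in range(trials):
        demand = max(0.0, random.gauss(mean_demand, volatility))
        total += demand * margin - fixed_cost
    return total / trials

# Two hypothetical places to put the same limited resources.
option_a = expected_value(mean_demand=100_000, volatility=40_000,
                          margin=12.0, fixed_cost=900_000)
option_b = expected_value(mean_demand=60_000, volatility=5_000,
                          margin=15.0, fixed_cost=300_000)

print(f"estimated value, option A: ${option_a:,.0f}")
print(f"estimated value, option B: ${option_b:,.0f}")
```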

When we ask, “What is the nature of strategy in a data-driven world?,” I think the fundamentals of strategy stay the same. What changes is the means by which you do it, and the set of information and personnel you have at your disposal to do it.

How polling has changed

There has been an important systemic change in polling that has compromised the reliability of phone-based political research.

Number one is the replacement of telephone technology. People are dropping their landlines, and they’re using mobile phones.

Two is declining response rates across all forms of survey measurement. It used to be that 10 percent of the people you called would answer a survey. Now it’s 1 percent, or even less.

That’s a phenomenal change in response rates that you have to take into account. And if you don’t know what the response rate is likely to be, your measurements can be way off in terms of the panel of folks that you’re getting.
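To make the stakes concrete, here is a minimal simulation sketch of how a roughly 1 percent overall response rate, combined with a hypothetical tendency for one group to answer the phone more often, can pull a raw survey estimate far from the truth. All numbers are illustrative, not from the interview:

```python
import random

random.seed(42)

# Illustrative population: true support for some proposal is 50%.
# Hypothetical assumption: supporters are twice as likely to answer
# the phone as opponents (1.4% vs. 0.7% response rates, ~1% overall).
POPULATION = 1_000_000
TRUE_SUPPORT = 0.50
P_RESPOND_SUPPORTER = 0.014
P_RESPOND_OPPONENT = 0.007

responses = []
for _ in range(POPULATION):
    supporter = random.random() < TRUE_SUPPORT
    p_respond = P_RESPOND_SUPPORTER if supporter else P_RESPOND_OPPONENT
    if random.random() < p_respond:
        responses.append(supporter)

estimate = sum(responses) / len(responses)
print(f"true support:    {TRUE_SUPPORT:.1%}")
print(f"respondents:     {len(responses):,} of {POPULATION:,}")
print(f"survey estimate: {estimate:.1%}")  # roughly 67%, far from 50%
```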

So, what does this mean for [the] kind of survey and polling research that companies do, when this is the principal means they use to understand consumer attitudes and behaviors?

A few things that you need to think about. First, we work with a lot of companies, and survey research is often presented as a set of bar charts in a PowerPoint deck. If you’re a company, you need to invite healthy skepticism into the process and ask, “What is behind these bar charts? How was the data collected? How representative is this of the general population, or of my consumers? Is there any bias present? What is the projected uncertainty around these bars in the PowerPoint chart that I’m getting?”

Because if the uncertainty is wide, I need to know that. That’s going to help me decide whether I go whole hog toward this or adopt a more risk-abating strategy.
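As a standard illustration of what “projected uncertainty” means for a single bar in such a chart, the half-width of an approximate 95 percent confidence interval falls directly out of the sample size behind the bar. The 62 percent figure below is hypothetical:

```python
import math

def margin_of_error(p_hat: float, n: int, z: float = 1.96) -> float:
    """Half-width of an approximate 95% confidence interval
    for a survey proportion (normal approximation)."""
    return z * math.sqrt(p_hat * (1 - p_hat) / n)

# Hypothetical bar: "62% of consumers prefer product A."
p_hat = 0.62
for n in (100, 400, 1600):
    moe = margin_of_error(p_hat, n)
    print(f"n={n:>5}: {p_hat:.0%} +/- {moe:.1%}")
# n=  100: 62% +/- 9.5%
# n=  400: 62% +/- 4.8%
# n= 1600: 62% +/- 2.4%
```

Note that this captures only sampling error; the nonresponse bias sketched above comes on top of it and does not shrink as the sample grows.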

I think what people need to do is build literacy into their organizations: an understanding of, and a healthy skepticism toward, what this research actually means. And then, finally, recognize that we need to make some big changes in how this type of research is done.

We’re making changes, moving more toward multimodal research with controlled online panels that we can use to assess this. We’re trying to join that to larger bodies of population-level consumer files so that we can detect where those biases occur.
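One common way to act on biases detected by joining a panel to a population-level file is to reweight respondents toward known population margins. Here is a minimal post-stratification sketch; it is my illustration under hypothetical numbers, not a description of Civis’s actual method, and a real pipeline would rake over many margins at once:

```python
# Reweight panel respondents so that one demographic margin
# matches a population-level file. All shares are hypothetical.
population_share = {"18-34": 0.30, "35-54": 0.35, "55+": 0.35}
panel_share      = {"18-34": 0.15, "35-54": 0.30, "55+": 0.55}

# Weight per respondent in group g = population share / panel share.
weights = {g: population_share[g] / panel_share[g] for g in population_share}

# Hypothetical support for a product by age group within the panel.
panel_support = {"18-34": 0.70, "35-54": 0.55, "55+": 0.40}

raw = sum(panel_share[g] * panel_support[g] for g in panel_share)
weighted = sum(panel_share[g] * weights[g] * panel_support[g]
               for g in panel_share)

print(f"raw panel estimate: {raw:.1%}")       # skewed by overrepresented 55+
print(f"weighted estimate:  {weighted:.1%}")  # matches the population mix
```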

Companies, whether working with their consultants or the consultants themselves, are going to have to think deeply about how they revise their survey research to take these biases into account, because the biases are structural, not unique to the political sector.

Nerds and experts

Data is not a religion. It is not a panacea. Data isn’t going to tell you what data you need to listen to; humans are going to tell you what data you need to listen to. And I think there is a symbiosis in executive leadership and strategy: what you’re listening to, what predictions you’re making, with what levels of certainty around that information, and then what decisions you think you can [make] based on that observation.

Data has some natural superiorities over humans for some things. It can see more. It can observe more from a growing body of information-technology sensor networks, traffic lights, et cetera. Humans just don’t have eyes wide enough to see everything that’s going on within a traffic network.

But again, the human has to articulate what’s important and what to measure. I think the Obama campaign is a good model for how the nerds work with the experts. I was a nerd, while David Axelrod and David Simas were the experts. What was special about that circumstance is that we each approached the conversation with the other with appropriate humility.

What I saw the Davids do that I don’t think the data could do was [bring] an understanding of human story and human narrative. The data isn’t going to produce that for you. I saw the way they understood narrative and story, saying, “All right, I see this in your data about these populations, and the two questions you’ve answered, but here are the five I’m not clear about. How can you answer those for me? Here are some of the limits that I see in your measurement,” and having you think about that.

And then conversely, I would ask the same questions of them.

I think in terms of the relationship between the nerds and the experts, that was a pure example of how we used our mutual intelligence, one data-driven and the other based on human experience of story and context, in order to execute properly on what we thought was a good strategy.
