Corporate transformations are unlikely to succeed if people can’t change their behaviors. But leaders are often at a loss to understand how investing in capability building can enable the necessary behavior change. This disconnect may not be related to an unclear strategy or intention; in many cases, it may simply be a data problem.
At one aerospace organization, for instance, leaders wanted to boost the effectiveness of individuals and teams. They launched a series of training modules to help employees build new capabilities in communications, prioritization, problem solving, and meeting hygiene. The data on these capability-building efforts showed that employees were completing a high percentage of the modules and feeling positive about their instructors and the overall learning experience. For the most part, however, employees’ day-to-day behaviors were not changing.
The data that leaders typically use to measure progress on capability building—metrics like user clicks and progress rates through learning modules—often do not paint a full picture of how and whether behaviors are changing. And without this comprehensive view, organizations may miss meaningful opportunities to instill new ways of working, which can ultimately lead to stagnation. This is where the field of behavioral insights can be a game changer.
Behavioral insights is a relatively new approach to analytics pioneered by psychologists and social scientists. It takes patterns of human behavior into account when analyzing raw data. Perhaps the best examples come from the world of sports. Teams in Major League Baseball (MLB) and the National Basketball Association (NBA) use analytical models that consider players’ individual in-game statistics alongside intangibles such as performance under pressure or contribution to team chemistry. Before considering trades, contract renewals, or other organizational changes, analysts in these leagues can use the contextualized data from behavioral-insights models to differentiate between a good player on a bad team and a weaker player whose performance is buoyed statistically by the team’s success.
Business leaders can similarly use behavioral-insights models to assess employee performance and support corporate change initiatives. Based on our experience, there are three concrete ways that business leaders, working with IT, HR, and other functional leaders throughout an organization, can structure their analytics programs to reflect a behavioral-insights approach.
First, to set a long-term data strategy, leaders can create “end statements” that define the kinds of business questions they are seeking to address and, therefore, the kind of data they need to collect. Second, to better understand when behavioral changes could be meaningful and in what context, leaders can use comparative analytics to look at important statistics over time. Finally, leaders should ensure that any employees who are working with data are also trained to understand the key terms and principles of behavioral and organizational science.
Focusing on all three areas can help organizations and their leaders unlock next-level insights and analytics. For instance, this three-pronged approach helped the aerospace organization increase the number of capability-building “champions” in the company by almost 40 percent. These are employees who, after previously reporting little or no change in their daily behaviors, later reported material changes in their behaviors, as well as a strong desire to continue improving.1
Build your data strategy using key hypotheses, or ‘end statements’
Many organizations build extensive dashboards to track key performance indicators but often do not examine whether the data they collect are actually informing their decisions or just creating noise. Before deciding which data to gather, leaders should identify a list of questions they want the data to answer—what we call “end statements”—and then work backward to isolate the information they need to answer those questions. This list can be the backbone of an organization’s behavioral-insights plan.
End statements should not confirm an existing bias; rather, they should be designed to ensure that the insights gathered are relevant and actionable (see sidebar, “Another word on end statements”). The aerospace organization, for instance, began with the following end statement: “We need to understand what is preventing our employees from holding regular feedback meetings.” Was it a matter of employees’ or leaders’ lack of capacity or interpersonal skills—or something else entirely? With this end statement in mind, the company asked employees whether they felt they needed more time to hold feedback meetings, more information on how to conduct effective feedback meetings, or rewards and recognition for having feedback meetings. Armed with those answers, leaders were better able to make informed decisions about which obstacles they needed to remove. In this case, that meant establishing capability-building programs that would help employees understand how to conduct feedback meetings more effectively.
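The working-backward logic described above can be sketched in a few lines of code. The sketch below, a simplified illustration rather than the organization's actual data model, pairs the aerospace company's end statement with the candidate obstacles it tested and the survey question that probes each one; the obstacle labels and the `is_relevant` helper are assumptions introduced for illustration:

```python
# Illustrative sketch: working backward from an "end statement" to the
# data worth collecting. Obstacle labels and helper names are assumed
# for illustration, not taken from the organization's systems.

end_statement = (
    "We need to understand what is preventing our employees "
    "from holding regular feedback meetings."
)

# Each hypothesized obstacle is paired with the survey question that tests it.
hypotheses = {
    "capacity": "Do you feel you need more time to hold feedback meetings?",
    "skills": (
        "Do you need more information on how to conduct "
        "effective feedback meetings?"
    ),
    "incentives": (
        "Would rewards and recognition for holding feedback "
        "meetings make a difference?"
    ),
}

def is_relevant(metric_tag: str) -> bool:
    """Gather a metric only if it answers one of the end statement's questions."""
    return metric_tag in hypotheses

print(end_statement)
for obstacle, question in hypotheses.items():
    print(f"- {obstacle}: {question}")
```

The point of the structure is the filter: any proposed metric that cannot be tagged to one of the hypotheses is noise and can be dropped from the dashboard.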
Compare key statistics over time
Leaders need to remember that data are often contradictory. For a clearer picture, it can be helpful to gather multiple sets of data and compare information over time. That way, leaders may be both better able to recognize trends and less likely to attribute those trends to a broader narrative that may be based in bias. Indeed, comparative analytics can help leaders reconcile the complexity in their data, paint a more nuanced picture of performance, and help build a change narrative based on clear evidence.
One not-for-profit organization, for instance, took a comparative approach to understand the effectiveness of its leadership trainings. Initial survey data plus anecdotal evidence about the trainings suggested strong engagement in and satisfaction with the workshops. The mechanics of the program itself were working fine—but an important question remained unanswered: What impact was it actually having on the ground?
To find out, the company collected data on individual leaders’ and employees’ behaviors both before and after leaders had completed their leadership trainings. Among the leadership behaviors analyzed were those related to coaching, adaptability, decision making, sponsorship, and motivation. The company gathered data on these and other behaviors from the trainees themselves, as well as their peers, managers, and direct reports. The data revealed improvements in all leadership behaviors, by up to 8 percent. Even more significant was a marked increase in the number of employees rated as top performers in specific categories. Prior to the leadership trainings, 3.5 percent of employees were rated as top performers in self-awareness; after the training, the percentage swelled to nearly 40 percent. Similarly, 35 percent of employees were rated as top performers in adaptability before the leadership training. After, 67 percent achieved a top-performer rating.
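A pre/post comparison like the one above reduces to simple arithmetic. The sketch below computes the percentage-point and relative changes in top-performer rates using the figures reported in the not-for-profit example; the category names and data shape are illustrative assumptions, not the organization's actual dataset:

```python
# Minimal sketch of a pre/post comparative analysis. The figures mirror
# the top-performer rates reported above; the dict structure is an
# assumption made for illustration.

def top_performer_change(pre_rate: float, post_rate: float):
    """Return the percentage-point change and the relative change."""
    point_change = post_rate - pre_rate
    relative_change = (post_rate - pre_rate) / pre_rate
    return point_change, relative_change

# Percent of employees rated top performers before and after training
ratings = {
    "self-awareness": (3.5, 40.0),
    "adaptability": (35.0, 67.0),
}

for category, (pre, post) in ratings.items():
    points, rel = top_performer_change(pre, post)
    print(f"{category}: +{points:.1f} points ({rel:.0%} relative increase)")
```

Reporting both the point change and the relative change matters: a jump from 3.5 to 40 percent is a modest-looking 36.5 points but more than a tenfold relative increase, which is the kind of nuance comparative analytics is meant to surface.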
The numbers confirmed that the leadership program was having an impact, but the data also revealed opportunities to scale up the program and create more opportunities for professional development at all levels. This insight, as well as increased employee capability scores and personal-growth anecdotes, allowed leaders to build support for the program throughout the organization and develop plans for broader rollout.
Give employees the vocabulary they need to work with data
For organizations to generate the most accurate interpretations of data, everyone in the company who is working with the data should share a common language and a grounding in the essentials of data, analytics, and human behavior.
Leaders, data scientists, frontline workers, back-office functional leaders—all can be trained on concepts relating to behavioral and organizational science. For instance, with appropriate upskilling, data scientists in a company could use their understanding of human behavior to inform their analytical approaches. A basic foundation in common cognitive biases, for instance, could help them begin to recognize—and battle—biases in their own interpretations of data.
At the aerospace organization, leaders, data scientists, and learning experts used their new understanding of common terms like “behavioral intervention,” “archetype,” and “role modeling” as a shared anchor for their behavioral-insights plan and related discussions. Empowered team members were better able to interpret and discuss key data findings and were more confident about the important decisions they were being asked to make.
Using behavioral insights to transform an organization isn’t easy. There are many variables and data sets for leaders to capture and monitor, and just as many possible interpretations of those data. But in our experience working with companies trying to get the most from their analytics programs, it’s worth making the effort. The leaders who start now to generate behavioral insights and incorporate end statements, comparative analyses, and upskilling in behavioral and organizational sciences into their analytics programs can improve the odds of successful transformations. More important, they can build analytical capabilities that allow them to get the most from their data now and far into the future.