Fueling utility innovation through analytics


Advanced analytics can deliver enormous value for utilities and drive organizations to new frontiers of efficiency—but only with the right approach. There’s little to be gained from just bolting on a software solution. The real value comes from embedding data analytics as a core capability in the organization and using it to detect pain points, design solutions, and enable decision making. Conservative estimates supported by rigorous use-case analysis suggest that advanced analytics can boost profitability by 5 to 10 percent, while increasing satisfaction for customers and improving health and safety for employees. But capturing impact on this scale is no easy feat, and utilities often struggle with the same few challenges, which undermine the success of an analytics transformation. Below, we look at these challenges and show how they can be overcome.

Challenge 1: Developing an analytics strategy that’s clear about what to prioritize and why

As new applications proliferate across the energy value chain (see sidebar, “Applying advanced analytics at utilities”), advanced analytics poses a strategic challenge. How do utilities prioritize use cases and set appropriate aspirations for business impact? Without clarity on these matters, companies can easily lay themselves open to excessive influence from external vendors or get caught up in chasing the latest viral use case. One US utility partnered with a technology supplier and invested millions in wind-forecasting software only to discover that the effort wouldn’t yield any returns. Another large utility spent years building in-house analytics capabilities and developing more than a dozen use cases before realizing it had yet to make any headway on the biggest and most valuable opportunities.

Utilities can address this challenge in a few practical steps. Start by developing a comprehensive inventory of use cases spanning the whole value chain, including operations and support processes. Utilities often focus on customer or operational applications first, but smart companies place equal emphasis on support functions such as human resources, procurement, safety, and internal audit—all of which can drive just as much bottom-line value.

Structure your use-case inventory into groups of applications that resolve similar pain points or address the same business processes. Applications focused on areas such as asset maintenance, contractor productivity, employee safety, or reporting are likely to span multiple business units and deliver value across the entire enterprise, rather than within a single silo.

Using simple valuation methods, quickly estimate the potential business impact for each application across all applicable dimensions, including cost, revenue, safety, reliability, and employee engagement. This requires close collaboration between business owners, analytics specialists, and the financial planning and analysis team to ensure consistency in quantification and overall approach.

Prioritize the applications using multiple criteria, including value, feasibility, alignment with corporate strategy, and business engagement. How much weight to give each factor depends on the stage a utility has reached in its advanced analytics journey. When it is starting out, business excitement and engagement are critical to achieving buy-in. At later stages, value and feasibility become more important. By the end of the journey, analytics is so critical that priorities are dictated by overall corporate strategy.
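The weighting logic described above can be sketched as a simple scoring model. The criteria, weights, and use cases below are hypothetical, and real prioritization would rest on vetted value estimates rather than 1–5 scores:

```python
# Illustrative multi-criteria scoring for use-case prioritization.
# Criteria weights shift as the analytics journey matures: early on,
# business engagement dominates; later, value and feasibility do.

def prioritize(use_cases, weights):
    """Rank use cases by a weighted score across criteria (scores 1-5)."""
    def score(uc):
        return sum(weights[c] * uc[c] for c in weights)
    return sorted(use_cases, key=score, reverse=True)

# Hypothetical early-stage weights: engagement weighted heavily to build buy-in.
early_weights = {"value": 0.3, "feasibility": 0.2, "strategy_fit": 0.1, "engagement": 0.4}

use_cases = [
    {"name": "Predictive asset maintenance", "value": 5, "feasibility": 3, "strategy_fit": 4, "engagement": 4},
    {"name": "Wind-output forecasting",      "value": 3, "feasibility": 2, "strategy_fit": 3, "engagement": 2},
    {"name": "Contractor productivity",      "value": 4, "feasibility": 4, "strategy_fit": 3, "engagement": 5},
]

for uc in prioritize(use_cases, early_weights):
    print(uc["name"])
```

Re-running the same model with late-stage weights (value and strategy fit dominant) would reorder the list, which is exactly the point: the ranking is a function of where the utility is in its journey.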

Working through these steps need not take long. One utility took just a few weeks to develop a list of nearly 200 use cases, prioritize them based on feasibility and business impact, and select a handful of products to start building immediately. In mature digital organizations, the list of potential applications can be integrated into product strategies and constantly updated and reprioritized against other ideas. Relative newcomers often start with a simple yearly process to evaluate, update, and reprioritize the list.


Do:

  • Build a prioritized roadmap of use cases to pursue, based on vetted value estimates and alignment with the organizational strategy


Don’t:

  • Chase after viral use cases without assessing the value at stake
  • Leave it entirely to business teams and departments to prioritize use cases

Challenge 2: Converting hype into measurable bottom-line impact

Many utilities launch use cases but struggle to capture tangible value. Success requires the coordination of a complex series of steps, and the collective impact is only as good as the weakest link. Common facets of this challenge include:

Not understanding the impact at stake and the process changes needed to capture it. We’ve seen several companies make large investments in analytics projects without a clear business case or a monetization plan. Other utilities have developed promising predictive models but failed to implement the associated process changes needed to foster adoption.

Struggling to get access to data or use all the data available. Many energy companies have observations and maintenance tickets that could yield valuable data for predictive maintenance, safety, and other use cases, yet all too often this data is wasted because digital observation tools and text-mining capabilities are lacking. Another great source of data is utilities’ vast archives of recorded calls from call centers, which can be used—but seldom are—to create insights using voice-mining analytics. External data sets such as social-media and weather data are also commonly overlooked.
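As a minimal sketch of the kind of text mining such tickets enable (the ticket text and stopword list here are invented), even simple keyword counting can surface recurring failure themes worth a predictive model:

```python
# Minimal sketch: surface recurring failure themes from free-text
# maintenance tickets via keyword counts. A real deployment would use
# proper NLP (tokenization, stemming, topic models); tickets are invented.
from collections import Counter
import re

tickets = [
    "Transformer overheating at substation 12, oil leak observed",
    "Recurring oil leak on transformer bank, scheduled inspection",
    "Pole damage after storm, crossarm replaced",
]

stopwords = {"at", "on", "after", "the", "a"}

words = Counter(
    w for t in tickets
    for w in re.findall(r"[a-z]+", t.lower())
    if w not in stopwords
)

# The most common terms hint at systemic issues, e.g. a transformer oil-leak pattern.
print(words.most_common(3))
```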

Being unable to deliver an analytics solution that works well. Adoption often suffers because of a lack of collaboration with business users during the development phase. Too often, organizations rely on senior managers or subject-matter experts from the business, but fail to involve the front-line crew members who will use the tool on a day-to-day basis.


Moving on to the next use case before value has been captured. We’ve seen utilities have success with an initial pilot but then be too quick to redeploy resources and funding before the effort has properly bedded in. The result is lackluster front-line engagement, limited adoption, and forfeited value.

Not having the necessary capabilities and talent. That doesn’t just mean data scientists, but all the roles involved in capturing business value, such as the designers and product managers who act as “translators” between data engineers, data scientists, and the core business. Other roles often overlooked include DevOps experts and the data architects who enable access to clean data. Though expensive, these capabilities can save money in the long run by simplifying data curation and processing needs in the future.

Leaders in analytics avoid these pitfalls by:

Involving actual users in the solution design, not only during pilots but from the planning stage through to implementation. This is the easiest, most cost-effective way to capture valuable feedback, build engagement, and ensure adoption.

Following a business-centric approach that starts with developing a solid understanding of the performance of an entire work flow, such as plant outage management, asset maintenance, or record-to-report in the back office. From this understanding, an organization can identify all the levers available to drive a faster, safer, and more productive way to do business. Rather than focusing only on pain points in the current process, utilities should instead map out a fully reimagined three- to five-year vision for the whole work flow as well as a prioritized set of the technical solutions required to realize the vision.

Building a strong product-management capability that is structured around business processes rather than technical solutions. Product managers have full visibility and ownership of an end-to-end business process, drive the development of the future vision for it, identify which data sources and technology solutions are needed to achieve the vision, and manage rollout and the training of end users.

Developing a set of key performance indicators (KPIs) that measure progress at every stage from model development and testing to user adoption and value capture. This ensures that lessons are learned from experience and errors are quickly corrected.

Developing an inventory of required capabilities by translating planned use cases into a roadmap for talent that includes all the skills—technical and nontechnical—needed to deliver an analytics project. This effort should also include defining a framework for assessing “make or buy” decisions based on technology complexity, use-case criticality, the scale and pace of the use-case rollout, and the utility’s long-term analytics strategy. For example, a utility aspiring to become a leader in renewables may prefer to develop an asset-maintenance solution internally to gain an advantage over competitors that use off-the-shelf products from vendors.


Do:

  • Involve users in solution design from the planning stage onward, not just during pilots
  • Follow a business-centric approach linking analytics solutions to a clear plan for process optimization and monetization
  • Create a standard framework to measure, track, and report on impact until full run-rate value has been captured
  • Allocate time in project schedules for change management and end-user training
  • Build a strong product-management capability with end-to-end responsibility for work processes, not technologies
  • Look for talent beyond data scientists and hire translators, DevOps experts, cloud specialists, and data engineers as well


Don’t:

  • Deliver a solution to end users and then immediately switch all resources to the next project
  • Think analytics talent = data scientists

Challenge 3: Making data enable productivity, not inhibit it

In our experience, analytics organizations often struggle to develop the data-governance and platform practices they need to deliver value. When a clear data strategy is lacking, the data ecosystem will be underdeveloped, making the development of new use cases costly and slow. Key aspects of this challenge include:

Undocumented data sources and multiple sources of truth. Alarmingly, organizational surveys often report that data users don’t believe their company has a clear data-ownership structure or feel confident that data objects are precisely defined or accurate, particularly when it comes to similar objects from different sources. In many cases, organizations have trouble simply establishing whether data exists. It’s also common for the business to have little trust in new data systems, and little comfort in using them.

Unclear access rights and privileges. It’s not unusual for some parts of an organization to limit or block access to data that’s critical for decision making or analytics, often citing data confidentiality or cybersecurity as the reason.

Insufficient tools and capabilities for preparing data for analysis. Many utilities struggle to ingest and curate data efficiently, which increases the time and difficulty of developing new digital tools. When building an analytics product, industry leaders spend no more than 20 to 30 percent of the development time on data cleaning, preparation, and blending—tasks that may take 60 to 80 percent of the time for laggards.

A “build it and they will come” mind-set. Some utilities commit to major projects in building data platforms or data-storage infrastructure in the belief that once data is available, the business will want to use it. But they can end up investing tens of millions of dollars in new systems without any real business benefits to show for it.

Best-in-class data-governance practices allow industry leaders to fast-track value capture by:

Defining a target data structure that is aligned with the organization’s needs, enforces standardization, and serves as a catalog providing a sound basis for key use cases. To manage this data structure, organizations need to align on clear data-ownership and governance policies—who has access to what data—that are respected and enforced across the organization.
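To make the idea concrete, a catalog entry that pairs each data object with a defined owner, a single agreed definition, and explicit access rules might look like the following sketch (the field names, roles, and values are illustrative, not a prescribed schema):

```python
# Illustrative data-catalog entry: one well-defined record per data
# object, with explicit ownership and access rules. Fields are invented.
from dataclasses import dataclass, field

@dataclass
class CatalogEntry:
    name: str                     # standardized object name
    owner: str                    # accountable business owner
    definition: str               # single agreed definition
    source_system: str            # authoritative source of truth
    allowed_roles: set = field(default_factory=set)  # who may access it

    def can_access(self, role: str) -> bool:
        # Access decisions become explicit and auditable, rather than
        # ad hoc refusals citing confidentiality or cybersecurity.
        return role in self.allowed_roles

entry = CatalogEntry(
    name="asset_outage_events",
    owner="Grid Operations",
    definition="Unplanned outage records per asset, deduplicated daily",
    source_system="OMS",
    allowed_roles={"analytics", "reliability_engineering"},
)

print(entry.can_access("analytics"))
```

The value of the catalog is less in the data structure itself than in what it forces the organization to agree on: one name, one owner, one definition, and one source of truth per object.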

Using an agile approach to build the data platform and defining a minimum viable product (MVP) that delivers just enough functionality to allow the first few products to be developed. An MVP often relies on quickly deployable open-source technologies, easily obtained data, and tools such as Tableau and Alteryx that speed up the production of a proof of concept. More advanced or specialized components and data are often added iteratively in future releases.


Do:

  • Define organization-wide processes and rules for data curation, documentation, sharing, and ownership
  • Develop and maintain a central catalog of data
  • Develop a data strategy that supports the wider analytics strategy


Don’t:

  • Have unclear data rights and governance
  • Skimp on investments in data-cleansing, processing, and visualization capabilities
  • Invest heavily in collecting and cleaning data before starting to develop individual use cases

Challenge 4: Embedding analytics transformation in your culture and organization

A successful advanced analytics transformation depends on the right culture and organization. That means cross-functional teams working through short, iterative, test-and-learn cycles—an unfamiliar prospect for utilities accustomed to long development timelines. Navigating this transition will involve:

Creating an environment conducive to experimentation and learning while taking care not to jeopardize strategic pillars such as reliability and customer satisfaction. Adopting agile practices and launching short sprints to test new ideas in the real world (rather than debating them in theory) can often feel uncomfortable at first—but falling back on rigid waterfall processes will result in endless planning iterations, blown deadlines, persisting pain points, and a failure to create value.

Working out what kind of structure will best support the analytics transformation: centralized, decentralized, or hybrid. Many utilities are too decentralized, leaving them unable to reap the benefits of standardization and best-practice sharing. All too often, a utility deploys different solutions from different vendors to build what is essentially the same product in different business units. Lessons learned aren’t shared, and scale benefits aren’t captured. On the other hand, a fully centralized model is seldom the answer. We’ve seen utilities where the analytics organization and the business work at arm’s length, at the cost of misaligned priorities and—worse—the development of products that disappoint the end user.

Securing senior management commitment and appointing the right leader to act as a bridge between the CEO, the analytics team, and other parts of the organization. In some utilities, senior executives in charge of analytics are hidden three or four levels down in the organization, leaving them powerless to remove any roadblocks that arise. In other companies, their responsibilities are too broad and unfocused.

In our experience, most analytics leaders have a good grasp of the organizational model best suited to their company. This enables them to:

Establish the right culture, starting with top executives who are curious to explore new analytics solutions, have a bias to action, and strike a good balance between delegation and control. This starts at the top, with the CEO and senior team emphasizing the importance of analytics, providing the right incentives, and role modeling desired behavior. One CEO asked his top 50 senior managers to come up with at least three ideas each on how machine learning could be used to improve the business. In doing so, he not only created a vast array of use cases in a short time but also role modeled the intellectual curiosity and bias toward business impact that he expected from his leaders.

Shape a digital and analytics organization that fits the company’s governance model, maturity, and potential for standardization and best-practice sharing. This includes ensuring that the executive driving the analytics transformation has direct lines of communication to the CEO, even if the role doesn’t always report there. Most analytics teams adopt a hybrid model, with data governance, tools, and standards defined centrally; a close-knit community of data scientists working both centrally and within the business; and clear roles for product owners, who form cross-functional teams to drive the day-to-day execution of use cases and have direct ties to business executives.


Do:

  • Align your analytics organizational structure with your overall business strategy, governance model, and level of maturity
  • Develop a structure that ensures best practices are shared enterprise-wide yet enables business units to be closely involved in solution development
  • Embed critical analytics capabilities across the whole organization, not just in an analytics center of excellence


Don’t:

  • Limit your analytics transformation to new software development and overlook culture and project management
  • Expect the analytics teams to develop their own mandate for driving change across the organization

Advanced analytics is transforming industries worldwide and enabling organizations to achieve unprecedented levels of productivity. For utilities, which lag other industries in digital maturity, the value at stake from such a transformation is substantial. However, making the leap is far from easy, and many utilities place big bets only to fall short of their objectives. By adopting best practices and defining the strategy, culture, and organization they need to achieve their analytics aspirations, utilities can maximize their odds of capturing the step-change improvements that industry leaders already enjoy.
