Now that essential product functions have become increasingly commoditized, product design has emerged as a crucial source of differentiation. But the best companies have already captured most of the obvious design-based advantages.
The next level of product optimization therefore not only combines the latest design thinking with multiple sources of data but also exploits sophisticated advanced-analytics methodologies to generate insights about potential cost and value improvements. For example, computer-aided design tools linked to vast pools of procurement data, social-media activity, and cost and complexity benchmarks can allow a company to quickly identify designs that maximize profitability while minimizing wasted time and effort.
Such breakthroughs are not just for the consumer sector. One of the world’s largest industrial conglomerates brings these ideas to life with products meant not for individuals but for utilities—whose traditional business model has been upended by renewable (and increasingly customer-generated) energy sources and more sophisticated consumers. The conglomerate’s improvement target: within four years, cut delivery lead times by more than half, defend and increase market share, and raise profit margins by about 30 percent.
In utilities, as in much of today’s business world, decades of acquisitions have left many companies managing dozens of systems—especially IT systems—that never get fully integrated. Meanwhile, product proliferation is a constant battle as small variants in specifications generate hundreds of mostly overlapping SKUs.
Standard methodologies for combating this complexity not only take vast amounts of time and effort but also may not even identify the right changes. Yet with new digital analytics tools, the conglomerate completed an analysis, in just two weeks instead of several months, that identified specific commonalities the company could use to reduce variations among product families, subsystems, and components.
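The article does not describe the conglomerate's actual tooling, but the core of such a commonality analysis can be sketched simply: score how much any two product variants overlap in their specifications, and flag high-overlap pairs as consolidation candidates. The product names, spec fields, and similarity threshold below are all hypothetical.

```python
# Illustrative sketch only; variants, specs, and threshold are made up.
from itertools import combinations

def spec_overlap(a: dict, b: dict) -> float:
    """Fraction of shared spec key-value pairs (Jaccard similarity)."""
    items_a, items_b = set(a.items()), set(b.items())
    return len(items_a & items_b) / len(items_a | items_b)

variants = {
    "PUMP-100": {"flow_lpm": 40, "voltage": 230, "housing": "steel"},
    "PUMP-101": {"flow_lpm": 40, "voltage": 230, "housing": "alloy"},
    "PUMP-200": {"flow_lpm": 90, "voltage": 400, "housing": "steel"},
}

# Pairs that share most of their specs are candidates for merging
# into a single variant (or a shared subsystem).
candidates = [
    (p, q)
    for (p, a), (q, b) in combinations(variants.items(), 2)
    if spec_overlap(a, b) >= 0.5
]
print(candidates)  # [('PUMP-100', 'PUMP-101')] -- they differ only in housing
```

Run over a full portfolio, the same pairwise scoring surfaces the commonalities across product families, subsystems, and components in hours rather than months.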
Analytics and automation
Analytics has made procurement a much more promising target for savings by tapping a previously impractical data source: the procurement and engineering departments' own bills of materials. New tools can upload thousands of records, held around the world in dozens of local languages and part-numbering structures, to find potential commonalities and opportunities to negotiate better pricing. An early step toward artificial intelligence, robotic process automation, can then allow software "robots" to take over tedious processes, such as collating information from disparate systems for complex forms, and thus free people to focus on work that uses their judgment and experience. Finally, combining multiple data streams (such as actual spending, product cost structures, and sales) into a data "lake" allows sophisticated algorithms to optimize dynamically, adjusting constantly as conditions change.
Powerful portfolio analysis
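The first step in mining bills of materials is usually mundane: reconciling the part-numbering conventions of dozens of regional systems so that the same part can be recognized, and its spend pooled, across sites. A minimal sketch, assuming hypothetical record fields and a simple normalization rule, might look like:

```python
# Hedged sketch: no specific tool is named in the article; the fields and
# normalization rule here are illustrative assumptions.
import re
from collections import defaultdict

def normalize(part_number: str) -> str:
    """Collapse local part-numbering conventions into a comparable key."""
    return re.sub(r"[^A-Z0-9]", "", part_number.upper())

# Bill-of-materials lines as they might arrive from regional systems.
bom_records = [
    {"part": "ab-1023/x", "site": "DE", "spend": 120_000},
    {"part": "AB 1023 X", "site": "BR", "spend": 45_000},
    {"part": "ZX-77",     "site": "CN", "spend": 80_000},
]

# Grouping by normalized key reveals that two sites buy the same part
# separately; the pooled volume is leverage for better pricing.
spend_by_part = defaultdict(int)
for rec in bom_records:
    spend_by_part[normalize(rec["part"])] += rec["spend"]

print(dict(spend_by_part))  # {'AB1023X': 165000, 'ZX77': 80000}
```

In practice the matching also has to handle translated descriptions and near-duplicate specifications, but the principle is the same: one comparable key per part, then aggregate.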
Together, techniques such as these can generate a much more detailed analysis of an entire product portfolio. The conglomerate found that, with slight modifications, it could eliminate 15 to 80 percent of product variants within a category. Already, costs have improved by approximately 30 percent.
This article is adapted from “Ops 4.0: Fueling the next 20 percent productivity rise with digital analytics.”