Advanced tech elevates process modeling in batch-processing industries


Pharmaceutical, consumer health, and consumer packaged goods companies have worked for decades to achieve an expert-level understanding of their end-to-end manufacturing processes using various modeling techniques. Traditionally, process modeling has relied almost exclusively on human experience and expertise. Now, with the help of newer technologies such as machine learning, predictive modeling, and faster data processing, process manufacturers can build more advanced models that go further to optimize processes even before producing a batch, forestalling problems likely to occur during batch production.

Companies can leverage these models to improve performance and quality—even beyond what regulators may require—and save time and resources. Leading companies that use this tech-enabled approach to process modeling have seen a reduction in deviations (for example, products outside specifications) of more than 30 percent and have cut the overall cost of quality by 10 to 15 percent.

In addition to these tangible benefits, advanced process modeling deepens manufacturers’ understanding of their production processes. With these insights, plant operators can adjust critical process parameters (CPPs) such as temperature or pH and improve decision making during production. In addition, the quality and R&D functions can upgrade product and process parameters during the next R&D cycle.

This article describes advanced process modeling, introduces three use cases for it, and offers guidelines to boost the likelihood of success with its use.

Upgrading process modeling

Process modeling is a manufacturing discipline for understanding the effects of process inputs (raw materials) and CPPs on final products. The more adept manufacturers are at process modeling, the better able they are to monitor and control the CPPs that determine a product’s critical quality attributes (CQAs). CQAs are the physical, chemical, biological, or microbiological properties or characteristics that should be within an appropriate limit, range, or distribution to ensure the desired product quality. In the case of shampoo, for example, pH balance is a CQA. In pharma, nearly all CQAs—including, to name just two, concentrations of active principles and excipients—are highly regulated.


Process modeling informs the control strategy, which defines what, when, and where to test to ensure that the process performs as it should and maintains the quality of the product. Controls may include, for example, testing the chemical composition of a product and checking the technical parameters at every machine in the process.

An effective process control strategy is particularly important when the product being made does not have a well-characterized chemical structure or its quality attributes are difficult to measure due to limitations in testing or detectability. This is generally the case with, for example, biologics, which are manufactured in plant cells, animal cells, or other living systems. Because finished biologics cannot usually be tested to confirm all their components and characteristics, the companies that produce them strive to maintain uniformity in their manufacturing processes to ensure product consistency, quality, and purity.

Companies rely heavily on statistical models—mathematical expressions of the relationships between process parameters and obtained results—for process modeling. Traditionally, these models were based on scientific first principles (for example, chemical and physical equations) and/or experimentation and then refined through statistical analysis of historical performance.

New types of models can go further to provide a deeper understanding of the statistical relationships between raw materials and CPPs, on the one hand, and product CQAs on the other. They can help to identify the truly important drivers of the process while isolating any noise or disturbances. They also reduce the need for experimentation when setting up processes and decrease the number of testing points.
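
To make this concrete, the sketch below shows one way such a model might be built in Python: fit a machine learning model on historical batch data, then rank the candidate drivers of a CQA by importance. The file name, column names, and choice of a random forest are illustrative assumptions, not a prescribed method.

```python
# A minimal sketch: fit a model on historical batch data relating raw-material
# attributes and CPPs to a CQA, then rank the candidate drivers. The file name,
# column names, and model choice are hypothetical.
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

batches = pd.read_csv("historical_batches.csv")  # one row per completed batch

features = ["api_particle_size_um", "binder_moisture_pct",
            "granulation_temp_c", "mixing_speed_rpm"]  # raw materials + CPPs
target = "dissolution_pct_30min"                       # the CQA to predict

X_train, X_test, y_train, y_test = train_test_split(
    batches[features], batches[target], test_size=0.2, random_state=0
)

model = RandomForestRegressor(n_estimators=300, random_state=0)
model.fit(X_train, y_train)
print(f"Hold-out R^2: {model.score(X_test, y_test):.2f}")

# Low-importance features are mostly noise; high-importance ones are the
# drivers worth monitoring and controlling.
drivers = pd.Series(model.feature_importances_, index=features)
print(drivers.sort_values(ascending=False))
```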

Unlocking significant value using new models

Newer, more advanced process models can be used in three ways: on a one-off basis, for set-point modifications, and for online or continuous use (Exhibit 1).

Exhibit 1: Product mastery models drive impact through specific use cases

One-off use

One-off uses of advanced process models are well suited to situations in which a significant process improvement is required and the model can identify it. The model can also simulate potential process adjustments and identify those that will deliver the desired result—an important benefit if costly equipment or infrastructure modifications would be needed. For example, suppose a large continuous-manufacturing site wants to increase the set temperature in its reactor to reduce impurities. Rather than running a series of potentially time-consuming and costly experiments on the line, it can first execute the change within the model to predict the outcome.
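
In skeleton form, such a "what if" run amounts to scoring the current and the candidate parameter settings with a fitted model and comparing the predictions. The sketch below assumes a regressor already fitted on historical batches; the parameter names and `impurity_model` are hypothetical.

```python
# A minimal sketch of a one-off "what if" run. `model` is assumed to be a
# regressor already fitted on historical batches, predicting impurity
# concentration from process parameters; all names are hypothetical.
import pandas as pd
from sklearn.base import RegressorMixin

def simulate_setpoint_change(model: RegressorMixin,
                             current: dict, change: dict) -> float:
    """Return the predicted change in the output if `change` is applied."""
    baseline = model.predict(pd.DataFrame([current]))[0]
    adjusted = model.predict(pd.DataFrame([{**current, **change}]))[0]
    return adjusted - baseline

# Example: predict the effect of raising reactor temperature before touching
# the line (hypothetical parameters and fitted `impurity_model`):
# delta = simulate_setpoint_change(
#     impurity_model,
#     current={"reactor_temp_c": 72.0, "feed_rate_kg_h": 140.0},
#     change={"reactor_temp_c": 75.0},
# )
```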

Set-point modification

Use of models for set-point modification tackles the constant variability of inputs from batch to batch. Among the many sources of variability are uneven quality of raw materials, changing environmental conditions, and differences in proficiency among machine operators. Effectively responding to this variability is critical to maintaining consistent results and limiting deviations.

Prior to starting a commercial batch, the manufacturer can run the model with a set of inputs, including, for example, raw-material quality characteristics, technical parameters of the process, and room temperature (Exhibit 2). The model will then suggest adjustments to CPPs that effectively account for variability.

Exhibit 2: Machine learning models identify the biggest drivers of variability in quality test results and simulate the effects of adjustments

Consider a pharma manufacturer that makes tablets and sources the validated active pharmaceutical ingredient (API) from two different vendors. Although both APIs are within specifications, differences between them affect the final product—for example, in rates of tablet breakage. For each batch, the process engineer would run the model using the specific quality attributes of the API being used as inputs; the model would then recommend the parameter settings for an optimal batch (in this example, a batch with no tablet breakage).
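
One simple way such a recommendation could work is a search over the adjustable CPP ranges, picking the combination with the best predicted outcome for the incoming lot. The sketch below assumes a fitted model and hypothetical attribute names.

```python
# A minimal sketch of set-point recommendation: search the adjustable CPP
# ranges and return the combination with the lowest predicted breakage for
# this batch's raw-material attributes. All names are hypothetical.
import itertools
import pandas as pd

def recommend_setpoints(model, material_attrs: dict, cpp_grid: dict) -> dict:
    best_cpps, best_pred = None, float("inf")
    for combo in itertools.product(*cpp_grid.values()):
        cpps = dict(zip(cpp_grid.keys(), combo))
        pred = model.predict(pd.DataFrame([{**material_attrs, **cpps}]))[0]
        if pred < best_pred:
            best_cpps, best_pred = cpps, pred
    return best_cpps

# Example, with `breakage_model` fitted on historical batches:
# recommend_setpoints(
#     breakage_model,
#     material_attrs={"api_particle_size_um": 38.0, "api_moisture_pct": 1.2},
#     cpp_grid={"compression_force_kn": [8, 10, 12],
#               "press_speed_rpm": [30, 40, 50]},
# )
```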

Set-point modeling also can be used for root-cause analysis of problems, where it can shorten investigation times and identify effective corrective or preventive actions. During an investigation, the model could quickly simulate possible scenarios, reducing the need to run physical experiments to test various hypotheses. Additionally, the model could be used to unearth relationships that lie at the root of repeated problems and could provide solutions that would otherwise be difficult to identify.

Online modeling

An online model continuously receives process inputs and predicts outputs; thus, it can foresee a problem and alert operators before the problem occurs. The more advanced of these models can not only predict problems but also prescribe corrective actions. Online models have been shown to increase the frequency with which the manufacturer gets a batch right the first time and to reduce the number of deviations. For example, in a biomanufacturing facility, the optimal end point of a reaction—the point at which an indicator shows that the proper amount of reactant has been added to a solution—is difficult to determine and varies from batch to batch. An online model could continuously monitor the process and predict the best moment to end it, alerting the operator and improving the yield.
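
At its core, an online model of this kind is a scoring loop over live sensor data. The sketch below assumes a fitted classifier that predicts whether the end point has been reached, plus plant-specific `read_sensors` and `alert` functions; all of these are hypothetical.

```python
# A minimal sketch of an online model: score live sensor data on a fixed
# cadence and alert the operator when the predicted probability that the
# end point has been reached crosses a threshold. `model`, `read_sensors`,
# and `alert` are plant-specific and hypothetical here.
import time
import pandas as pd

END_POINT_THRESHOLD = 0.95

def monitor(model, read_sensors, alert, poll_seconds: float = 30.0) -> None:
    while True:
        snapshot = read_sensors()  # dict of current sensor readings
        prob = model.predict_proba(pd.DataFrame([snapshot]))[0, 1]
        if prob >= END_POINT_THRESHOLD:
            alert(f"Predicted end point reached (p={prob:.2f}); stop addition.")
            break
        time.sleep(poll_seconds)
```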

Having a better grasp of the process could also ultimately lead to fewer quality tests—both in-process and in the lab. If a quality attribute can be predicted accurately, there is a case for controlling only the parameters that affect it rather than testing the attribute itself.

A clear application of this logic is the “soft sensor,” which does not directly measure a quality attribute but instead uses a model to calculate it from other measurable parameters (Exhibit 3). This is particularly helpful when direct measurement is not feasible or when the process would otherwise have to be interrupted frequently to take samples.

Exhibit 3: Soft sensors enable in-process optimization, increasing throughput by up to 15 percent
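
A soft sensor can be as simple as a regression trained on historical runs in which the attribute was measured in the lab, then applied in-process. The data source, variable names, and ridge-regression choice below are illustrative assumptions.

```python
# A minimal sketch of a soft sensor: train a regression on historical runs
# where the attribute was measured in the lab, then infer it in-process from
# parameters that are cheap to measure. File and column names are hypothetical.
import pandas as pd
from sklearn.linear_model import Ridge
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

history = pd.read_csv("historical_runs.csv")
inputs = ["temperature_c", "ph", "stirring_rpm", "conductivity_ms_cm"]

soft_sensor = make_pipeline(StandardScaler(), Ridge(alpha=1.0))
soft_sensor.fit(history[inputs], history["protein_concentration_g_l"])

# In production: estimate the attribute without interrupting the process.
live = pd.DataFrame([{"temperature_c": 36.8, "ph": 7.1,
                      "stirring_rpm": 220, "conductivity_ms_cm": 14.2}])
estimated_concentration = soft_sensor.predict(live)[0]
```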

Prerequisites for advanced process modeling

By paying careful attention to several dimensions of advanced process modeling, manufacturers can bolster their chances of success. Essential dimensions to consider include an understanding of the process, the availability of relevant data, and a paradigm shift in ways of working.

Process understanding

Although models are intended to explain a process, they are by no means substitutes for process understanding. Knowledge acquired during R&D and through process runs serves as the starting point for model development.

Likewise, experts with deep process understanding are essential to confirm the results offered by the model. During the model’s development, process engineers can check to ensure all expected relationships among parameters and attributes have been identified. They also can validate other, previously unknown relationships. Later, during the continuous use of the model, knowledgeable operators can accept or reject prescriptive recommendations.

It is essential for organizations to build this process understanding from development through implementation. They can do so by involving site process and quality engineers early in the development of the model and then keeping development engineers in touch with operations once the model is running.

Availability of data

Model development and testing require high-quality data, particularly for machine learning models. In addition to data about process parameters, the model may incorporate data from numerous other enterprise systems, including manufacturing execution systems (MES), supervisory control and data acquisition (SCADA), enterprise resource planning (ERP), laboratory information management systems (LIMS), and information technology/operational technology (IT/OT) systems.

Preparing data for use in modeling can be extremely time-consuming, particularly if data sources are offline. Data will have to be stored in an accessible location, whether for model development or for continuous input. Additionally, the data used to train the model will need to be structured and cleaned in advance. This effort can take weeks to several months, depending on the state of the data, and is usually completed by a data engineer.
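
In practice, much of this structuring work resembles the sketch below: joining per-batch records from several systems, harmonizing units, and dropping unusable rows. The file, column, and system names are hypothetical.

```python
# A minimal sketch of the structuring step: join per-batch records from two
# systems, harmonize units, drop unusable rows, and store the result where
# the model can reach it. File and column names are hypothetical.
import pandas as pd

mes = pd.read_csv("mes_batches.csv", parse_dates=["start_time"])
lims = pd.read_csv("lims_results.csv")

training_set = (
    mes.merge(lims, on="batch_id", how="inner")            # keep batches with lab results
       .assign(temp_c=lambda d: (d["temp_f"] - 32) / 1.8)  # harmonize units
       .drop(columns=["temp_f"])
       .dropna(subset=["yield_pct"])                       # target must be present
       .drop_duplicates(subset=["batch_id"])
)
training_set.to_parquet("model_training_set.parquet")
```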

The types and update frequency of the data will define the requirements for the IT/OT infrastructure that supports the model. The ability to extract data automatically from disparate sources will significantly reduce model development time but can substantially increase infrastructure costs, so companies will need to evaluate the trade-offs carefully. If a model is to be used regularly, it must keep pace with the product life cycle, including process updates, so source data must be updated regularly as well.

Paradigm shift in ways of working

Building digital capabilities will be essential to creating and maintaining advanced process models. Not only will the manufacturing function need modeling capabilities, but end users will have to be trained in how models work, when to use them, what limitations they have, and how to use them day to day. Additionally, IT departments will need to evolve into true partners with a more active role in process improvement.

Leaders will play a vital role by providing the necessary opportunities for capability development, enabling changes in ways of working, and participating directly, including by strategically deciding where process models are needed and to what end. Rigorously tracking the impact of modeling efforts will help them set priorities and target investments.

The company, especially the quality organization, will need to shift its mindset with respect to process control. At a minimum, the objective of the control strategy becomes ensuring predicted quality. Getting to this point will necessitate changes to quality strategies such as validation and in-process control. It is therefore crucial that the quality organization play a major role in developing process models so that it can adapt accordingly and properly explain the new models and related tools to regulators (Exhibit 4).

Exhibit 4: The US FDA classifies process models based on impact and applies different requirements to each type

Using tech-enabled modeling in daily operations can help reduce deviations, ensure consistent quality, optimize outputs, and take process understanding to the next level. But it also requires that companies build in-house expertise, finding the right balance between off-the-shelf commercial software and tailored solutions. Understanding the technology is crucial for building trust in the model’s recommendations, which in turn will increase the users’ understanding of the manufacturing process. Only then can manufacturers unlock the full potential of advanced process modeling.
