PharmiWeb.com - Global Pharma News & Resources

The Urgency for Innovation in Process Development for Life Sciences

Summary

The challenges of manufacturing and supply chains, the shortages of the most essential drugs highlighted during the pandemic, and the approved substances that we are not able to bring to market fast enough make it abundantly clear that we as an industry are inadequately prepared to meet these challenges if limited by the current tools and methods used in product and process development.
Editor: Quartic.ai. Last Updated: 05-May-2023

If we look at patient needs, the challenges of manufacturing and supply chains, the shortages of the most essential drugs highlighted during the pandemic, and the approved substances that we are not able to bring to market fast enough, it is abundantly clear that we as an industry are inadequately prepared to meet these challenges if we remain limited by the current tools and methods used in product and process development (PD).

Those needs are addressed by the tools, technologies, software, and accompanying new work processes that we collectively refer to as the “digital transformation” of PD. The underlying technologies are data, AI and machine learning, and the modern statistical methods and software used to build applications specific to PD.

To do this successfully, the data, the process and chemistry knowledge, and the control strategy must accompany the molecule on its journey to the patient. In practice, that means capturing knowledge from PD and carrying it through every successive stage, all the way to final product release, using data and models built with machine learning, statistical methods, and sometimes mechanistic models.

This is one way of describing a digital twin (or rather a digital thread).
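To make the idea concrete, here is a minimal sketch in Python of how such a digital thread might be represented. The class names, fields, and stages are illustrative assumptions, not any vendor's actual data model.

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, List, Optional, Tuple

# Illustrative sketch only: names and fields are hypothetical, not a product API.

@dataclass
class StageRecord:
    """Knowledge captured at one stage of the molecule's journey."""
    stage: str  # e.g. "process development", "tech transfer", "commercial"
    critical_process_parameters: Dict[str, Tuple[float, float]]  # parameter -> proven (low, high) range
    critical_quality_attributes: Dict[str, float]                # attribute -> specification limit
    model: Optional[Callable[[Dict[str, float]], Dict[str, float]]] = None  # ML, statistical, or mechanistic model

@dataclass
class DigitalThread:
    """Chains stage records so data, models, and the control strategy travel with the molecule."""
    molecule: str
    stages: List[StageRecord] = field(default_factory=list)

    def hand_off(self, record: StageRecord) -> None:
        # Each tech transfer appends to, rather than replaces, the accumulated knowledge.
        self.stages.append(record)

    def current_control_strategy(self) -> Dict[str, Tuple[float, float]]:
        # The most recent stage's proven parameter ranges define the current control space.
        return self.stages[-1].critical_process_parameters if self.stages else {}
```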

ICH Q12 and Quality by Design (QbD) are well understood. It has been some time since they were introduced, yet surprisingly few life sciences manufacturers are applying them. If QbD is acknowledged to be the most robust way to manage the efficacy and safety of a substance, why isn’t it applied ubiquitously?

It’s because we cannot effectively do so with the existing systems and ways of managing data. But we can with the modern data and AI applications available today, and they’re accepted by the regulators.

Understandably, there is a lot of focus on the lack of manufacturing capacity and the need to bring more online quickly, preferably close to patient populations. But PD capacity is perhaps the bigger challenge. When it comes to bringing new products to market, PD is the biggest bottleneck to commercial manufacturing. If you are a PD practitioner reading this article, or if you talk to a colleague who is one, you know the tremendous pressure PD teams are under.

And PD practices apply not only to new product release, but also to the investigative work required for process design changes and improvements.

With these new applications and workflows, we can not only increase PD capacity, but also carry out more efficient tech transfers. With the increasing use of CMOs and CDMOs, if we can make PD output digitally accessible, we can have a more objective, transparent view of capacity.

When we look at the adoption of AI and machine learning, there are still challenges and some inertia in deploying these technologies in production and manufacturing, some real and others perceived. Much of this stems from the perception that having algorithms change set-points is unacceptable (hint: it shouldn’t be).

PD is the perfect place in life sciences to deploy these technologies. Why? PD consists of experimentation, DoE (Design of Experiments), and discovering the design space and control space of critical process parameters through those experiments. By adopting AI at this stage, we not only accelerate PD, we also build trust in these models for later stages of manufacturing, because the models have been tested and validated. Deviation investigations during manufacturing that use these models become more efficient and trustworthy, with continuous reference not just to the control space but also to the design space. Now we can inherently implement QbD.
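As a concrete illustration of that workflow, here is a minimal sketch in Python of analysing a small full-factorial DoE with a quadratic response-surface model to map which operating points satisfy a quality target. The parameters, measurements, and specification limit are invented for illustration, and numpy and scikit-learn are assumed to be available.

```python
import itertools
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression

# Hypothetical example: two critical process parameters (temperature, pH) and one
# critical quality attribute (purity). All values are invented for illustration.
temps = [30.0, 35.0, 40.0]   # degrees C
phs = [6.8, 7.0, 7.2]

# Full-factorial DoE: every combination of the parameter levels.
design = np.array(list(itertools.product(temps, phs)))

# Measured CQA for each run (in practice, these come from lab experiments).
purity = np.array([91.2, 93.5, 92.1,
                   94.0, 96.3, 95.1,
                   92.8, 94.9, 93.4])

# Fit a quadratic response-surface model to the DoE results.
poly = PolynomialFeatures(degree=2, include_bias=False)
model = LinearRegression().fit(poly.fit_transform(design), purity)

# Map the design space: which parameter combinations are predicted to meet the
# purity specification (assumed here to be >= 95%)?
grid = np.array(list(itertools.product(np.linspace(30, 40, 21),
                                       np.linspace(6.8, 7.2, 21))))
predicted = model.predict(poly.transform(grid))
design_space = grid[predicted >= 95.0]
print(f"{len(design_space)} of {len(grid)} candidate operating points meet the CQA target")
```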

While PD is a natural fit for the use and adoption of AI, there is a big paradox and challenge: large amounts of data are never available at this stage, yet almost all machine learning approaches are founded on big data. But scientific knowledge of the molecule and the prior knowledge of the SME (subject matter expert) do exist at this stage. Small-data algorithms and optimization-focused (rather than prediction-focused) algorithms can overcome this challenge, and they are commercially available and in use by early adopters.
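One widely used small-data, optimization-focused approach is Bayesian optimization, in which a Gaussian process surrogate trained on a handful of runs proposes the next most informative experiment. The sketch below, using scikit-learn and scipy, is illustrative only; the objective function stands in for a real lab experiment.

```python
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

# Stand-in for a real experiment: in PD this would be a lab run whose response
# (e.g. titre or yield) we want to maximize. Invented for illustration only.
def run_experiment(temperature_c: float) -> float:
    return -((temperature_c - 36.5) ** 2) / 10.0 + 95.0

# A handful of completed runs -- the "small data" typical of early PD.
X = np.array([[30.0], [33.0], [40.0]])
y = np.array([run_experiment(x[0]) for x in X])

candidates = np.linspace(28.0, 42.0, 200).reshape(-1, 1)

# Optimization-focused loop: a Gaussian process surrogate plus an
# expected-improvement acquisition picks the next run to perform.
for _ in range(5):
    gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), alpha=1e-6,
                                  normalize_y=True).fit(X, y)
    mu, sigma = gp.predict(candidates, return_std=True)
    best = y.max()
    improvement = mu - best
    z = np.divide(improvement, sigma, out=np.zeros_like(sigma), where=sigma > 0)
    expected_improvement = improvement * norm.cdf(z) + sigma * norm.pdf(z)

    next_x = candidates[np.argmax(expected_improvement)]
    X = np.vstack([X, next_x])
    y = np.append(y, run_experiment(next_x[0]))

print(f"Best observed response {y.max():.2f} at {X[np.argmax(y)][0]:.2f} C after {len(y)} runs")
```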

With even minimal data-generation capability in lab equipment, combined with these small-data algorithms, PD can be highly automated. Just as batch campaigns are run during manufacturing, these applications can manage PD campaigns, accelerating product development and tech transfer and supporting QbD.