
Tackling smart quality monitoring in life sciences manufacturing in real time

Summary

As drug and medical device production ramped up to respond to the pandemic, manufacturers faced the challenge of more effectively monitoring, analysing, reporting and acting on quality performance. Siniša Belina, senior life sciences consultant at Amplexor, reveals the latest best practice.
Editor: PharmiWeb Editor | Last Updated: 30-Nov-2021

Across a whole swathe of industries, the Covid-19 pandemic has shone a light on restrictive business processes, information silos and poor supply-chain visibility. In life sciences manufacturing, for instance, a range of challenges linked to quality management have been exposed and starkly felt.

Public safety measures over the last 18+ months have put physical distance between team members, hampering the usual form-filling, manual sign-offs and Excel-based record-keeping associated with monitoring traditional manufacturing processes. Meanwhile, informal discussions at the watercooler, in which patterns of emerging problems might have surfaced, have simply not happened.

These increased practical barriers to quality assurance, combined with missed opportunities to spot and pre-empt issues along the supply chain using data analytics, have helped drive a renewed business case for intelligent, joined-up quality monitoring based on a single, global, real-time graphical view of all aspects of production.

In the meantime, other parts of pharma organizations have seen first-hand the benefit of pre-emptive signal detection. This is most visible in pharmacovigilance, where the use of smart systems presents a department’s best chance of accurately processing reams of incoming adverse event data and meeting deadlines, with the confidence that nothing critical will be missed.

Proactively monitoring and establishing alerts for potential manufacturing quality issues/product deviations, or process non-conformance, would be another logical use case for the same kind of software solution.

Spotting emerging patterns very early on

The case for harnessing smart, real-time quality analytics is strong and growing. Particularly where artificial intelligence/machine learning is involved, this is about being able to spot emerging patterns very early on – at the first sign of deviation/non-conformance. Issues might range from recurring problems with equipment to varying impurity levels/product instability whose cause needs further investigation.
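
To make this concrete, the sketch below shows one simple way such early-warning logic can work: rolling control limits calculated over recent batch results flag an impurity value the moment it drifts outside the expected range. It is an illustrative sketch only; the data source, column names and three-sigma limits are assumptions rather than a description of any particular vendor's system.

```python
# Illustrative early-warning check on batch impurity results.
# The CSV source, column names and three-sigma limits are assumptions.
import pandas as pd

def flag_early_deviations(batches: pd.DataFrame,
                          window: int = 30,
                          sigma: float = 3.0) -> pd.DataFrame:
    """Flag batches whose impurity level drifts outside rolling control limits."""
    rolling = batches["impurity_pct"].rolling(window, min_periods=10)
    mean, std = rolling.mean(), rolling.std()
    batches = batches.assign(upper_limit=mean + sigma * std,
                             lower_limit=mean - sigma * std)
    batches["flagged"] = ((batches["impurity_pct"] > batches["upper_limit"]) |
                          (batches["impurity_pct"] < batches["lower_limit"]))
    return batches

if __name__ == "__main__":
    data = pd.read_csv("batch_quality_results.csv")   # assumed export from a LIMS/QMS
    flagged = flag_early_deviations(data)
    print(flagged.loc[flagged["flagged"], ["batch_id", "impurity_pct", "upper_limit"]])
```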

Up to now, the tendency has been to view quality monitoring as a compliance activity, linked to the regulatory requirement for a periodic Product Quality Review. Yet this approach doesn’t naturally invite continuous, real-time quality monitoring, nor the chance to stave off production-line issues before avoidable risks and costs are incurred. If issues do surface during preparations for a review, they are likely to be well established by that point, requiring retrospective investigation to determine what has gone wrong, the likely root cause, what the impact may have been, and what remedial action is required.

This is a wasted opportunity, especially as much of the data to support more continuous and timely quality tracking is being gathered anyway - with a view to creating that annual review report at some later date.

The issue is that this data is not being amalgamated, compared or processed in the moment - to produce actionable insights and/or to trigger alerts.

Enabling continuous, active quality monitoring

Moving to a situation that enables continuous, active quality monitoring does not require a major upheaval. The main criterion is that systems are able to draw on data from across functional or departmental silos, so that deviation details, environmental data, complaint information, and Corrective and Preventive Action (CAPA) records can be combined and cross-checked on the fly. Better still, analytics and reporting tools should be able to call on historical data too, allowing live comparisons to be made and enabling immediate smart signal detection wherever incoming data deviates from current parameters and past patterns.
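
As a rough illustration of that cross-silo combination and signal detection, the sketch below joins hypothetical per-batch deviation, environmental, complaint and CAPA records into one view and flags batches whose metrics fall outside historical norms. All table and column names are assumptions made for the example.

```python
# Illustrative cross-silo quality view and signal detection.
# All table and column names below are assumptions for the example; in practice
# these records would come from QMS, LIMS and environmental-monitoring systems.
import pandas as pd

def build_quality_view(deviations: pd.DataFrame,
                       env_readings: pd.DataFrame,
                       complaints: pd.DataFrame,
                       capas: pd.DataFrame) -> pd.DataFrame:
    """Join per-batch records from separate silos into one view for analysis."""
    complaint_counts = (complaints.groupby("batch_id").size()
                        .reset_index(name="complaint_count"))
    return (deviations
            .merge(env_readings, on="batch_id", how="outer")
            .merge(complaint_counts, on="batch_id", how="left")
            .merge(capas[["batch_id", "open_capa_count"]], on="batch_id", how="left")
            .fillna({"complaint_count": 0, "open_capa_count": 0}))

def detect_signals(view: pd.DataFrame, history: pd.DataFrame) -> pd.DataFrame:
    """Flag batches whose current metrics sit outside historical norms."""
    metrics = ["deviation_count", "temp_excursions", "complaint_count"]
    for metric in metrics:
        baseline = history[metric]
        view[f"{metric}_signal"] = view[metric] > baseline.mean() + 3 * baseline.std()
    view["any_signal"] = view[[f"{m}_signal" for m in metrics]].any(axis=1)
    return view
```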

All life sciences manufacturers are striving to be more effective and efficient, driving up quality without over-extending internal resources - and smart, real-time quality monitoring and reporting plays directly to this requirement.

What’s more, it is much easier to implement such capabilities in the current ‘cloud-first’, ‘platform-based’ enterprise IT environment. Here, adding new capabilities and use cases is often simply a case of switching on additional features, or enabling new user groups to draw on already-centralized, pre-integrated data, tailoring its display and application to their own particular purposes.

Maximising existing data

Given that much of the necessary data already exists or is being captured, amalgamating data sources, ideally in a centralized, cloud-based repository that underpins multiple use cases, and then adding a smart analytics and reporting capability can deliver an immediate return. There is a clear business case arising from saving resources, reducing waste and, ultimately, preventing a sub-standard product batch from leaving the production line.

Rather than leaving quality reviews and further investigations to be conducted retrospectively, smart on-the-fly reporting gives manufacturing teams the chance to explore emerging issues and perform root-cause analyses in real time. The ability to configure the analytics, viewing and reporting experience to fit particular user requirements will be important, as will the capability to set thresholds or parameters that trigger push notifications. Moving the business case beyond compliance and focusing instead on the efficiency, cost and risk-reduction benefits of smart real-time quality analytics is the future for life sciences manufacturing.
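
As a closing illustration of the threshold-driven alerting described above, the hypothetical sketch below checks a batch's metrics against configured limits and posts a push notification to a webhook when any limit is breached; the metric names, limits and endpoint are purely illustrative.

```python
# Hypothetical threshold configuration and push-notification hook.
# Metric names, limits and the webhook endpoint are illustrative assumptions.
import requests

THRESHOLDS = {
    "deviation_count": 3,    # open deviations recorded against a batch
    "impurity_pct": 0.5,     # impurity level, percent
    "complaint_count": 2,    # complaints received in the review window
}

def push_quality_alerts(batch_metrics: dict, webhook_url: str) -> None:
    """Send a push notification for any metric that breaches its configured threshold."""
    breaches = {metric: value for metric, value in batch_metrics.items()
                if metric in THRESHOLDS and value > THRESHOLDS[metric]}
    if breaches:
        requests.post(webhook_url,
                      json={"batch_id": batch_metrics.get("batch_id"),
                            "breaches": breaches},
                      timeout=10)

# Example: push_quality_alerts({"batch_id": "B1234", "impurity_pct": 0.7},
#                              "https://example.com/quality-alerts")
```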