PharmiWeb.com - Global Pharma News & Resources
24-Apr-2020

Continuous data improvement: the foundation for effective AI

Summary

Intelligent analytics and smart process automation rely 100 per cent on the trustworthiness of underlying data. Steve Gens of Gens & Associates and Remco Munnik of Iperion Life Sciences Consultancy offer five best-practice tips for achieving and maintaining integrity across company-wide product information to create a sure footing for AI-driven innovation.
  • Author Company: Gens & Associates
  • Author Name: Steve Gens & Remco Munnik / Iperion Life Sciences Consultancy

‘New’ technologies including artificial intelligence/machine learning (AI/ML) offer exciting potential for transforming business decision-making and process delivery. Yet, in their haste to take advantage of the possibilities, many life sciences organisations underestimate the importance of first ensuring the integrity and ongoing quality of the product/regulatory data these systems will be drawing on.

Here are five critical data governance elements companies must have in place before they can attempt to become smarter in their use of data.

1.      Assigning dedicated roles & responsibilities around data quality

Having someone whose remit clearly includes maintaining the integrity and value of data is the only way to ensure that any future activities drawing on these sources can be relied upon, and will stand up under regulatory scrutiny. Allocated responsibilities should ideally include:

Quality control analysis. Someone who regularly reviews the data for errors - for example, sampling registration data to check how accurate and complete it is.

Data scientist. Someone who works with the data, connecting it with other sources or activities - eg linking the company’s regulatory information management (RIM) system into clinical or ERP systems to enable ‘big picture’ analytics.

Chief data officer. With a strategic overview across key company data sources, this person is responsible for ensuring that enterprise information assets globally have the necessary governance, standards and investments to ensure the data they contain is reliable, accurate and complete - and remains so over time.

2.      Quality control routine

By putting the right data hygiene practices into place, companies can avoid the high costs and delays caused by data remediation exercises.

Operationalizing data quality standards is important. Such standards include naming conventions, links between related content, and data completeness guidelines. These need to be applied consistently on a global basis.

It is also important to be able to identify and flag serious issues for urgent action, and to track where errors originate, so that additional training or support can be targeted where it is needed.
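To make this concrete, a routine check like the one described above can be sketched in a few lines of Python. All field names, the record layout and the naming-convention rule here are hypothetical, purely for illustration - in practice these would come from a company’s own data standards:

```python
import re

# Hypothetical quality rules for a registration record: every field named
# here must be present and non-empty, and product codes must follow an
# agreed naming convention (here, an "AB-1234" style, purely illustrative).
REQUIRED_FIELDS = ["product_code", "market", "registration_status"]
PRODUCT_CODE_PATTERN = re.compile(r"^[A-Z]{2}-\d{4}$")

def check_record(record: dict) -> list:
    """Return a list of quality issues found in one registration record."""
    issues = []
    for field in REQUIRED_FIELDS:
        if not record.get(field):
            issues.append("missing or empty field: " + field)
    code = record.get("product_code", "")
    if code and not PRODUCT_CODE_PATTERN.match(code):
        issues.append("product_code '%s' breaks naming convention" % code)
    return issues

def quality_report(records: list) -> dict:
    """Flag records with issues and summarise completeness across the set."""
    flagged = {i: check_record(r) for i, r in enumerate(records)}
    flagged = {i: iss for i, iss in flagged.items() if iss}
    return {
        "records_checked": len(records),
        "records_flagged": len(flagged),
        "issues": flagged,  # record index -> list of issues, for follow-up
    }

# A small illustrative sample: one clean record, one naming-convention
# breach, one incomplete record.
sample = [
    {"product_code": "AB-1234", "market": "DE", "registration_status": "approved"},
    {"product_code": "bad-code", "market": "FR", "registration_status": "approved"},
    {"product_code": "CD-5678", "market": "", "registration_status": "pending"},
]
print(quality_report(sample))
```

Because each flagged record keeps its list of specific issues, the same output supports both urgent escalation and the origin-tracking described above.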

3.      Alignment with recognition & rewards systems

Recognition, via transparency, will inspire good performance, accelerate improvements and bed in best practice, which can be readily replicated across the global organisation to achieve a state of continuous learning and improvement.

Knowing what good looks like, and establishing KPIs to measure against, are important too. Where people have responsibility for data quality assigned as part of their roles and remits, it follows that they should be measured on their performance, with reviews forming part of job appraisals, and rewarded for visible improvements.
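As a sketch of what such a KPI might look like (the figures and the quarterly review cadence are invented for illustration), one simple, measurable indicator is the error rate found in each periodic quality-control sample, tracked review by review:

```python
def error_rate(errors_found: int, records_sampled: int) -> float:
    """Error rate KPI for one review cycle, as a percentage."""
    return 100.0 * errors_found / records_sampled

# Hypothetical quarterly QC samples: (errors found, records sampled).
quarterly_samples = [(46, 500), (31, 500), (22, 500), (14, 500)]

rates = [error_rate(e, n) for e, n in quarterly_samples]
print([round(r, 1) for r in rates])

# A falling error rate, quarter on quarter, is the kind of visible
# improvement that can feed into appraisals and recognition.
improving = all(later < earlier for earlier, later in zip(rates, rates[1:]))
print("visible improvement:", improving)
```

A trend like this gives both the individual and the organisation an objective basis for the recognition and rewards discussed above.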

4.      Creating a mature & disciplined continuous improvement programme

Continuous improvement will be critical to maintain the quality and reliability of data over time. Success here will depend on progress being clearly measured and outcomes being tied to business benefits.

At its core, continuous improvement is a learning process that requires experimentation with ‘incremental’ improvements. Establishing good governance, and reporting on improvements and net gains along with how these were achieved (what resources were allocated, what changes were made, and what impact they had), will be important too.

5.      Data standards management

The more companies employ the same data standards, the easier it becomes to trust data, and what it says about companies and their products - as it becomes easier to view, compare, interrogate and understand who is doing what, and how, at a community level.

Evolving requirements under ISO IDMP and SPOR mean that companies face having to add and change the data they are capturing over time. To stay ahead of the curve, life sciences companies need a sustainable way to keep track of and adapt to what’s coming. It may be worth seeking external help here, to attain an optimal balance between regulatory duty and strategic ambition.

Long-term AI potential depends on the data groundwork put in today

With emerging technology’s potential advancing all the time, organisations should formalise their data quality governance now, so they are ready to capitalise on AI-enabled process transformation when the time is right.

About the authors

Steve Gens is the managing partner of Gens & Associates, a life sciences consulting firm specialising in strategic planning, RIM programme development, industry benchmarking, and organisational performance. Sgens@gens-associates.com; www.gens-associates.com

Remco Munnik is Associate Director at Iperion Life Sciences Consultancy, a globally-operating company which is paving the way to digital healthcare, by supporting standardisation and ensuring the right technology, systems and processes are in place to enable insightful business decision-making and innovation. Remco.munnik@iperion.com. www.iperion.com