Digital: Data Quality Governance

Taking the Complexity Out of Data Quality Governance

The very phrase ‘data quality governance’ carries so much weight, especially in life sciences where data has to be right and consistent across all functions and with global health authorities. Yet achieving data quality governance doesn’t need to involve large internal and consulting teams with complex frameworks, long timelines and unwieldy success parameters – it can be a lot simpler.

By Steve Gens at Gens & Associates and Preeya Beczek at Beczek.COM

Everything in life sciences – from the latest measures around safety and regulatory rigour, to renewed focus on agility, efficiency and streamlined paths-to-market – relies heavily on companies’ effective organisation and handling of data. Unless that data can be fundamentally trusted for its accuracy and consistency right across the organisation, there will always be a sense that its authenticity must be checked before next actions can be taken.

It’s in this context that the concept and discipline of data quality governance comes to the fore. Data is becoming critical to regulatory procedures, safety processes, clinical research, R&D/manufacturing and, ultimately, to connecting all of those parts of the life sciences end-to-end life cycle value chain more seamlessly. This means the need for formal strategies and provisions around the governance of that data’s quality (eg, integrity, reliability, trusted status) across all internal and external touchpoints is at its greatest.

Beyond RIM

In a 2022 World-Class RIM survey of 76 companies, top-performing life sciences companies expected to have most of their systems connected and sharing data within the next two to three years, with electronic trial master file (eTMF) systems, quality management systems (QMSs), master data management (MDM) and enterprise resource planning (ERP) being the highest priorities for investment.1 As companies’ dependency on the flow of good data broadens, the assumption of trust in the data means that the potential risk to marketing authorisation/licensing, patient safety, the company’s reputation and its financial performance can become intolerably high. If this forces teams back into manual data entry, quality checks and numerous verification steps, it will undermine the return on investment (ROI) of digital process transformation and add to the company’s ‘technical debt’. As companies across the pharma/biotech/medical devices spectrum strive to understand what all of this means for them practically, there is a temptation to create a major initiative supported by a large consulting budget, due to a lack of confidence in getting it right. This is likely to involve bringing in an institution steeped in IT-based process transformation and risk management.
'...much of what’s needed has to do with nurturing the right culture, assembling the right teams or assigning key roles, communicating successes and being on the same page as a company about the goals of this whole undertaking'
Achieving the pinnacle of data quality excellence can feel like an onerous undertaking that will cost a great deal of money, take an inordinate amount of time and be overwhelming, even when it comes to determining how and where to begin. However, it’s a misconception that a mammoth engagement is required for companies to get started on the right track. On the contrary, it’s far more important that work starts now, to move things in the right direction.
Once a few myths have been dispelled, and organisations have a framework to follow so that they can move in the right direction, it’s easy to start making good, solid progress.

Myth 1: This Will Inevitably be an Overwhelming Programme

The first barrier companies come up against is knowing where to start, when data quality governance by its very nature needs to be an enterprise-wide endeavour. However, it’s fine for good habits to accrue in one corner of the business (eg, regulatory, clinical or manufacturing) before being extrapolated more broadly as teams and leaders learn what works.
All positive change has to start somewhere, so decide whether a top-down or a function-by-function (with consistent practices) approach will produce the quickest wins, and the greatest overall progress. What works for one company may not suit another – especially when considering the size of the product portfolio – and that’s okay.

Myth 2: Complexity and High Cost are Unavoidable

The ‘data driven’ agenda might feel fresh and new in life sciences, but digital process transformation is well advanced in other industries, and matters of managing quality are firmly established in all sectors and businesses. This means that solid frameworks already exist and have been adapted appropriately for data quality governance in a life sciences ‘Regulatory+’ context. In other words, this doesn’t have to be a steep learning curve that takes several years and leaves companies with huge holes in their transformation, organisational change or IT budgets.
It’s possible to overthink and over-engineer the mechanics of good data quality governance. But much of what’s needed has to do with nurturing the right culture, assembling the right teams or assigning key roles, communicating successes and being on the same page as a company about the goals of this whole undertaking. Compliance might make up a tiny part of that, but ultimately the ability to rely fundamentally on the quality, integrity and completeness of data is a passport to better business insights, improved process transparency and more efficient and agile operations.

Myth 3: You’re Doing This Largely Because You Have To

Compliance with Identification of Medicinal Products (IDMP), Substance, Product, Organisation and Referential (SPOR) and other regulatory mandates might seem to be the most obvious driver for getting the company’s product and manufacturing process-related data in order. Yet there are many higher purposes for making data-related investments. These range from more tightly run business operations, to a safer and more convenient experience for patients as consumers of existing and new products. The tighter the controls around data quality, the more companies can do with their trusted data – with use cases that could extend into the real world, from prompter access, to real-time updates, to patient advice.
Another way to look at the importance of data quality governance is to consider the risk to the business where it is compromised. Our findings suggest that these risks include:
  • The inability to realise automation/integration investments
  • Limited reporting and dashboard effectiveness and efficiency
  • Continued manual compliance that increases the company’s technical debt, such as reliance on short-term, manual workarounds
  • A lack of confidence in realising global system/process investments
  • A major missed cultural opportunity to build a mentality of good data citizenship: a sense that everyone is in this together, so that from this point on data is captured correctly and well-maintained from day one.

Myth 4: This Is an IT/Data Management Concern First and Foremost

Evidence confirms that key success factors for a data quality programme have little to do with technology and everything to do with culture, organisation and mindset. Specific contributors to progress, based on active programmes today, most notably include:
  • A shared data quality vision, so that good data-related practice becomes second nature
  • ‘Actionable governance’ in the form of a data quality office and an assigned data quality officer, whose remit is to oversee efforts to clean up and maintain good data
  • End-to-end orientation: in other words, a perspective on data quality that extends beyond a single function to the full spectrum of data use cases and key functions
  • Senior leadership advocacy for a culture of data quality built into rewards systems, with executives driving a ‘right first time’ mindset around data as it is captured and first entered into a system
  • Formal continuous improvement: continued rigour in raising the quality of data and making this consistent across the company over time
  • Transparency of data quality performance, good communications about progress and a plan for celebrating success as the quality and usability of data is seen to improve across the company.

Myth 5: You’re Already Vigilant About Data Quality so Don’t Need a Formal Programme

Here’s a checklist that can highlight where a company has challenges with its data quality, which will become more apparent as data becomes increasingly fundamental to critical everyday processes:
  • Data quality is not viewed as an organisational competency or linked to organisational culture
  • There is no clear data quality vision, policy or strategy
  • There is no overarching data quality operating model
  • There’s a lack of transparency of the current or evolving data quality status and any trending or reporting
  • Data quality is not built into rewards or recognition systems
  • Data connectivity is being prioritised ahead of the organisational support required to properly manage and leverage the value of connected data.

Good Data Quality: What to Aim For

It’s important to establish what the company considers ‘good data quality’ to be, and to communicate this so that everyone understands and is aiming towards the same benchmark. Building data quality checks and reporting on these is the best way to shape and embed the agreed expectations. Based on observed leading practice to date, these are likely to include the following (a minimal illustration follows the list below):

  • Accuracy – eg, the expected renewal date for each country is correct
  • Adherence to standards – eg, both the syntax (format) and semantics (terminology) must be standard and harmonised across all systems and processes
  • Timeliness – eg, the approval date is in the registration system in time to be used for product release
  • Accessibility – eg, access is allowed and enabled to any authorised person, process or connected system
  • Ownership – eg, there’s a clear data owner for the authoritative source of truth
  • Transparency of results – the ability to recognise the ‘good’ and what’s ‘not there yet’.
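
To make these dimensions concrete, below is a minimal, illustrative sketch of how a few of them (adherence to standards, timeliness, ownership) could be expressed as automated checks. The field names, controlled vocabulary and dates are assumptions for the example only, not a prescribed data model.

# Illustrative sketch only: a few data quality dimensions expressed as
# automated checks. Field names (country, approval_date, release_date,
# data_owner) and the controlled vocabulary are assumed for the example.
from datetime import date

CONTROLLED_COUNTRY_CODES = {"DE", "FR", "GB", "US"}  # assumed agreed standard

def check_record(record: dict) -> list:
    """Return a list of data quality findings for one registration record."""
    findings = []

    # Adherence to standards: country codes must come from the agreed vocabulary
    if record.get("country") not in CONTROLLED_COUNTRY_CODES:
        findings.append("country code not in controlled vocabulary")

    # Timeliness: the approval date must be recorded in time for product release
    approval = record.get("approval_date")
    release = record.get("release_date")
    if approval is None or (release is not None and approval > release):
        findings.append("approval date missing or later than planned release")

    # Ownership: every record needs a named owner of the source of truth
    if not record.get("data_owner"):
        findings.append("no data owner assigned")

    return findings

# Example usage with a hypothetical record
sample = {
    "country": "DE",
    "approval_date": date(2023, 4, 2),
    "release_date": date(2023, 5, 1),
    "data_owner": "",
}
print(check_record(sample))  # ['no data owner assigned']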

Issues will always arise, but if you can catch them early and identify areas that need more support, you’ll get to a point of competency and conformity sooner rather than later. Another good practice, to avoid becoming overwhelmed, is to categorise and prioritise data quality enhancement efforts based on each data set’s perceived criticality – for example, data that might adversely affect what is reported directly to a Health Authority (HA) versus data that isn’t directly critical but might indicate the need for improved practices or processes.

Essential Elements of a Good Data Quality Governance Programme

We’ve set these out in a very usable, three-phase framework, which can get any company started on the right track, however it decides to approach this (eg, function by function, or top-down and enterprise-wide).
'It’s important to understand what the company thinks to be ‘good data quality’ and to communicate this so that everyone understands and is aiming towards the same benchmark'

The Establish/Launch Phase
This initial ground preparation phase, which might take up to six months, is about setting out the data quality vision and principles, establishing an actionable data quality operating model with formal jobs and roles, and conducting an awareness campaign.


The Operational Phase
This is the most intense phase, which, in a fast-moving company with decisive people involved, could be accomplished within 12 months. It involves establishing optimal processes and capabilities by adjusting to learnings from the ‘establish’ phase; ensuring that all roles with a bearing on data quality have those responsibilities set out in job descriptions and covered as part of the annual review process; and establishing recognisable rewards for high-quality data.

The Optimisation/Institutionalisation Phase
Desirable behaviour is embedded and fostered within the organisational culture. This will ensure that everyone gets and stays on board with maintaining and continuously improving data quality, to everyone’s benefit. Tools and techniques might include automated data quality dashboards that monitor KPIs and metrics for critical end-to-end processes, data integration and connectivity across functions and the wider organisation, and organisation-wide data quality reporting that supports a culture of quality.
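
As an illustration of the kind of dashboard feed mentioned above, the sketch below aggregates per-record findings (such as those produced by the earlier checks) into a simple data quality KPI per function. The function names, record counts and the KPI definition (share of records with no findings) are assumptions for the example, not a prescribed metric set.

# Minimal sketch: aggregating per-record finding counts into a data quality
# KPI per function, as might feed an automated dashboard. Functions, counts
# and the KPI definition (share of clean records) are illustrative assumptions.

def quality_kpis(findings_by_function: dict) -> dict:
    """Each value is a list of finding counts, one entry per record checked."""
    kpis = {}
    for function, per_record_findings in findings_by_function.items():
        clean = sum(1 for n in per_record_findings if n == 0)
        kpis[function] = round(100 * clean / len(per_record_findings), 1)
    return kpis

# Example: regulatory has 3 of 4 clean records, quality has 2 of 3
print(quality_kpis({
    "regulatory": [0, 0, 2, 0],
    "quality": [0, 1, 0],
}))
# {'regulatory': 75.0, 'quality': 66.7}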

It’s Never too Soon to Start

Taking a phased approach to systematic data quality governance paves the way for companies to move forward with their improvement efforts without delay, taking a bite-sized approach rather than feeling they need to have all of the parameters established up front.
It may even be that, as one part of the organisation starts talking to adjacent functions about what’s needed and what may be possible, compatible existing ventures emerge (eg, in R&D) – or even enterprise-level initiatives – with which these new efforts could be blended. As progress is built and witnessed, meanwhile, momentum will gather organically.


Steve Gens is the managing partner of Gens & Associates, a global life sciences advisory and benchmarking firm specialising in strategic planning, RIM programme development, industry benchmarking and organisational performance.

Preeya Beczek, director of Beczek.COM, is an independent regulatory affairs expert, providing teams with valuable insights, advice and strategies for operational excellence, by optimising process, systems, roles and operating models.