Digital: Clinical Trial Analysis

Powering Clinical Trial Analytics With Automation

The application of end-to-end technology in clinical trial processes, coupled with updated quality standards, will accelerate the review and use of data and reduce time to market for new therapies
Stuart Malcolm at Veramed
Drug development is undergoing a digital transformation. The pandemic has amplified existing trends towards decentralised trials, greater volumes of data from multiple sources, and use of powerful technologies, such as cloud computing and AI. Clinical research analytics, which still relies heavily on traditional programming methods to extract insights and results from clinical trial data, is also ripe for change.

Yet, as work progresses to improve how clinical trials are analysed, two opposing forces must be navigated. On the one hand, trial investigators and drug development leaders are driven to speed up ‘time to insight’ by recruiting patients into trials swiftly and getting access to data more quickly to inform decisions. These stakeholders are, rightly, excited by the possibility of faster clinical research, leading to expedited approvals and improving patients’ lives.

On the other side is the highly regulated nature of the industry. While regulatory agencies such as the FDA have made a concerted effort to be ‘innovation-enabling’ in recent years, the reality is that the pharmaceutical industry depends on robust governance and must ‘dot the Is and cross the Ts’.

As a cornerstone in the operations of clinical trials, analytics capability needs to grow and develop to meet these twin challenges. The question is how to get vital insights to investigators and medical stakeholders more swiftly, while ensuring adherence to due diligence requirements and high quality.

Improvements in Standardisation

Some important inroads have been made. Over the last two decades, the implementation of data standardisation has been an important tool for increasing efficiency. The Clinical Data Interchange Standards Consortium (CDISC) has, in collaboration with the pharma industry, pioneered the development of high-quality standards, with a framework for effectively planning, collecting, organising, and analysing clinical and non-clinical research data.
The benefits of the CDISC initiative have been manifold. For one, the standards support faster, more efficient review of data packages by regulators. They are, in fact, now required for regulatory submissions to the FDA and Japan’s Pharmaceuticals and Medical Devices Agency.

The standards facilitate transparency in the data analysis and research process by providing a clear link between each element of data and its predecessor. Using standard data formats also promotes more seamless data exchange between sponsors and CROs. Beyond these operational benefits, the adoption of standards has enabled organisations to maximise the value of their clinical trial data by facilitating easier ‘pooling’ across individual datasets and studies to yield new insights.
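
As a rough illustration of that traceability, the sketch below (written in R, with invented data rather than a real CDISC domain) derives an ADaM-style analysis value directly from an SDTM-style standardised result, so each analysis element can be traced back to its source:

```r
library(dplyr)

# Simplified, illustrative SDTM-style vital signs records (not a full VS domain)
vs <- tibble::tibble(
  USUBJID  = c("01-001", "01-001", "01-002", "01-002"),
  VSTESTCD = "SYSBP",
  VSSTRESN = c(142, 128, 135, 130),
  VISIT    = c("BASELINE", "WEEK 4", "BASELINE", "WEEK 4")
)

# ADaM-style analysis dataset: the analysis value AVAL is taken directly from
# the standardised result VSSTRESN, giving a clear link back to its predecessor
advs <- vs %>%
  mutate(PARAMCD = VSTESTCD,
         AVAL    = VSSTRESN,
         AVISIT  = VISIT)
```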

Since CDISC established its first draft standards in 1999, they have evolved from simply standardising data collection to also covering data analysis and planning protocols. Today, the framework is highly evolved, with specific standards reflecting individual therapy areas’ data collection and analysis requirements.

It is fair to say that data standardisation has been one of the most impactful developments in clinical trial reporting over the last decade.

Yet, despite these advances, clinical trial reporting is still limited by the capacity of manual labour. Teams of statistical programmers painstakingly write individual programmes to produce reports, outputs, and tables, figures, and listings (TFLs), work that often takes months before the results can be shared with medical colleagues to inform their decisions.
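
To make that manual step concrete, a single hand-written summary might look like the R sketch below; the dataset and treatment groups are invented for illustration, and a real study would require many such programmes, each separately written and validated:

```r
library(dplyr)

# Hypothetical subject-level analysis dataset (ADSL-like), hard-coded here
adsl <- tibble::tibble(
  USUBJID = sprintf("01-%03d", 1:6),
  TRT01P  = rep(c("Placebo", "Active"), each = 3),
  AGE     = c(54, 61, 47, 58, 66, 52)
)

# One hand-written demographics summary: one of the many individual outputs
# a statistical programming team would typically produce per study
demog_table <- adsl %>%
  group_by(TRT01P) %>%
  summarise(n        = n(),
            mean_age = round(mean(AGE), 1),
            sd_age   = round(sd(AGE), 2))

print(demog_table)
```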

Automation Offers Potential for a Step-Change

There is now potential for a step-change in productivity. Across the drug discovery and development value chains, there is a drive to apply automated methods to yield greater efficiency, such as in laboratory research. Automated approaches will also define the next era of efficiency in clinical trial analytics.

Removing reliance on manual programming tasks and embracing automated methods can enable a more seamless data flow, from protocol design to data submission, enabling faster results, higher quality, and improved consistency.

End-to-end automation represents a ‘computer-aided design’ paradigm shift, in which clinical trials are designed, or modelled, collaboratively upfront using new software tools, based on existing clinical data standards but extended to include machine-readable biomedical concepts.

The digital model of a clinical trial will be stored as metadata, automating the digital data capture, analysis and reporting systems that are configured manually today.
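
As a loose illustration only (the actual standards work is far richer than this), such a digital trial model can be pictured as structured metadata that capture, analysis, and reporting systems all read from:

```r
# Illustrative trial metadata: a machine-readable description of visits and
# endpoints that downstream systems could consume instead of manual configuration
trial_meta <- list(
  study_id  = "XYZ-001",                       # hypothetical study identifier
  visits    = c("SCREENING", "BASELINE", "WEEK 4", "WEEK 8"),
  endpoints = list(
    list(name     = "SYSBP",
         label    = "Systolic Blood Pressure (mmHg)",
         analysis = "change from baseline")
  )
)

# A data capture system could, for example, generate its expected visit
# schedule directly from the same metadata used to drive the analysis
expected_visits <- trial_meta$visits
```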

A world of end-to-end automation would eliminate the need to start from scratch with every new trial. Instead, relevant aspects of previous trials could be harnessed to build out studies, without having to trawl through legacy paper trails.

However, this new era will not come in a single leap. In order to achieve the ultimate vision of end-to-end data automation, a stepwise, incremental approach will be necessary.

First, the existing standards framework must be enhanced and developed. While extremely valuable, the current CDISC standards have been developed for manual processes. To drive automated analytics and reporting, more detailed information will have to be specified within them, in the form of metadata.

Next, duplication must be eliminated. Clinical trials typically involve a network of organisations and stakeholders: sponsors, CROs, data managers, statisticians, medical investigators, and writers, to name a few. The numerous manual steps involved in moving from protocol design to clinical study reports and data submissions involve significant duplication of effort across these different roles. Investigators design the trial protocol, and the data management team translates that document into a data capture tool that repeats information, such as the number of study visits. Then, statisticians produce the statistical analysis plan, reiterating much of the same information. Moving to a genuinely electronic and interactive data source document would be a foundational step in eliminating redundancies, allowing all stakeholders to extract the information they need from a single point.
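
A hedged sketch of the idea, using a hypothetical machine-readable protocol object: both the data-capture visit structure and the statisticians’ analysis windows are derived from the same single source rather than re-keyed into separate documents:

```r
# Hypothetical single machine-readable protocol source
protocol <- list(
  study_id = "XYZ-001",
  visit_schedule = data.frame(
    visit = c("SCREENING", "BASELINE", "WEEK 4"),
    day   = c(-14, 1, 29)
  )
)

# Data management derives the data-capture visit structure from the source...
build_capture_visits <- function(p) p$visit_schedule$visit

# ...while statisticians derive analysis visit windows from the same source,
# rather than re-typing the schedule into a separate analysis plan
build_analysis_windows <- function(p) {
  transform(p$visit_schedule, window_start = day - 3, window_end = day + 3)
}
```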

The next step towards automating clinical data analysis is to read in the protocol at the outset, extract the data from the data collection tool, and transform it into TFLs and informative data visualisations without the need for time-consuming manual programming. Fully harnessing the power of automation could enable the delivery of actionable insights to medical stakeholders in real time to inform in-stream decisions in days or hours, rather than months.
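
Conceptually, such a pipeline might reduce to something like the sketch below, where generate_tfl() is hypothetical glue code standing in for the automated machinery rather than an existing package function:

```r
library(dplyr)

# Hypothetical automated step: protocol metadata plus collected data in,
# summary output out, with no bespoke per-study programming
generate_tfl <- function(protocol_meta, collected_data) {
  collected_data %>%
    filter(VISIT %in% protocol_meta$visits) %>%   # visits taken from the protocol metadata
    group_by(TRT, VISIT) %>%
    summarise(n = n(), mean_val = mean(AVAL), .groups = "drop")
}
```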

The transition from manual to automated approaches brings unique challenges and considerations. Many pharma organisations are still grappling with the technology stack required to implement solutions, including enterprise tools to support central data repositories and data transformation steps. Outside the life sciences industry, data scientists use different tools, standards, and languages to automate data analytics, and the industry needs to consider which of these it should bring into its own toolkit. For example, it is now diversifying its statistical programming capabilities beyond traditional SAS software with open-source programming languages, such as R. R’s growing community of users provides a strong foundation for collaboration and innovation.
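
As one small, hedged example of that open-source toolkit, the widely used haven and dplyr packages can read and summarise a legacy SAS dataset directly in R (the file path and column name here are purely illustrative):

```r
library(haven)   # reads SAS sas7bdat files
library(dplyr)

# Illustrative path; a real study would point at its own analysis dataset
adsl <- read_sas("adsl.sas7bdat")

# Count subjects per planned treatment arm
adsl %>% count(TRT01P)
```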

As well as having the right technology in place, different complementary skillsets must be considered, such as data science skills in data systems and ML to complement traditional clinical trial reporting skills in statistics and statistical programming.

Where could this lead in the longer term? The ability to analyse more trials more quickly to a higher quality standard would, in itself, deliver substantial efficiency gains.

At a more aspirational level, harnessing automation tools and advanced analytics could help to de-risk the clinical development process, by systematically leveraging clinical trial and real-world data insights to make better decisions throughout the product lifecycle, from drug discovery onwards.

Expediting With Expertise, Not Workforce

The impressively rapid development of vaccines and treatments during the pandemic has fuelled the industry’s appetite for accelerated development timelines and drug approvals. However, these achievements drew on extensive combined workforce efforts and came with the opportunity cost of diverting most resources towards COVID-19 research.

Automated clinical trial analytics presents a powerful opportunity to work smarter and allow technology and expertise to do the heavy lifting, boosting R&D productivity and delivering much-needed new medicines to patients faster.
Stuart Malcolm is Head of Standards, Efficiency and Automation at Veramed, where his focus is the development of a software platform, tools, and techniques to optimise the delivery of clinical trial analysis projects. Previously, he spent nine years as a Senior Statistical Programmer and has over 25 years’ experience of software development in industries including telecommunications, finance, and media.