Digital: AI and ML

AI in Pharma: Balancing Reality with Possibility

What is AI? As it is a relatively new field, we’re now beginning to see particular types show up in the pharma sector, bringing with them huge promises. However, what is ultimately realistic for our industry and for patients?

By Casper Wilstrup at Abzu

The pharmaceutical industry is not immune to the advancements of artificial intelligence (AI). However, while the possibilities of AI application in pharma are vast, the reality of its implementation is more nuanced. With so much hype and so much at stake, it’s important to take a closer look at what AI is, what can be utilised to benefit both the industry and patients, and what ethical and practical implications should be considered.

The Short History of AI

Despite its recent boom, AI is a relatively young field – and it has experienced its fair share of ups and downs. Although it wasn’t called ‘artificial intelligence’ at the time, the first mathematical computer model of a single neuron was developed in the early 1940s. This invention was promising, but a lack of computational power prevented even minor real-world application, which led to what is known as the first AI winter.
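That early single-neuron model can be sketched in a few lines of modern code. The following is an illustrative Python sketch, not the original 1940s formulation in its own notation: a unit that ‘fires’ (outputs 1) when the weighted sum of its binary inputs reaches a threshold.

```python
# Illustrative sketch of an early threshold neuron: it fires (outputs 1)
# when the weighted sum of its binary inputs reaches a set threshold.
def threshold_neuron(inputs, weights, threshold):
    """Return 1 if the weighted input sum meets the threshold, else 0."""
    total = sum(i * w for i, w in zip(inputs, weights))
    return 1 if total >= threshold else 0

# Example: a two-input AND gate, one of the simple logic functions a
# single such neuron can compute.
and_gate = lambda a, b: threshold_neuron([a, b], [1, 1], threshold=2)
```

Even a unit this simple hints at why compute was the bottleneck: useful behaviour only emerges from very large numbers of such neurons connected together.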
There was not much progress in the field of AI until thirty years later, in the late 1970s. Neural networks – networks of interconnected computer models of neurons – became possible by virtue of the first microprocessors. These were called ‘discovery systems’ or ‘expert systems’, because they were trained to be experts in a particular field and so usher in new discoveries.2 The original applications for discovery systems were, in fact, in the life sciences: their primary aims were to study hypothesis formation and discovery in molecular chemistry and disease diagnosis.1 But the limitations of the time – the expense of data storage, hardware and implementation – outweighed any benefits, which led to the second AI winter.

Today, with the advent of Big Data and more powerful processors, we are experiencing an AI boom. AI models are achieving impressive feats in areas such as gaming (mastering chess, Go and then Jeopardy), science (predicting the 3D structures of proteins) and automating human tasks (delivering packages, driving cars and holding conversations).

For all its short history of fewer than 100 years, AI has both disappointed and surpassed our expectations. So, what exactly is AI, and what types of AI are being applied in the pharmaceutical industry? In this article, we will delve into the different types of AI, including machine learning (ML), deep learning and neural networks. We will also explore the potential and limitations of AI applications in pharma, and finally present some benefits and challenges of balancing this technology with realistic expectations.

AI 101: What Actually Is AI?

‘Artificial intelligence’ is a very broad term for the use of computer systems to make predictions or automate tasks. When people talk about AI, they’re typically referring to what is called artificial narrow intelligence (ANI).
The field of AI is rapidly evolving, and there are two further categories of AI that do not yet exist in reality: artificial general intelligence (AGI) and artificial super intelligence (ASI). These categories are characterised by their closeness to, or ability to surpass, human intelligence and performance.

ANI has several subfields that are often referred to interchangeably, but in reality they have very different use cases in pharmaceuticals: ‘machine learning’ involves training an algorithm with data to make a prediction; ‘deep learning’ is a subfield of ML that uses neural networks more than three layers deep; and a ‘neural network’ is a set of algorithms meant to mimic the neurons in a human brain. Generative AI – the form of AI popularised by ChatGPT and DALL-E – although powerful, is still considered ANI. Generative AI is a deep learning algorithm that has been trained on massive amounts of data to accurately predict the next word or pixel.

ML, deep learning and neural networks are less like a set of Matryoshka dolls and more like a Venn diagram of algorithms. For example, deep learning is a kind of ML, but ML does not have to be achieved through neural networks. Memorising these categories is not the crucial point, however: the most important aspect in understanding and evaluating any AI model is how it arrives at a prediction.

Which Models Are Being Applied in Pharmaceuticals? What’s Realistic?

There are many types of ML models, and it’s important to choose the right type of model that fits your specific use case and resource availability, as well as the regulatory and scientific requirements. Let’s examine three common use cases for AI in the pharmaceutical industry:

Use Case 1: Working with Small and Wide Datasets (For Example, Selecting the Right Patients for a Clinical Trial)

One of the biggest challenges in applying ML is the lack of access to diverse, high-quality data. Therefore, using an ML model that has a high number of parameters and complexity (like a deep learning model) is problematic because it requires a large amount of high-quality data to train effectively. ML models driven by simple mathematical equations are small-data friendly and are most appropriate for this use case.
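As an illustration of what a ‘simple mathematical equation’ model looks like, here is a minimal Python sketch: an ordinary least squares fit of a straight line, computed in closed form. The data points are invented for illustration; the point is that a two-parameter model can be trained meaningfully on a handful of samples, where a deep network could not.

```python
# Minimal sketch of a simple-equation model: closed-form ordinary least
# squares fit of y = a*x + b. Two parameters, so a handful of samples
# is enough -- the sense in which such models are 'small-data friendly'.
def fit_line(xs, ys):
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    a = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
        sum((x - mean_x) ** 2 for x in xs)
    b = mean_y - a * mean_x
    return a, b

# Five hypothetical measurements (purely illustrative values).
a, b = fit_line([1, 2, 3, 4, 5], [2.1, 3.9, 6.2, 7.8, 10.1])
```

The fitted equation itself is the model, and it can be written down and checked by hand, which also matters for the next use case.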

Use Case 2: Satisfying Regulatory Requirements of External Assessors (For Example, Nominating a Biomarker Signature)

The pharmaceutical industry is rife with regulatory challenges, from drug discovery to active pharmaceutical ingredient (API) manufacturing. Therefore, using black-box ML models that lack transparency or explainability (like a deep learning model or a complex decision tree) is problematic because predictions are not readily understood. ML models that are transparent and explainable comply with Good Clinical Practice (GCP) and are most likely to pass regulatory requirements.
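To make the transparency point concrete, here is a hypothetical Python sketch of an explainable model: a logistic scoring rule over an invented three-marker signature. The marker names and weights are made up for illustration. Every coefficient is visible, so an assessor can read off exactly how each marker moves the prediction, which is not possible with a black-box network.

```python
import math

# Hypothetical transparent model: a logistic scoring rule over an
# invented three-marker signature. All weights are plainly inspectable.
WEIGHTS = {"marker_a": 1.2, "marker_b": -0.8, "marker_c": 0.5}
INTERCEPT = -0.3

def predict_probability(markers):
    """Probability of response from an openly inspectable linear score."""
    score = INTERCEPT + sum(WEIGHTS[m] * v for m, v in markers.items())
    return 1 / (1 + math.exp(-score))

def explain(markers):
    """Per-marker contribution to the score behind the prediction."""
    return {m: WEIGHTS[m] * v for m, v in markers.items()}
```

The `explain` function returns the exact arithmetic behind each prediction, which is the kind of traceable reasoning an external assessor can audit.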

Use Case 3: Designing Safe Lead Candidates or Optimising Leads for Desired Properties (For Example, Developing an RNA Therapeutic)

Some scientists depend on ML models as prediction machines. Historically, users of AI have collected and processed massive amounts of data hoping for ‘silver bullet’ answers. This is problematic because predictions are simply not substitutes for scientific theory. Relying solely on the predictions generated by ML models, instead of understanding the why behind a biological mechanism or which hypotheses are fit to test, will lead to a stagnation in scientific understanding.

Balancing Reality with Possibility

So many things are possible with AI, and the past few years have been transformative for AI in the life sciences. But we must never forget that AI is a tool. Good application (and model selection) requires appraising your use case and desired outcome. You should always start with your question, for example “Why are these compounds toxic?” or, “Why do only some people respond to this treatment?”, instead of throwing proverbial darts in the dark, hoping that AI will sharpen your aim.
A good scientific theory can give good predictions, but good predictions do not by themselves provide good theories.

  1. Visit: history-of-ai
  2. Visit: research
Casper Wilstrup is co-founder and CEO of Abzu, the Danish/Spanish start-up that builds AI to find transparent and understandable answers to the world’s unanswered questions in science, business and everyday life. Casper is a physicist by education and has 20+ years of experience building large scale systems for data processing, analysis and AI.
