In this interview, NewsMedical speaks with Dr. Justin Devine and Dr. Nicholas Woudberg about biomarkers, their importance for drug development, and their potential for machine learning.
Can you provide a brief overview of the biomarker development process?
Dr. Justin Devine: When we look at the realm of biomarker development in clinical drug development, one realizes it is relatively new. The term ‘biomarker’ only found its way into common usage by regulators around 2003. Moreover, the first presentation of 90% of the human genome was published only 21 years ago.
Biomarkers were then identified as a research field where knowledge of biology could be applied to improving drug development. In 2003, Janet Woodcock from the FDA focused on using biomarkers to improve clinical development within the context of the critical path initiative.
In 2003, only 760 publications referenced the term ‘biomarker’ in the title or abstract. By 2021, this number had jumped to almost 27,500 publications.
As we saw with the dot-com bubble or, more recently, with Bitcoin, everything goes through a peak of inflated expectations. This is often followed by a period of disillusionment and, finally, an increasing realism resulting in maximum productivity.
Exactly where biomarkers are in this cycle is very difficult to pinpoint. However, I hope we are in the stage of increasing realism regarding how biomarkers and omic technologies are used to better understand the development of new therapies.
Today, immune-related biomarkers and diagnostics are one of the hottest areas of biomarker development. In 2021, 3,500 publications referenced the words immune and biomarker in the title or abstract. In 2010, this number was less than 200.
The expansion of the immuno-oncology space has fueled the emergence of this category of immune-related biomarkers. That said, immunological biomarkers have been used in routine clinical practice for many years longer than they have been used in clinical development.
Why are clinically relevant biomarkers valuable?
Dr. Justin Devine: The analysis of the immune system plays an essential part in routine clinical diagnostics. There are many examples in clinical practice, from the monitoring of CD4 counts in HIV to other areas of flow cytometry used in immune diagnostics. More recently, cytokines have played a role in monitoring various aspects of health and disease, and complex algorithms have made their way into the management of certain diseases, such as rheumatoid arthritis.
One of the best examples is an algorithm called Vectra, which essentially takes a panel of cytokines and applies an algorithm to it to understand disease severity. However, there have not been that many immune-related diagnostics that have come into widespread clinical usage.
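A minimal sketch of what such a multi-analyte algorithm can look like is shown below. The analytes, weights, and scaling here are hypothetical, invented purely for illustration; they are not the proprietary Vectra coefficients.

```python
# Illustrative sketch of a multi-analyte disease-activity score, loosely
# in the spirit of algorithms such as Vectra. All analytes and weights
# are HYPOTHETICAL, not real assay coefficients.

import math

HYPOTHETICAL_WEIGHTS = {  # analyte -> weight (invented for illustration)
    "IL-6": 0.40,
    "TNF-RI": 0.25,
    "CRP": 0.20,
    "SAA": 0.15,
}

def disease_activity_score(panel: dict[str, float]) -> float:
    """Weighted sum of log-scaled analyte levels, clamped to a 0-100 scale."""
    # log1p tames the skewed distributions typical of serum cytokine levels
    raw = sum(w * math.log1p(panel[name]) for name, w in HYPOTHETICAL_WEIGHTS.items())
    # The x10 rescaling is arbitrary; real algorithms calibrate against cohorts.
    return max(0.0, min(100.0, raw * 10.0))

score = disease_activity_score({"IL-6": 35.0, "TNF-RI": 12.0, "CRP": 80.0, "SAA": 60.0})
print(f"composite score: {score:.1f}")
```

The essential idea is that no single cytokine carries the signal; the algorithm's value comes from the calibrated combination.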
There are currently no FDA-approved immune companion diagnostics. PD-L1 as a companion diagnostic would be the closest, even though it is not strictly immune-related. I think this reflects the highly pleiotropic nature of the immune system, where, for instance, a cytokine like IL-6 can be involved in many distinct aspects of health and disease.
It just makes the idea of a traditional companion diagnostic very challenging. So, as we look back at biomarkers and immune biomarkers over these last two decades in the context of immune-related biomarker development, it can be observed that there is just an explosion of the use of biomarkers in clinical development, making them extremely valuable.
In this regard, a remarkable piece was published by IQVIA in 2019, which examined the factors with the most significant impact on clinical productivity. According to IQVIA, biomarkers and patient selection tools would influence clinical development success most through the end of 2023.
IQVIA also indicated that the relative effort for applying biomarkers would decrease, and biomarkers would have the most impact in improving clinical productivity compared to other sectors, such as new therapeutic modalities.
How should clinically relevant biomarkers be applied to different therapies?
Dr. Justin Devine: CRISPR-based and cell therapies make it challenging to define a biomarker strategy clearly. However, different frameworks can be used when applying a biomarker strategy to therapy, whether it is a therapy where the biology is already well-known or potentially more novel.
An excellent framework developed by Pfizer was based upon a review of drug development failures and accomplishments over many years before 2012. The team at Pfizer did a fantastic job and was transparent when it published its three pillars of drug survival.
Pillar one ensures that the drug reaches the target where the pharmacological effect should occur. For instance, if a therapy targets a central nervous system illness, it must be proven that the drug can safely reach the central nervous system.
The second pillar is to prove that the drug binds to the particular target after reaching the tissue and that there is modulation of that target.
The third pillar is that there is some expression of pharmacology. A metric can be applied to ensure that there are downstream effects, whether biomarker-related or clinical, to determine if the drug has the desired result.
Pfizer published the results of their drug developments in phase two programs between 2005 and 2009. These publications showed how many of Pfizer’s drug programs complied with these three pillars of drug survival. Eight of the 14 drugs that passed all three pillars advanced to phase three development.
Pfizer also had 12 drugs that did not pass the three pillars; for these, there was no proof that the drug even reached the target tissue. One example is a dopamine-3 receptor antagonist whose target is found in the brain. None of these 12 drugs proceeded to phase three development.
Although this is a small sample size, it still provides a very helpful framework for applying clinically relevant biomarkers.
Pfizer did a similar review some years later. Pfizer had gone several years without the approval of a new drug. Therefore, they reviewed their pipeline and published the results in 2014 in Nature Reviews Drug Discovery.
When Pfizer looked at the factors that could predict whether a phase 2A study would proceed to the next level, they found that studies with a linked efficacy biomarker had a 73% chance of advancing, versus 29% where no such efficacy marker existed. Pfizer also deduced that if the right patient group could be selected for the therapy, the probability of succeeding and moving to phase three was 90%, versus 22%.
What is Pfizer’s 5Rs principle?
Dr. Justin Devine: Pfizer summarized their principles into the 5Rs: the right target, right tissue, right safety, right patient, and right commercial potential. There is a high level of overlap with Pfizer's earlier work. For example, the right tissue and showing that the drug is getting to the right place are fundamentally important.
The right target means that the target biologically makes sense and is linked to disease. It is also essential that there are predictive markers that can show the engagement of the target. Right patients and right safety could be seen as the expression of pharmacology, where the on-target and off-target effects of the drug can be demonstrated.
The right commercial potential is also of the utmost significance. Innovation is the combination of science, sales, and marketing, and confirming the right commercial potential in the new therapy is incredibly important.
How can these frameworks be applied to drug development?
Dr. Justin Devine: These frameworks can be applied across drug development, as can be seen in the realm of immuno-oncology and checkpoint inhibitors. Applying them there demands a deep understanding of the fundamental nature of the tumor, particularly of what drives checkpoint inhibitor sensitivity.
One of the exciting measures in this regard is the tumor mutational burden. The level of mutations in the tumor is an excellent indicator of whether somebody will respond to a checkpoint inhibitor. The first step in this process is looking at target engagement. Target engagement measures the level of impact that therapy has on the immune system and how the immune system interacts with the tumor.
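The tumor mutational burden measure itself is simple to state: the number of somatic mutations detected, normalized per megabase of sequenced genome. A back-of-envelope sketch, with illustrative numbers not drawn from any real assay:

```python
# Back-of-envelope sketch of tumor mutational burden (TMB): somatic
# mutations detected, normalized per megabase (Mb) of covered sequence.
# The mutation count and panel size below are illustrative only.

def tumor_mutational_burden(mutation_count: int, covered_bases: int) -> float:
    """Somatic mutations per megabase of sequenced genome."""
    return mutation_count / (covered_bases / 1_000_000)

# e.g. 240 somatic mutations called over a 1.2 Mb targeted panel:
print(tumor_mutational_burden(240, 1_200_000))  # 200.0 mutations/Mb
```

The normalization matters because targeted panels, exomes, and whole genomes cover very different footprints, so raw mutation counts are not comparable across assays.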
Understanding tissue processes can be essential for gaining insight into the engagement of a checkpoint inhibitor with its target and its impact on the expression of pharmacology.
In this case, the expression of pharmacology will determine how the immune system becomes active as a result of a therapy such as checkpoint inhibitor therapy. This can be assessed using measures such as flow cytometric readouts of various antitumor immune activation markers, immune polyfunctionality, and different immunogenomic signatures.
Humans are highly heterogeneous, and different populations show significant genetic differences. In individuals of African or Asian ancestry, where genetic variability and heterogeneity are higher, that heterogeneity can be misinterpreted as a higher tumor mutational burden when only the tumor is sequenced.
It is essential to understand the people and genetic background to whom the therapy is given because that can significantly impact therapeutic responsiveness.
Central nervous system diseases have the lowest probability of therapeutic success and remain one of the trickiest areas for novel drug development. A biomarker strategy built on the same principles can be applied to neurodegenerative diseases, but it is a lot more complicated.
Immune modulators are being researched to enhance immune monitoring of treatments by boosting the transfer of immune cells into the brain and subsequently clearing waste, such as misfolded proteins, from the central nervous system.
Phosphoprotein markers that are difficult to access in CSF but possibly detectable in the periphery might be highly informative about whether the medicine is having the expected impact on the target pathway.
Gene therapies are one of the most important emerging treatment fields, particularly in the central nervous system. Exosomes are also being considered because they can serve as a surrogate for a brain biopsy. The pre-analytical factors around exosomes are still challenging, but they represent a very attractive way of understanding both target engagement and expression of pharmacology.
Central nervous system (CNS) diseases are characterized by a high burden of oligomeric proteins, and quantifying these can help in understanding the level of disease and how the disease is responding to therapy.
Imaging is still a significant area of focus because it is not easy to access the CNS space. The gut-brain axis is also vitally important to how the brain functions and to brain homeostasis.
Image Credit: Synexa
How can artificial intelligence and machine learning be applied in biomarker development?
Dr. Nicholas Woudberg: Artificial intelligence (AI) mimics human intelligence through algorithms encoded in software. AI presents an overwhelming opportunity in multiple fields, including data management, drug discovery, trial design, surgical procedures, and diagnostics and screening.
There has been significant investment in AI in the healthcare sector, which is expected to reach $4.3 billion by 2024. AI is also considered the most disruptive technology in pharma over the next two years.
AI can dramatically increase efficiencies in target identification, clinical trial design, drug manufacturing, and marketing. However, how does it relate to biomarker discovery?
Machine learning and AI present a new way to look at biomarker discovery. Machine learning is a subset of AI that employs statistical methods to develop and refine algorithms. It requires large data sets and leverages inputs from different sources to enable patient stratification based on illness progression and response to therapy.
It can be an incredibly useful tool in designing a clinical trial and its progression, following up with patients on active therapies, and determining whether disease is going into remission or becoming refractory.
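As a minimal illustration of stratification, the sketch below clusters patients into two groups from a single hypothetical biomarker using a tiny k-means written with only the standard library. Real pipelines would use many features and a mature library such as scikit-learn; the values here are synthetic.

```python
# Minimal sketch of unsupervised patient stratification: a tiny 1-D
# k-means (k=2) on a single HYPOTHETICAL biomarker. Illustration only;
# real stratification uses many features and validated tooling.

def kmeans_1d(values, k=2, iters=50):
    """Cluster scalar values into k groups; returns (centroids, labels)."""
    # Seed centroids by taking evenly spaced points from the sorted data.
    centroids = sorted(values)[:: max(1, len(values) // k)][:k]
    labels = [0] * len(values)
    for _ in range(iters):
        # Assign each patient to the nearest centroid...
        labels = [min(range(k), key=lambda c: abs(v - centroids[c])) for v in values]
        # ...then move each centroid to the mean of its members.
        for c in range(k):
            members = [v for v, lab in zip(values, labels) if lab == c]
            if members:
                centroids[c] = sum(members) / len(members)
    return centroids, labels

# Synthetic biomarker levels: a low-expression and a high-expression group.
levels = [1.1, 0.9, 1.3, 0.8, 5.2, 4.8, 5.5, 6.0]
centroids, labels = kmeans_1d(levels)
print(centroids, labels)
```

The labels are the stratification: each patient is assigned to a low- or high-biomarker stratum, which could then be correlated with progression or treatment response.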
What are some of the AI and ML challenges in biomarkers development?
Dr. Nicholas Woudberg: In ovarian cancer, biochemical markers include CA-125, HE4, the BRCA genes, and epigenetic markers. However, these serum biomarkers could not distinguish the different cancer types, and there was no decrease in mortality. Moreover, few AI algorithms have been cleared for use by the FDA.
Currently, the only FDA-approved AI algorithms for biomarker identification are in heart disease and lung cancer. The introduction of bias is one of the primary obstacles constraining the usability of these essential technologies.
Algorithms develop biases and produce prejudiced responses when trained with non-representative or incomplete data. The bias in AI systems is critically hampering their overall application and usefulness.
Even though a very complex AI machine learning algorithm can enable the stratification of patients, whether or not that stratification will work in a different population or with a slightly different condition is largely down to what the algorithm is trained on.
AI requires large datasets to apply new algorithms. In many cases, it can even require combining multiple data sets. Several exploratory studies used to establish these AI algorithms are early phase studies without many participants.
Therefore, there is a limitation regarding the amount of data that AI can be trained on. Many racial and ethnic minorities can be more susceptible to certain cancers.
Tumor mutational burden is the only FDA-approved pan-cancer biomarker. In many cases, patients of non-European descent are prescribed expensive therapies they do not need because their tumor mutational burden has been overestimated.
A paper published in 2018 looked at multiple diseases characterized by Genome-Wide Association Studies. There was an overwhelming bias toward individuals of European ancestry in these studies and a large underrepresentation of minority groups. In many cases, these studies do not capture the genetic variability present in minority groups.
As a result, the predictive power of polygenic risk scores, currently developed on European data, can be up to four and a half times lower when applied to individuals of African descent.
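The polygenic risk score underlying that comparison is, at its core, a weighted sum: for each variant, the count of risk alleles (0, 1, or 2) multiplied by an effect size estimated from a GWAS. The variants and weights below are invented for illustration; the ancestry problem arises precisely because such weights are typically fit on European-ancestry cohorts.

```python
# Hedged sketch of a polygenic risk score (PRS): sum over variants of
# (risk-allele count) x (GWAS effect size). Variant IDs and weights are
# HYPOTHETICAL; real PRS models use thousands to millions of variants.

GWAS_EFFECT_SIZES = {  # hypothetical variant -> log-odds weight
    "rs0001": 0.12,
    "rs0002": -0.05,
    "rs0003": 0.30,
}

def polygenic_risk_score(genotype: dict[str, int]) -> float:
    """Weighted sum of risk-allele counts (0, 1, or 2 per variant)."""
    return sum(GWAS_EFFECT_SIZES[v] * genotype.get(v, 0) for v in GWAS_EFFECT_SIZES)

prs = polygenic_risk_score({"rs0001": 2, "rs0002": 1, "rs0003": 0})
print(round(prs, 3))  # 2*0.12 + 1*(-0.05) + 0*0.30 = 0.19
```

When the effect sizes are estimated in one ancestry group, both the weights and the variants themselves may transfer poorly to another, which is the source of the reduced predictive power described above.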
Why should AI and ML be incorporated into biomarker development programs?
Dr. Nicholas Woudberg: AI and ML are extremely powerful tools, and there is no doubt that they will become increasingly important parts of our daily lives and of medicine. These techniques can help better stratify patients in clinical studies, providing the most accurate signs of remission, recurrence, and refractoriness to treatment.
They can also help develop new medications by offering new insights and enabling us to find a broader spectrum of biomarkers. Using these large datasets, AI can identify biomarkers that would be hard to detect using traditional approaches.
However, these AI and machine learning algorithms must be trained on diverse data sets more representative of multiple populations. Some studies are trying to address this problem. Project Survival is a new study that is currently enrolling participants, and tissue and fluid samples are used alongside an AI platform to identify and validate pancreatic cancer biomarkers.
How would you conclude this interview?
Dr. Justin Devine: A framework is a helpful starting point when defining a biomarker strategy. The frameworks discussed here today are beneficial and powerful, and they serve as a tremendous starting point to structure thinking. They should be considered when defining a biomarker strategy, even for a complex new therapy.
Having a clear sense and being quite honest about what to expect from a given biomarker strategy is essential. Tying a product to a single marker without a deep understanding of the biology behind it can result in repeated failure. Many markers are made or broken between when the samples are drawn from the patient and when they get analyzed in the lab.
Understanding pre-analytical factors can make a big impact on the robustness of the final results. Omics technologies are very powerful, but they are not a replacement for robust scientific thinking. Simply throwing an omic at something and hoping something will come out of it is a poor strategy.
Interacting with teams and people that have experience is useful and powerful. Building a biomarker strategy with a realization that biology can be cruel can be an excellent approach.
About Nicholas Woudberg & Justin Devine
Nick is a medical scientist with experience in soluble biomarkers and lipidology. He completed his PhD in medicine at the University of Cape Town, which focused on exploring novel lipid-based biomarkers of cardiovascular disease. Nick joined Synexa as a Scientist in the Soluble Biomarkers lab, where he was involved in developing and validating methods for unique client compounds. He is now part of the client services unit, assisting with outreach to clients, developing bespoke biomarker and analytical strategies, and proposal generation.
Justin is a medical doctor, immunologist and pharmacologist, Co-Founder of Synexa Life Sciences and is currently the CMO and interim CSO at Synexa. His primary focus is to understand the company’s clients’ objectives in new drug development and to implement biomarker strategies that bring real insight to the challenges of clinical development and the immunological underpinning of health and disease.
About Synexa Life Sciences BV
Synexa is an industry leader in contract bioanalytics and biomarker research services specializing in biopharmaceuticals, supporting Biopharma partners across the globe to achieve their clinical milestones with bespoke analytical solutions to meet specific challenges. Our global purpose-built laboratories have been designed and fitted to meet the fast-changing and ever-evolving complexity of drug development pipelines.
Guided by a combined scientific experience of over 800 years, our team of world-class scientists pride themselves on going the extra mile to deliver high levels of customer service and quality science that meets your expectations. We specialize in providing quality scientific expertise in key disease areas, encompassing a wide range of technology platforms and bespoke assay solutions to meet each outcome.
Services include protein, cell and gene therapy bioanalytics, biomarker discovery, analysis and validation, PK/PD, de novo assay method development and validation, and ProtoTrials®, our proprietary human in vivo translational research platform focused on providing valuable early clinical insights.