In the 1990s, biomarkers were the future of personalized medicine, and a “lab on a chip” was the holy grail of understanding patient health. Yet while past predictions such as whole-organ transplants and intra-body cameras have been realized, the goal of measuring an array of proteins to better understand a patient’s individual health, current and future, has still eluded us. But maybe not for much longer.
This fall, a panel of researchers convened by STATnews[1] discussed this very topic as it applies to oncology, focusing on where we are in using a person’s chemistry to optimize their diagnosis and treatment. No two oncology patients are the same, with differences spanning both patient and tumour characteristics, which means each patient’s path to diagnosis and treatment is different. Biomarkers in blood, fluids, and tissues have significant potential to guide targeted therapies individualized to each patient and their disease.
The panel discussion was a tantalizing look not only at what’s in reach now in terms of diagnostic and therapeutic oncology targets, but also at how we can use emerging technologies to re-evaluate how we frame personalized medicine.
It’s not just about getting the data—it’s about developing better analysis tools
We finally have the capacity to interrogate “Big Data”, meaning we can look at biomarkers in terms of systems, which is critical for how we understand epigenomics and early-stage cancer detection. For current treatments, biomarkers can offer personalized clinical information that physicians can use to make treatment decisions. The challenge is that no single biomarker, or set of biomarkers, can completely predict a patient’s response to a particular treatment, which reflects the heterogeneity and complexity of disease pathways. So deciding which blood chemistry tests to order for a patient is often not clear, at least not yet. And that’s where we need to involve the broader research community.
We need mathematicians and engineers, too
Frankly, biomarker research tends to attract biologists interested in areas such as genomics and proteomics rather than engineers. But things are changing, and a sweet spot between research science and engineering is the emerging field of synthetic biomarkers. As one panellist explained, the framework around synthetic biomarkers starts with understanding the limitations of natural biomarkers. Engineers can then synthesize new solutions that work around those limitations, such as the transient nature of natural biomarkers, and that has the potential to take medicine in a different direction altogether.
What comes next?
Applying large-scale computing to cancer data is now a reality. What is missing, however, is elevating research from observational data to practical insights. This is an area in which mathematicians and physicists have deep experience, but those researchers don’t typically get involved in this type of work.
More than ever, we need to look at a shared model of research and analysis. Pharma can’t do this on its own, and neither can physicians. We need to find ways of bringing multidisciplinary teams of doctors and nurses, researchers and statisticians, mathematicians, engineers, and more together to work on these complex datasets. It’s not the way we have worked in the past, or are working now, but if we truly want to glean learnings from the large biomarker datasets that are only now possible, much less apply those learnings to individual patients, we need to rethink who analyses the data and how.
Vanitha Sankaran, PhD, Medical Director, is part of the global medical strategy team at Evoke Mind + Matter. She works with her fellow matter makers to develop medical insights and strategy that drive our business with pharmaceutical and life sciences clients. We strive to make health more human by creating meaningful experiences to bring meaningful change.
[1] Sponsored by Astellas Pharma