When President Obama announced his Precision Medicine Initiative in the 2015 State of the Union Address, few disagreed with the underlying concept. For too long, most medical treatments have been designed for “the average patient,” resulting in a one-size-fits-all approach that, by the law of averages, produces less-than-ideal outcomes for far too many.
Precision medicine promises to change that paradigm by pioneering a new model of personalized, patient-powered research and treatment. The idea is that, thanks to big data, researchers can tap into a wider range of digital sources and analyze a greater variety of characteristics to better understand which treatments work best for specific subsets of the population, based on factors including coexisting health conditions, environment, and lifestyle. Ultimately, this could mean treatments designed specifically for each individual, with a much greater likelihood of effectiveness.
Certainly, the move to electronic health records and the Internet of Things (IoT) is making precision medicine possible – imagine the insight your physician could gain by linking your wearable activity tracker directly to your medical record in his or her system. However, making precision medicine practical remains a formidable challenge. While the requisite data already are being gathered by a multitude of means, getting these systems, platforms, and applications that collect and store data to work together is no small task.
Fortunately, we have a proven path to follow that makes the reality of precision medicine not that far-fetched after all. Here’s what you need to know about the critical role of data in precision medicine, and how we can get there.
- The precision model requires massive amounts of data. Most people think of healthcare data in terms of lab and other diagnostic results, but it goes much deeper than that. For precision medicine to work, we must build an entire knowledge base about each person, including genetic information, environmental factors, lifestyle, and other tertiary data from multiple sources, and we must do this on a massive scale.
- A shift is required from an application-centric focus to a data-centric focus. Ingesting and effectively using today’s massive volumes of data requires many separate applications, and concentrating on integrating those applications results in data silos, where each system’s data exists only within that system. To realize the potential of precision medicine, we must find ways to aggregate data from all sources – IoT, patient surveys, clinical lab data, etc. – in a centralized repository, giving clinicians access to this wealth of knowledge through virtually any analytics tool or user interface. In other words, instead of focusing on building precision medicine applications, we first must focus on integrating and harmonizing the data so that it is accessible to any application.
- Integration must merge with data management. This is not a medicine-specific requirement, but a challenge facing the big data industry as a whole. Integration and data management typically have been separate functions, and both have been plagued by inefficiencies, inaccuracies, and persistence problems that we now have the ability to avoid. By converging integration and data management workflows into a single platform, we can build a critical foundation for more efficient harmonization of data and real-time accessibility.
- The cloud is critical to achieving scale. Aggregating data on the scale required to put precision medicine into practice will be virtually impossible with conventional on-premises processing. The cloud must become the primary platform for the aggregation and harmonization of data, allowing researchers and clinicians to tap into the data repository and either pull down what’s needed to an on-premises system or use cloud-based applications to analyze the data where they reside.
- Privacy and security must be paramount. Of course, the idea of “sharing” patient data, especially in the cloud, is a tough sell for some individuals and for privacy advocates. However, we can build a model in which we anonymize that shared data to support population health research and individual care collaboration while ensuring patients’ personal privacy. We are already doing this in small-scale models, such as health information exchanges, purpose-specific registries (for organ transplants, for example), clinical trials, and community-based care initiatives. We’ll need platforms that not only can handle the scale, but also can meet the most stringent industry-mandated standards for security and privacy compliance.
- Analysis tools could change dramatically. Healthcare data analysis tools of today may look nothing like those required for precision medicine in the future. Today’s tools are based on conventional methods of data integration and management – monolithic silos that can ingest their own data but require massive amounts of complex coding and investment to sync with other sources. In the new era of a shared, aggregated, data-centric world, a whole new breed of analytics tools will emerge that will allow researchers and clinicians to investigate health conditions and treatments, and manage individual patients, in entirely new ways. That’s why it’s critical that we prepare our data architecture to be ready for the future.
- Precision medicine requires a philosophical shift. While the idea of precision medicine makes sense to most people, it is a complete departure from the way medicine is currently practiced. Having the tactics and technology available to enable the transition is one thing; we also must change the way we approach disease management, from a population-focused model to an individual-based model. This major shift won’t happen overnight, but it can happen.
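To make the first point above concrete, here is a minimal sketch of the kind of per-person knowledge base precision medicine would require. The class and field names are illustrative assumptions on my part, not any standard healthcare schema (such as HL7 FHIR); the point is simply that lab results, genetics, environment, lifestyle, and device data would all need to live in one patient-centered record.

```python
from dataclasses import dataclass, field

# Hypothetical illustration of a per-person knowledge base.
# Field names are assumptions, not a standard schema.
@dataclass
class PatientKnowledgeBase:
    patient_id: str
    lab_results: list = field(default_factory=list)       # diagnostic data
    genomic_variants: list = field(default_factory=list)  # genetic information
    environment: dict = field(default_factory=dict)       # e.g., air quality, region
    lifestyle: dict = field(default_factory=dict)         # e.g., diet, exercise habits
    wearable_readings: list = field(default_factory=list) # IoT device data

# Example: attach an activity-tracker reading to a patient's record.
record = PatientKnowledgeBase(patient_id="p-001")
record.wearable_readings.append({"date": "2015-06-01", "steps": 8500})
print(record.patient_id, len(record.wearable_readings))
```

Doing this for every person, at population scale, is what drives the storage and integration demands described above.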
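The data-centric aggregation described in the second point can be sketched in a few lines. This is a toy example under assumed source formats: three “silos” (an IoT feed, patient surveys, and clinical lab results) are merged into one patient-keyed repository that any downstream application could query, rather than each application holding its own copy.

```python
from collections import defaultdict

# Toy stand-ins for three separate data silos (formats are assumptions).
iot_feed    = [{"patient": "p-001", "steps": 8500}]
surveys     = [{"patient": "p-001", "smoker": False}]
lab_results = [{"patient": "p-001", "hba1c": 5.4}]

# Harmonize: fold every source into a single record per patient.
repository = defaultdict(dict)
for source in (iot_feed, surveys, lab_results):
    for row in source:
        pid = row["patient"]
        repository[pid].update({k: v for k, v in row.items() if k != "patient"})

print(repository["p-001"])  # {'steps': 8500, 'smoker': False, 'hba1c': 5.4}
```

Real-world harmonization also has to reconcile conflicting codes, units, and identifiers across sources, which is where the converged integration-and-management platform described above earns its keep.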
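The anonymization approach described in the privacy point can also be sketched simply. This is one illustrative technique, not a complete de-identification standard: direct identifiers are dropped and the patient ID is replaced with a salted hash, so records from different sources can still be linked for research without exposing who the patient is. The field names and salt handling here are assumptions.

```python
import hashlib

SALT = b"example-salt"  # in practice, a secret managed by the platform

def anonymize(record: dict) -> dict:
    # Replace the real patient ID with a salted-hash token, and drop
    # direct identifiers before the record is shared for research.
    token = hashlib.sha256(SALT + record["patient_id"].encode()).hexdigest()[:12]
    return {
        "patient_token": token,
        **{k: v for k, v in record.items()
           if k not in ("patient_id", "name", "address", "birth_date")},
    }

shared = anonymize({"patient_id": "p-001", "name": "Jane Doe",
                    "birth_date": "1970-01-01", "hba1c": 5.4})
print(shared)  # patient_token plus clinical fields only; no name or birth date
```

Production systems would layer stronger safeguards on top (for example, the HIPAA Safe Harbor or expert-determination methods), but the principle is the same: research utility without personal exposure.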
The President’s Precision Medicine Initiative is a bold step in the right direction to better serve at-risk populations. But, just as health concerns and treatment options are guaranteed to evolve in the future, so will the ways in which we research and discover them.
No one has a crystal ball to predict what health challenges or new questions will arise 20, 10, or even five years down the road. So we must prepare now with a data model that enables complete flexibility and harmonization, to avoid boxing ourselves into a model that we’ll need to rip out and replace before we realize the full potential of precision medicine.
We don’t know what questions we will need to ask. But we do have the tools and know-how to prepare to ask them in the most streamlined, efficient, and effective way so that access to data is never again an obstacle to life-saving medicine.
Gary Palgon is vice president of healthcare and life sciences solutions at Liaison Technologies. In this role, Gary leverages more than two decades of product management, sales, and marketing experience to develop and expand Liaison’s data-inspired solutions for the healthcare and life sciences verticals. Gary’s unique blend of expertise bridges the gap between the technical and business aspects of healthcare, data security, and electronic commerce. As a respected thought leader in the healthcare IT industry, Gary has had numerous articles published, is a frequent speaker at conferences, and often serves as a knowledgeable resource for analysts and journalists. His prolific and consistent contributions to the advancement of healthcare IT earned him a prominent spot on the crowdsourced #HIT100 list in 2012 and 2013. Gary is a Certified Information Systems Security Professional (CISSP®) and holds a Bachelor of Science degree in Computer and Information Sciences from the University of Florida.
Subscribe to Data Informed for the latest information and news on big data and analytics for the enterprise.