When more than 37,000 health IT professionals get together under one big roof, you expect to hear a lot of buzzwords. Interoperability, patient engagement, and population health were plastered on every booth in the Orange County Convention Center, with companies large and small showing off innovative solutions meant to push healthcare along its inevitable trajectory toward patient-centered, quality care. While the vast majority of these offerings relied upon, or tried to build, an integrated model of insights drawn from the enormous wealth of data providers collect every day, there was one term no one seemed to want to mention: big data.
In fact, during three days of interviews with top executives who have extensive experience with analytics, clinical decision support, population health management, and documentation improvement, it made only a single appearance, just after the 28-minute mark of a discussion about cognitive computing with Steve Gold, Vice President of the IBM Watson group.
So why aren’t people talking about big data anymore? Because they’ve accepted that all healthcare data is big data, and it’s only going to get bigger. “I heard a statistic that says if you’re an epidemiologist, it would take 176 hours a week to keep up with all the new science and information out there,” Gold said. “Now, that’s just not going to happen. Even if that stat is off by an order of magnitude, there’s just no way. It’s not humanly possible.”
A supercomputer like Watson certainly has the capability to read, digest, and process such an enormous volume of data, and Gold believes the future of analytics lies in the transition from programmatic computing to cognitive machine learning and natural language processing. But not every provider has reached the point of harnessing such sophisticated technology. Many are still struggling to identify their primary data streams and tidy them up enough to produce meaningful reporting and actionable insights.
Instead of focusing on big data, providers are interested in tackling the cumbersome process of shaping their information to make it small, lean, and smart. “There are small populations that consume a huge amount of resources,” said Steve Fanning, Vice President of Healthcare Industry Strategy at Infor. “We’re talking about 5% of patients making up 40% of all spending. Targeted programs that engage with those people will become very important.”
Most people would call that “population health management,” but that’s almost as nebulous a term as “big data,” Fanning says. “You see the words ‘population health’ on nearly every booth here at HIMSS, but every one of them has a different opinion on what it is. Infrastructure is critical, and analytics are critical, but what’s not there yet is the ability to engage patients and the ability to manage financial risk. That’s a missing ingredient from the population health approach.”
According to Dan Riskin, MD, CEO of HealthFidelity, that lack of a clear consensus among stakeholders is what prevents providers from using data as a tool to produce overall value. A number of vendors have started to build momentum with analytics offerings, and providers are engaging with commercial companies and researchers to turn data of all sizes into better outcomes through pilots and studies. But these small forays into true population health management have not yet reached the point of achieving the ultimate goal.
“I think there’s a lot of experimentation, which is good,” he said. “I think vendors are bringing healthcare organizations into this experimentation, which is also good. We need to figure out what will save money and what won’t, and what will improve outcomes and what won’t. But there’s a lack of focus on value. Is this about just trying to push your revenue, or is it about creating value in the system? There’s a certain amount of ROI when it comes to boosting revenue internally, but that isn’t going to change things long-term. If the focus is reducing costs while keeping quality high, that’s a sustainable strategy.”