Developing Quality Measures to Succeed in Value-Based Care

Access to high-quality, reliable data is crucial to the evolution of care quality measurement in value-based care.


By Kyle Murphy, PhD

The ability of care quality measures to paint an accurate picture of the patient care experience depends on the availability of reliable data, yet reliable data remains a persistent challenge for providers participating in value-based care models. Not surprisingly, a lack of health IT interoperability is a major source of frustration.

Quality measures are a critical component of pay-for-performance agreements, serving as the mechanisms for quantifying the quality of care a provider delivers and ultimately determining payment. Because these measures assign a quantity to quality, healthcare organizations must ensure they possess the data necessary for reporting on performance as part of risk-based agreements with public and commercial payers.

According to leadership at the National Quality Forum, these organizations need to make an honest assessment of their data and its sources to ensure that reported measures present an accurate reflection of a given practice.

“All the measures in the world aren’t going to matter if you have horrible data,” NQF Senior Director Jason Goldwater told EHRIntelligence.com. “It’s not simply having the ability to report on a certain number of measures. The focus needs to be on where the data is coming from for these measures and how good that data is.”

Given that healthcare is in the midst of the transition from fee-for-service to value-based reimbursement, many organizations still find themselves in the earliest stages of quality measurement. That being said, decisions made today can have a lasting impact if not thoroughly considered. In other words, quality measurement is a marathon, not a sprint.

Jason Goldwater, Senior Director at National Quality Forum, oversees electronic clinical quality measures at NQF (Image source: Xtelligent Media)


NQF recommends beginning by measuring areas where providers already have data clearly showing improvement over a baseline. “That’s the easiest place to start,” said Goldwater.

The next step is to expand to other areas where large collections of data reliably demonstrate improvement, or at least a course toward correcting deficiencies.
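The baseline comparison described above can be sketched as a simple delta between each reporting period's rate and the baseline rate; the figures here are made up for illustration.

```python
def improvement_over_baseline(baseline_rate: float,
                              period_rates: list[float]) -> list[float]:
    """Delta of each reporting period against the baseline rate (illustrative).

    A positive delta indicates improvement over baseline for that period.
    """
    return [round(r - baseline_rate, 3) for r in period_rates]

# Hypothetical: 62% baseline, then three subsequent reporting periods
deltas = improvement_over_baseline(0.62, [0.64, 0.67, 0.71])
```

A consistently positive and growing series of deltas is the kind of clear improvement-over-baseline signal NQF suggests starting with.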

“Not only should quality measurement give you information, but it should also show you the pathway for what you need to do to make corrections if necessary or to continue to be consistent in delivering quality and efficient care that will continually meet those measures,” Goldwater explained.

Quality measurement must do more than simply report on provider performance based on certain criteria; it must also serve an internal purpose by identifying gaps in performance or a lack of the data necessary for satisfying measures.

“That’s how you continue to evolve,” he continued. “It doesn’t come down to just choosing the measures you think are the most appropriate — it comes down to having the reliable data for these measures to give a better indication of quality.”


Eventually, this close assessment of available provider performance data should pave the way toward more advanced quality measures and the ability to succeed in value-based care.

“As you evolve, then the process becomes a question of examining the various data streams that you’ve got and understanding not just the types of data but their impact on quality,” Goldwater added. “If you’re going to choose an advanced quality measure, then you have to be able to look at the data and determine that it ultimately shows differences in improvement over time.”

Access to reliable data plays a pivotal role in successful quality measurement and reporting. It also remains a significant challenge for providers moving forward.

Quality contingent on interoperability

In early September, NQF released a framework for measuring healthcare interoperability to help satisfy a provision of the 21st Century Cures Act. Included among findings from an environmental scan was the observation that the ability to exchange information electronically between various health IT systems has a “significant” effect on both the accuracy of quality measurement and the act of quality reporting itself.

Advancements in interoperability across healthcare will allow for the creation of quality measures capable of capturing the patient’s entire experience across the care continuum rather than individual snapshots in time.


“You can measure care coordination in specific areas — medications, visits, etc. — where the data is available and the systems are fairly interoperable,” Goldwater observed. “But to be able to assess whether complete care coordination has been done, we are a little ways away from a full record of patient information being sent from one provider to the next and updated in real time as the patient is moving from one setting to the next.”


NQF advises stakeholders to consider quality measurements that are “interoperability-sensitive” to determine where data exists to populate measures and provide a “more robust and accurate metric about where quality is now and where it is headed,” noted Goldwater.

One guiding principle of the NQF report on interoperability made clear that the concept extends well beyond EHR-to-EHR information sharing. According to Goldwater, a growing interest in patient-centered outcomes has pointed to the need for considering additional data sources for quality measures.

“Where we’ve evolved to now is that there are ways of getting data that don’t involve a standardized tool,” he maintained. “The explosion of digital health, wearables, sensor-based technologies, and the like have provided a large stream of data that we are beginning to understand how to best use for these patient-based outcome measures. It’s still very early on in this process to get a better understanding of not only the quality of the data but also what that data indicates and how it can be used in a measure.”

Though nascent, these data sources speak to a salient matter in quality measurement concerning tracking metrics continuously rather than episodically.

“Data from these technologies is always updating,” Goldwater said. “It’s not a measure of a snapshot in time. It’s longitudinal and therefore dynamic. You’ll always be able to measure a patient trend over time to see that quality is being met and sustained. That changes the quality improvement dynamic a bit because that’s not the way it’s being done now.”

Many encounters make up a patient’s experience of care, and no single encounter tells the whole story, so quality measurement moving forward must be able to put the big picture into perspective — something that is only possible by connecting all the dots, or rather the data points.

Quality measurement is only as valuable as the data by which a provider’s performance is measured. Limited access to extant and emergent data sources leads to limited insight into the actual experience of care from the perspective of both providers and patients: junk in, junk out. By removing barriers to data access, healthcare organizations and the industry as a whole can evolve to provide an accurate image of care quality and the changes necessary to spur advancement.
