- Pew Charitable Trusts and the Arch Collaborative — a KLAS-affiliated initiative comprising 5,000 providers — recently recommended the EHR Reporting Program include criteria centered on safety-related usability and EHR training, respectively.
This feedback came in response to a request for information (RFI) issued by ONC in August.
ONC requested stakeholder feedback about potential criteria to include in the EHR Reporting Program. In accordance with the 21st Century Cures Act, reporting criteria must address EHR security, interoperability, usability and user-centered design, conformance to certification testing, and other categories deemed appropriate to measure the performance of certified EHR technology.
In its letter, Pew recommended reporting criteria focus primarily on testing EHR usability to promote patient safety.
“Usability-related safety problems can emerge due to confusing interfaces to complete tasks, the need to develop workarounds, an overabundance of unnecessary alerts, and many other issues given the central role that EHRs increasingly have in helping clinicians order procedures, review health information, and obtain decision support,” noted Pew.
To develop its RFI response, Pew gathered feedback from health IT experts and identified principles to inform the selection of usability-related reporting criteria, ideas for existing sources of information that could be utilized as safety-related usability reporting criteria, and other factors ONC should consider when developing a reporting program.
Pew identified four principles to guide usability-related reporting criteria. First, Pew suggested ONC adopt a life-cycle approach to developing usability-related criteria.
“The usability of EHRs can change significantly once implemented within healthcare facilities,” wrote Pew. “Initial system design, unique workflows within facilities, interactions with other technologies used within each site, and individual clinician preferences can all affect system usability.”
ONC should ensure the EHR Reporting Program draws on information from all stages of the EHR lifecycle when developing criteria, Pew advised.
Second, Pew suggested ONC incorporate quantitative, measurable data into reporting criteria.
“Given variability in how systems are implemented and used, some reporting criteria may benefit from providing ranges on which data were received,” suggested Pew. “For example, on quantitative criteria, ONC could list the minimums or maximums observed in addition to the mean.”
Pew also recommended ONC limit the administrative burden the program places on end users.
“Clinicians should spend their time caring for patients—not data entry to meet regulatory requirements,” wrote Pew. “Therefore, the reporting program should limit any additional requirements on the end user.”
Finally, Pew suggested the reporting program employ transparent methods that prevent gamesmanship.
“ONC should ensure that the way reporting criteria are calculated or how data are obtained is transparent, so that healthcare providers and EHR developers are able to effectively use findings from the program,” Pew suggested.
“While the methodologies used should be transparent, ONC should also ensure that the reporting criteria are not easily gamed,” cautioned Pew.
Pew also identified several existing data sources ONC could leverage to incorporate safety-related data into the usability portion of the reporting program: the Leapfrog CPOE Tool, safety surveillance data from ONC, the ONC SAFER Guides, and a 2016 health IT safety measure report from NQF.
“As ONC implements this program, the agency should ensure that the usability aspects of the program focus on the facets of EHR usability that can contribute to unintended patient harm,” maintained Pew.
Meanwhile, the Arch Collaborative emphasized the importance of EHR training in boosting EHR usability and provider satisfaction with health IT use.
In its comments, the collaborative offered ONC four major findings from research incorporating responses from more than 50,000 clinicians.
First, the collaborative noted that EHR satisfaction is directly related to the EHR training clinicians receive when first introduced to a new EHR system. The collaborative also pointed out that organizations that focus on training to support clinician workflows have higher rates of EHR satisfaction than those that don’t.
Finally, members stressed that ongoing EHR education is critical to helping clinicians improve their competency with EHR use, and that higher levels of personalization tool use among clinicians translates to higher EHR satisfaction scores.
“EHRs are not simple enough to be operated efficiently without ample instruction,” maintained members of the collaborative. “It is essential that new providers spend enough time learning how to use the EHR, and it is requisite that providers have the option to participate in ongoing training each year.”
“The structure of training matters less than the availability and quality of the education,” members added. “It is also quite helpful to teach providers how the EHR should be used in their specialty.”
The collaborative suggested clinicians could access ongoing training through online content, CME credit, distraction-free, offsite classes, or at-elbow support onsite.
“A trend that has been noted is that success begets success; when providers share how EHR training has improved their efficiency, their peers become more likely to participate,” wrote members. “The key is that the providers must have the option to choose what works for them.”
The public comment period for the EHR Reporting Program closes today at 5 pm.