Providers are looking for a way to accurately assess how well patients comprehend the technical language used in EHR documentation, as a gauge of how well patients understand their own health conditions.
A recent study by Zheng et al. compared readability formula scores for EHR notes against patients' comprehension of their own EHRs to determine whether current readability formulas accurately measure how challenging patients actually find electronic provider notes.
While allowing patients access to their own EHRs is one way providers have attempted to boost patient engagement and enhance patient-centered care, technical medical language can be an obstacle for patients trying to understand their clinical conditions and treatments.
Using 140 Wikipedia articles and 242 EHR notes on diabetes, the study asked 15 laypeople to read the documents and rate their difficulty. The study then applied three readability formulas, the Flesch-Kincaid Grade Level (FKGL), the Simple Measure of Gobbledygook (SMOG), and the Gunning-Fog Index (GFI), to assess the general difficulty of the language used in the EHR notes. The difficulty levels produced by these formulas were then compared with readers' perceived difficulty of the EHR language.
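All three formulas are simple functions of word, sentence, and syllable (or "complex word") counts, which is part of why they struggle with clinical text. A minimal sketch of the standard formulas in Python follows; the vowel-group syllable counter is a rough heuristic (an assumption of this sketch, since production tools rely on pronunciation dictionaries or trained syllabifiers):

```python
import math
import re

def count_syllables(word):
    """Approximate syllable count via vowel groups (heuristic only)."""
    word = word.lower()
    count = len(re.findall(r"[aeiouy]+", word))
    # Crude silent-'e' adjustment, e.g. "notice" -> 2 rather than 3.
    if word.endswith("e") and count > 1 and not word.endswith(("le", "ee")):
        count -= 1
    return max(count, 1)

def readability_scores(text):
    """Return FKGL, SMOG, and GFI scores for a passage of text."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = [count_syllables(w) for w in words]
    n_sent, n_words = len(sentences), len(words)
    n_syll = sum(syllables)
    # Both SMOG and GFI treat words of 3+ syllables as "complex".
    polysyllables = sum(1 for s in syllables if s >= 3)

    fkgl = 0.39 * (n_words / n_sent) + 11.8 * (n_syll / n_words) - 15.59
    smog = 1.043 * math.sqrt(polysyllables * (30 / n_sent)) + 3.1291
    gfi = 0.4 * ((n_words / n_sent) + 100 * (polysyllables / n_words))
    return {"FKGL": fkgl, "SMOG": smog, "GFI": gfi}
```

Because every term is a surface count, a clinical sentence dense with short but conceptually difficult jargon can score close to ordinary prose, which is consistent with the mismatch the study reports.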
“EHR notes are foremost a tool for physician communication, and a large portion of them are not originally written with easy patient comprehension in mind,” the authors wrote. “However, they are shown to be beneficial to the patients. As more institutions allow patients access to their own EHR records, patients are also interested in reading them. Research has shown that patients may need help in understanding them. An accurate readability metric for the EHR notes can encourage physicians to write notes in a simpler language. It may make patient portals more useful.”
Researchers determined that these three readability formulas do not accurately assess EHR readability, as their scores did not align with readers' perceived difficulty. While the formulas produced results similar to one another, reader perceptions of EHR readability differed significantly from those scores.
Through their study, researchers concluded that a better readability metric would help demonstrate to physicians the need to simplify their notes and make the information more accessible to patients.
“A better metric should incorporate features beyond simple word and sentence length, such as the complexity of the concepts involved in the document,” authors stated.
While improved readability could potentially encourage patient engagement with EHR records, researchers pointed out readability is not the only factor impacting patient use of EHR notes.
“For example, reader interest and motivation have been pointed out in the literature to be a factor contributing to comprehension,” the study noted. “In a more realistic scenario where patients read their own EHR notes, they are likely to be motivated and show interest in knowing their own health conditions.”
With recent studies showing that patient-facing EHR technology granting patients access to their own notes could contribute to improved health outcomes, researchers are eager to develop a more useful readability formula that can point providers toward ways of ensuring their EHR notes are useful to interested patients.
“We plan to develop new methods that can better capture the readability of complex technical documents so that both health care providers and patients can benefit from focusing first on EHR notes that are at an appropriate difficulty level,” the researchers stated, describing plans for future work on EHR readability.
Improving patient literacy is one of the many ways in which providers hope to boost patient engagement in EHR technology and ensure EHRs fulfill their potential as agents of optimized health outcomes.