The ubiquity of EHR technology in healthcare has done little to improve its reputation among its users.
Despite efforts from federal regulators, industry associations, and healthcare leaders, providers across the care continuum still report high levels of frustration, stress, and burnout from interacting with EHR tools on a daily basis.
Many industry leaders place the onus on health IT developers and EHR vendors to improve EHR usability and create new technologies that ease the burden of clinical documentation - and to their credit, vendors are working diligently to simplify the EHR interface and develop user-friendly products.
But cutting down on clicks only solves some of the usability problem. Getting data out of the EHR is just as difficult as putting data in.
Patient health information is often buried in lengthy medical records, making it difficult to absorb important data in a timely manner. The time and cognitive effort required to learn everything a provider needs to know about every patient can eat into the very limited window they have to actually speak with the individual and collaborate on care.
Ultimately, clinicians need more than a sleeker interface. They want a fundamentally different way to manage, retrieve, enter, and apply information available in patient EHRs to clinical care.
Artificial intelligence (AI) may be their answer.
Proponents of AI in healthcare believe the technology will offer seamless support for providers during clinical decision-making and administrative tasks, creating more intuitive workflows and innovative ways of interacting with sophisticated technologies.
AI has already been shown to be faster and more accurate at routine, time-consuming tasks - and natural language processing (NLP), a subset of the field, is well suited to clinicians who prefer to dictate their clinical notes rather than type them.
But dictation software is more or less a one-way street. The challenge for AI developers is to create intelligent systems with bi-directional capabilities: virtual assistants that can independently perform background information-gathering work and synthesize those results into a meaningful conversation with providers.
“The key is to not look at AI in isolation, but as part of a workflow where the technology is quietly running in the background and actually assists physicians where it can be helpful,” explained Allen Hsiao, MD, CMIO at Yale New Haven Health. “Instead of interrupting or alerting, it’s all about listening and reminding and making it easier to do the right thing.”
Virtual assistants could be the answer that fed-up providers are looking for.
Ideally, these tools will be able to dig through EHR data to present important nuggets of information, place phone calls and prescription orders, take notes, and help users navigate the EHR system itself.
Several well-known EHR vendors, including athenahealth, Epic, and eClinicalWorks, are in the process of launching integrated virtual assistants that will be commercially available to providers in the near future.
For now, however, virtual assistants are far from commonplace. The tools are predominantly being used by small groups of clinicians participating in limited pilot programs as health systems test the opportunities and limitations of the strategy.
Virtual assistants bring plenty of advantages as well as challenges, said leaders from Yale New Haven Health, Beth Israel Deaconess Medical Center in Boston, and Vanderbilt University Medical Center in Nashville.
All three health systems are exploring whether virtual helpers can improve the provider experience and reduce physicians’ complaints about the drawbacks of EHR use.
The need to address the problem of user burnout is more important than ever, leaders from the health systems stressed. But creating a tool accurate and useful enough to convince providers of its promise requires significant investment, creativity, and commitment.
Testing virtual assistants as an EHR usability solution
The demands of EHR data entry have effectively divided clinicians’ time and attention in half, according to a 2017 study by the American Medical Association (AMA) and the University of Wisconsin.
Nearly six hours of a primary care physician’s typical 11.4-hour workday is devoted to EHR use.
And clinical documentation, order entry, billing and coding, inbox management, and security procedures often tether providers to their monitors for up to an hour and a half after leaving the office.
Clinicians at Yale New Haven Health (YNHH) were experiencing similar challenges, and leaders from the health system were eager to change that.
YNHH began testing virtual assistants as digital scribes in an attempt to free up clinicians’ time and redirect attention to fostering a healthy rapport between patients and providers.
“We are always looking for ways to improve the experience of our patients and clinicians, and we’re sensitive to the fact that in addition to the benefits of EHR adoption, technology can create competing demands on physician time and attention,” said Lisa Stump, Senior Vice President and CIO of Yale New Haven Health and Yale School of Medicine.
“We are very attentive to mitigating this burden and have taken a multi-modal approach to give physicians time back — for their own well-being and for patient care.”
Yale partnered with health IT developer M*Modal to develop a virtual assistant that integrates directly into the health system’s Epic EHR system.
“The scribes help our doctors with several aspects of the patient encounter,” explained Stump. “Documentation and navigation, queuing up orders for review, pulling up relevant patient information, and other tasks.”
So far, clinicians have been able to devote significantly more time to patient care, since less of their workday is spent on clinical documentation and order entry.
“For some of our doctors, having a virtual scribe allows them to see as many as six more patients a day with little to no added time,” said Hsiao.
“Virtual scribes have significantly improved our clinician satisfaction by enabling more patient-physician interaction during patient visits and little to no after-hours time spent catching up on documentation,” Hsiao added.
At Beth Israel Deaconess Medical Center (BIDMC), the same pressures were building up as clinicians struggled under data entry burdens.
BIDMC first worked to reduce data entry for providers by launching Nuance’s speech-recognition software, Dragon Medical.
“We are a big user of voice dictation,” said Manu Tandon, CIO at BIDMC. “Our providers find that very useful.”
The health system then implemented Nuance’s virtual assistant and trained the system to automatically develop clinical notes that can then be shared with patients at the end of their consult.
“Doctors are dictating the note and the patient walks away with a copy of that,” said Tandon.
Yaa Kumah-Crystal, MD, assistant professor of Biomedical Informatics and Pediatric Endocrinology at Vanderbilt University Medical Center (VUMC), said her health system is also in the process of testing a virtual assistant prototype from Nuance that could brief providers on key patient information before patient encounters.
“The main use case that we're building towards right now with our own voice assistant is preparation before you're going to see the patient,” said Kumah-Crystal. “This is a patient that you've probably seen already, so they have information in the EHR that's being communicated back to you.”
The virtual assistant integrates directly into VUMC’s Epic EHR system to automatically pull relevant clinical information from the health records of scheduled patients.
“You're in a room — in a workroom or some kind of private area where it's allowable for PHI to be spoken back out loud to you — and you're just gathering information to build a picture in your mind about who the patient is and what the next steps in their care will be,” said Kumah-Crystal.
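To make the shape of such a pre-visit briefing concrete, here is a minimal sketch in Python. It assumes FHIR-style Patient, Condition, and MedicationStatement resources have already been retrieved from the EHR; the `build_briefing` helper and the sample resource contents are purely illustrative, not VUMC’s or Epic’s actual integration:

```python
def build_briefing(patient, conditions, medications):
    """Condense FHIR-style resource dicts into a short summary a voice
    assistant could read back to a clinician before a visit."""
    name = patient["name"][0]
    full_name = " ".join(name["given"]) + " " + name["family"]
    # Keep only problems whose clinical status is marked active.
    active = [c["code"]["text"] for c in conditions
              if c["clinicalStatus"]["coding"][0]["code"] == "active"]
    meds = [m["medicationCodeableConcept"]["text"] for m in medications]
    parts = [f"Next patient: {full_name}."]
    if active:
        parts.append("Active problems: " + ", ".join(active) + ".")
    if meds:
        parts.append("Current medications: " + ", ".join(meds) + ".")
    return " ".join(parts)

# Illustrative resources shaped like FHIR R4 JSON (heavily simplified).
patient = {"name": [{"given": ["Jane"], "family": "Doe"}]}
conditions = [{"code": {"text": "Type 1 diabetes"},
               "clinicalStatus": {"coding": [{"code": "active"}]}}]
medications = [{"medicationCodeableConcept": {"text": "insulin glargine"}}]

briefing = build_briefing(patient, conditions, medications)
# -> "Next patient: Jane Doe. Active problems: Type 1 diabetes.
#     Current medications: insulin glargine."
```

In a real deployment the summary would also be filtered by who is asking and where, and spoken through a text-to-speech engine rather than returned as a string.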
Convincing providers that the cure isn’t worse than the disease
Most users of modern AI technology - Siri and Alexa, for example - have had the experience of swinging wildly between sincere admiration and inexplicably intense anger as they push the limitations of these artificially intelligent personalities.
In some cases, a task that seems simple to a human may be prohibitively complex for these tools, which may not have access to the data or computational skills required to return a useful answer. In others, a unique accent, odd phrasing, or mumbled word will cause consternation, leaving the user repeating the request in increasingly shrill tones.
In the healthcare setting, these lapses aren’t just annoying: they can be dangerous. Ordering the wrong drug or failing to correctly document a diagnosis brings significant patient safety implications, not to mention legal liability concerns.
Advocates of virtual assistants have received some pushback from clinicians who find the technology frustrating to use and who are hesitant to integrate the tools into their daily routines.
“The technology has to evolve a lot before it can get to the point that it is accepted as ubiquitously as we all want it to,” agreed Tandon.
“If they are error-prone or are not correct in differentiating the medical terminology, that poses a risk.”
This potential problem with accuracy has led some clinicians to believe the technology may be more trouble than it’s worth.
Kumah-Crystal and her team at VUMC gathered clinician feedback about virtual assistants to gauge provider interest in actually utilizing the tools in day-to-day operations, and found that many clinicians quickly become irritated with the technology.
“Mainly, people are frustrated that it doesn't understand them well or it doesn't do the complex things they want them to do,” Kumah-Crystal said.
“You can set reminders very easily or set an alarm, or maybe even add an event to the calendar,” explained Kumah-Crystal. “But if you want to add a very specific thing to your calendar and it's just not going to understand you, it's easier to do that manually. Many people think that it’s more frustrating to try to say it the right way than it is to just do the task themselves.”
Poorly handled implementations can also negatively affect provider perceptions of new technologies.
“Technology is not always perfect and, as we have seen with the EHR, well-intended and intensively-designed tools can become a burden depending on how they are implemented,” said Hsiao.
Clinicians involved in developing VUMC’s virtual assistant are keenly aware of this.
Kumah-Crystal and her team have engaged clinicians in the health IT development process by conducting interviews with providers to identify their pain points and remediate existing problems before making the tool generally available to all providers.
Through these interviews, Kumah-Crystal learned virtual assistants will need to be capable of clearly understanding clinicians and responding in a way that mirrors human speech as closely as possible.
“We just recently reached the inflection point where you can speak more naturally,” she said. “We’re trying to figure out how to make the interactions as sweeping and reasonable for people as possible while making the voice models better themselves.”
Training virtual assistants to understand the fluidity and cadence of human speech requires an intimate understanding of human interaction.
“At Vanderbilt, we’re a teaching hospital, so we have the dynamic of residents and trainees and attending physicians,” said Kumah-Crystal. “We’re seeing the way they communicate with each other and observing how a resident summarizes information and describes it back to providers when they’re giving them a report.”
“We’re using that as a model to say that this is how we expect someone to communicate back when asked for information.”
By continually studying clinicians’ interactions during care delivery and gathering feedback, VUMC is working to create a virtual assistant that fits seamlessly into the existing hospital culture.
Providers at YNHH similarly emphasized the importance of molding virtual assistants to suit users rather than requiring users to change their daily processes to accommodate the new technology.
Creating virtual assistants that adapt to human patterns, not the other way around, will require ongoing tweaks and changes from the tool’s developer, Hsiao noted. As a result, organizations will need to develop strong, ongoing relationships with their vendors to limit disruptions and allow for a speedy optimization process.
“For us to get leadership buy-in to even pilot virtual scribes, our strategic partnership with our vendor was key,” said Hsiao. “Moving along incrementally, our goal is to smoothly transition to an increasingly more ambient and automated documentation experience.”
“Our vendor is helping to take us from where we are to our desired goal,” said Hsiao.
The future of virtual assistants
Preparing virtual assistants for industry-wide use in clinical care will take time and effort. But according to healthcare leaders, the payoff will be well worth the wait.
“I see virtual assistants playing a support role,” said Tandon. “They could help with triaging; they could help with access in some cases. I certainly don’t see them replacing the core of the physician function, but they can be a supporting actor.”
Tandon is looking forward to leveraging virtual assistants to complete more complex tasks for providers in the near future.
“We want to figure out how we take the ability to listen in the room, which is what some of the tools like Nuance or Alexa can do, and turn that into some actionable items,” Tandon said.
“For example, if the doctor has a patient and they want to order a lab, could that lab get ordered in the background? When it’s time to do the billing, could the virtual assistant assemble the cost of that for the doctor to review?”
Stump and her team at Yale also believe AI will be most effective for clinicians if health IT developers focus on building toward this vision of a silent, intuitive, subtle virtual assistant.
“Our goal is to leverage virtual assistants so that clinical documentation is a passively created by-product of the patient-physician encounter and any gaps in care are closed right at the onset,” she said.
While the technology still has some way to go before it’s primed for widespread use, Yale leadership believes there is a benefit to embracing new tools early on in their development.
“Yale New Haven Health has always been an early adopter of innovative solutions that can help us better serve our patients and physicians, and it is no different with virtual assistant technology,” said Hsiao.
“Speech, natural language processing, and other AI technology has reached a point where virtual assistants in healthcare – where the stakes and complexity are much higher than in consumer applications – can be a reality,” he added.
“It’s a win-win situation: our doctors are happier and more productive, our patients and referring physicians are also happier as they feel more engaged with more timely and comprehensive notes and can better understand the thinking of the physician.”
At VUMC, clinicians working to develop virtual assistants have similar goals, but also hope to extend the tool’s capabilities beyond the medical center’s walls.
Enabling clinicians to access and activate EHR-integrated virtual assistants anywhere and anytime gives users the freedom to work the technology into their busy schedules.
“The next step for us will be the concept of the commute summary,” said Kumah-Crystal. “You know that you have a schedule of ten patients for the day, and you know that you have a 30-minute commute to work. Why can’t you use that time to find out about your patients or what your day looks like?”
No matter when or where clinicians use the technology, Kumah-Crystal maintained that its success as a tool depends on its ability to adapt to the unique needs of different users.
“We're focused right now on getting information out of the EHR in a really smart, contextual way so that when somebody asks for information, what we give them depends on who they are, who the patient is, maybe the clinic they're in, all those pieces of context.”
Regardless of application, Kumah-Crystal is certain virtual assistants will someday be an integral component of care delivery.
“It's not just a passing fancy,” emphasized Kumah-Crystal. “There’s the hype-cycle component of it—people saying that voice is going to change the world. And it will, but probably not today or next week. But it’s the next-gen interface where our kids' kids are going to say, ‘what do you mean you couldn't talk to your computers before?’”
This article was originally published on December 17, 2018.