Hospitals are turning to technology to bridge the communication gap

Healthcare organizations are using digital health technologies to help doctors and nurses communicate with patients who speak languages other than English or who face other communication barriers.

With more than 800 languages spoken in the New York City area, communication challenges are a daily reality. And nowhere are they more dangerous than in a healthcare facility, where a mistranslation can affect clinical outcomes.

Healthcare organizations are turning to technology to address this challenge, with partnerships and digital healthcare platforms that enable care teams to access interpreters in real time.

“We’re dealing with a melting pot of diversity,” said Kerry Donohue, MSN, RN, patient experience manager and cultural leader at Manhattan Eye, Ear, and Throat Hospital (MEETH), a division of Northwell Health’s Lenox Hill Hospital. “Every day, I would say every fifth patient [speaks a language other than English], and it can be challenging.”

When confronted with a patient speaking Farsi, Romansh, Mandarin, or any other language, the traditional tactic was to look for a multilingual family member, grab the nearest staff member who happened to speak the language (at least, that’s how it worked on St. Elsewhere), or pick up the phone, call the hospital’s translation service, and hope it had someone available who knew that language.

Digital health technology has made that process easier. Care teams can now use a smartphone or tablet to connect with an interpreter in real time through an mHealth app, even by video, on a platform that specializes in translation services. MEETH, for instance, uses LanguageLine services on tablets provided by Equiva Health, a digital health patient engagement company based in New York.
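For a sense of how such an on-demand connection works under the hood, here is a minimal sketch of the request flow in TypeScript. The endpoint, payload shape, and names are hypothetical illustrations of this kind of service; they do not reflect LanguageLine’s or Equiva’s actual APIs.

```typescript
// Hypothetical sketch only: the endpoint, payload shape, and type names
// below are illustrative and do not reflect LanguageLine's or Equiva's
// actual APIs.

interface InterpreterSession {
  sessionId: string;
  videoRoomUrl: string;        // URL the tablet opens to join the video call
  interpreterLanguage: string;
}

// Request an on-demand interpreter for a given language and modality.
async function requestInterpreter(
  languageCode: string,              // e.g. "fa" (Farsi) or "zh" (Mandarin)
  modality: "video" | "audio",
): Promise<InterpreterSession> {
  const response = await fetch("https://interpreting.example.com/v1/sessions", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ languageCode, modality }),
  });
  if (!response.ok) {
    throw new Error(`Interpreter request failed: ${response.status}`);
  }
  return (await response.json()) as InterpreterSession;
}

// Usage: a nurse taps "Mandarin, video" on the bedside tablet.
requestInterpreter("zh", "video")
  .then((session) => console.log(`Join the call at ${session.videoRoomUrl}`))
  .catch((err) => console.error(err));
```

In practice the heavy lifting, matching an available interpreter and hosting the video call, happens on the platform’s side; the tablet app only has to request a session and open the resulting video link.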

“It’s like FaceTime,” says Donohue. “You’re connected with someone who knows the language.”

Making sure patient and provider are speaking the same language is critical in healthcare, and it goes far beyond patient engagement. Doctors and nurses not only need to know exactly what happened and how a patient is feeling; they also need to know that their questions, diagnoses, and care plans are understood. Something lost in translation could mean a missed symptom that signals a more serious health issue, or a misunderstood prescription or treatment plan that makes things worse, even fatally.

“It’s really not best practice to use a fellow clinician or a family member as a translator,” Donohue says, noting that a trained medical interpreter can pick up nuances in both language and clinical terminology that others might miss. A dedicated interpreter service also means providers don’t have to pull colleagues away from their own work to translate, interrupting workflows and affecting patient care.

The language barrier isn’t just in New York City, either. From Maine to Hawaii, in communities and healthcare sites large and small, the chance of coming across someone who speaks a different language—and who may not speak English at all—has grown. And with the advent of telehealth, more hospitals are engaging in virtual care with patients and other providers in different parts of the world.

In Boston, Brigham & Women’s Hospital is testing a device-agnostic website and app called Cardmedic, designed to tackle both language and communication barriers, including visual, hearing and cognitive impairment.

“You need as many tools as you can get to help communicate with patients,” says Andrew Marshall, MD, an emergency medicine physician. “Clinical questions don’t always fit well into a box, and interpreters aren’t always available.”

Marshall sees the technology addressing a key social determinant of health that affects care for a wide array of underserved populations. Someone who is uncomfortable talking to a care provider who speaks a different language, or who has trouble communicating, might delay a visit to a clinic or hospital or skip it altogether. Or that person might leave the hospital or doctor’s office with lingering questions about what was said.

“Brigham & Women’s has a robust interpreter service, but you need to make sure” that every word is understood correctly, he says. That might mean using sign language, or providing visual cues or a vocabulary for someone with cognitive issues.

“God forbid you end up having to use Google Translate” to explain the intricacies of diabetes or a heart condition, he adds.

Equiva and Cardmedic are part of a wave of innovative ideas aimed at tackling communication barriers in healthcare. Aside from apps and websites that handle interpretation, there’s ongoing research into natural language processing (NLP) and voice-activated technology: imagine Alexa handling these tasks in an ER or doctor’s office. Other ideas include robots, avatars, and wearables, even smart glasses and hearing aids, that can translate on the spot.
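To make that concrete, a voice-activated translation flow like the ones researchers are exploring typically chains three stages: speech recognition, machine translation, and speech synthesis. The sketch below shows that pipeline in TypeScript with hypothetical placeholder stages; it is a conceptual illustration, not any vendor’s product.

```typescript
// Conceptual pipeline only: the three stage types are hypothetical
// placeholders, not any vendor's actual product or API.

type SpeechToText = (audio: ArrayBuffer, lang: string) => Promise<string>;
type Translate = (text: string, from: string, to: string) => Promise<string>;
type TextToSpeech = (text: string, lang: string) => Promise<ArrayBuffer>;

// Take one spoken utterance from the patient and return translated
// audio for the clinician: recognize, translate, then synthesize.
async function translateUtterance(
  audio: ArrayBuffer,
  patientLang: string,
  clinicianLang: string,
  stt: SpeechToText,
  mt: Translate,
  tts: TextToSpeech,
): Promise<ArrayBuffer> {
  const transcript = await stt(audio, patientLang);                    // speech -> text
  const translated = await mt(transcript, patientLang, clinicianLang); // text -> text
  return tts(translated, clinicianLang);                               // text -> speech
}
```

Each stage is where the clinical risk lives: a recognition or translation error anywhere in the chain is exactly the kind of “lost in translation” problem trained medical interpreters are there to catch.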

“At the end of the day you’re making physicians into better physicians,” says Marshall.

Eric Wicklund is the Innovation and Technology Editor for HealthLeaders.
