AI has become increasingly helpful in certain areas of medicine - for instance, some programs assist with tasks like scheduling and billing. So why not also use it to translate standard documents like patient consent forms?
As advanced as some translation AI is, we’re not there yet.
To be absolutely certain that a patient understands and consents to treatment, healthcare providers have to explain the treatment clearly and be prepared to answer any questions. Whether the information appears on a form or is discussed orally, complex vocabulary and phrasing are often required.
For instance, when explaining a cancer treatment or comparing several options, a doctor may need to refer to things like a patient’s history, statistics and findings about the treatment(s) being discussed, side effects, and details about how a healthcare facility would handle post-treatment care. This can be complex even when the doctor and patient share the same native language and culture. Translating this information with AI could introduce basic translation errors, and AI’s lack of cultural adaptation could cause further miscommunication between patients and doctors.
It’s common to find reports of AI mistranslating medical terms, especially ones that are lesser-known or resemble another word. For instance, AI may struggle with medication names. In a recent example, a ‘bot translated a doctor’s instructions to stop taking Coumadin, a blood thinner, into Chinese as “Do not take anymore soybean.”
Even simple instructions that don’t involve proper names can be risky, sometimes simply because of the language in question. As we discussed in a previous article, translation AI’s accuracy rates vary widely among different languages.
Maybe the most challenging issue, though, doesn’t come from words themselves, but from their connotations and the culture of those who use them.
Different cultures perceive illness, conversations about illness, the role of family in decision-making, and treatment options differently. For instance, in some Asian cultures, mental illness carries shame and is seen as a disgrace to the larger family group. Doctors must take this into account when discussing these conditions with patients and family members.
Ideally, doctors and other healthcare workers who have a high number of patients from particular cultural backgrounds should do at least basic research to gain insight into their views on illness, medicine, and treatment, all of which could affect understanding and consent.
Using a guide like this one to help facilitate communication with patients from different cultural and linguistic groups would also be helpful.
But none of this is information that could be plugged into a ‘bot.
AI works by “learning” based on algorithms and data input, not by observing or researching human behavior and beliefs. So when it comes to truly communicating with patients on both a linguistic and cultural level, it’s just not up to the task.
Patient consent forms should ideally be transcreated - that is, translated and also adapted to the target culture. This work should be done by a professional translator, not by AI tools like online translation apps. For conversations, a professional medical interpreter can bridge both the linguistic and the cultural levels.
Patient consent is crucial. When it comes to patients with limited or no English proficiency, this kind of communication should never be left in the hands of a ‘bot, no matter how capable it might seem.