According to recent statistics, 49% of US households own a voice assistant, and most people use voice assistants on their phones or TVs.
For many of us, these devices seem like something out of our favorite science fiction stories or movies, allowing us to ask a question and get an answer from a friendly disembodied voice. Like the fictional AI we know and love, voice assistants in our world seem reliable and knowledgeable. This may be part of why some people rely on them for medical information.
Unfortunately, our technology isn’t up to snuff with its sci-fi counterparts. So when it comes to asking a voice assistant for medical information, the results aren’t always the best, and in some cases might even be dangerous.
The issue has become an increasing concern for doctors and other medical professionals. In May, a group of gastroenterologists presented the results of a small study they’d performed after realizing that their patients in their 30s and 40s often use voice assistants to answer medical questions.
They asked four popular voice assistants, Siri, Google Assistant, Alexa, and Cortana, the five questions they hear most often in their clinic.
The results were surprising. For instance, while both Siri and Google Assistant answered correctly and directed users to accredited medical websites, Google Assistant also placed advertisements at the top of its results. This could mean patients are getting inaccurate or biased information, even about serious health conditions.
Alexa answered only three questions correctly, and Cortana only two. For the doctors, though, Google Assistant’s ad placement was even more concerning. They suggest changes to programming, or even legislation, that could address it. Another programming change they propose is for search results to conclude with “Talk to your doctor.” But how easy would these changes be to implement?
This was just a small study, with questions related to a particular type of health concern, but these doctors aren’t the only medical professionals who are concerned and curious.
In January, Stanford Medicine’s Scope blog reported on a study in which major voice assistants were asked questions about getting screened for 11 different types of cancer.
One issue the study found is a lack of accessibility in how some voice assistants are programmed to perform searches. Ideally, a voice assistant should give a verbal answer, not just direct users to a website; this makes the internet more accessible to people who are visually impaired or unable to read. Not all of the voice assistants did this, although some went beyond expectations, providing both a verbal answer and a referral to a website.
Still, despite the variety of approaches, voice assistants at best gave correct information only 70% of the time.
The article also cites previous studies on topics like vaccine information before the COVID-19 pandemic, as well as COVID-specific information. None of these studies showed high levels of accuracy from voice assistants, either.
Hopefully, all of these studies will motivate the creators of voice assistants to find ways to improve the way they provide medical information.
In the meantime, people should remember that the voice in their device doesn’t belong to a wise sci-fi robot like Iron Man’s Jarvis. Instead, it’s simply following the rules it’s been programmed to follow for any sort of search, whether that’s for potentially life-saving medical information or for who sang that song that’s stuck in your head. Looking for information on accredited medical websites and, ideally, talking to a doctor are still the best ways to get accurate answers to health-related questions.
Contact Our Writer – Alysa Salzberg