How Can Virtual Assistants Be Trained to Help Substance Abusers?

MedicalResearch.com Interview with:

John W. Ayers, PhD MA
Vice Chief of Innovation | Assoc. Professor
Div. Infectious Disease & Global Public Health
University of California San Diego

MedicalResearch.com: What is the background for this study?

Response: Already half of US adults use smart device-enabled intelligent virtual assistants, like Amazon Alexa. Moreover, many makers of intelligent virtual assistants are poised to roll out health care advice, including personalized wellness strategies. We take a step back and ask: do intelligent virtual assistants provide actionable health support now?

To do so, we focused on a specific case study. One of the dominant health issues of the decade is the nation's ongoing addiction crisis, notably opioids, alcohol, and vaping. That makes it an ideal case for beginning to explore whether intelligent virtual assistants provide actionable answers to obvious health questions.

MedicalResearch.com: What are the main findings?

Response: We asked Amazon Alexa, Apple Siri, Google Assistant, Microsoft Cortana, and Samsung Bixby to “help me quit…” followed by various substances, including alcohol, tobacco, marijuana, and opioids (e.g., “help me quit drinking”). Among seventy different help-seeking queries, the intelligent virtual assistants returned actionable responses only four times; the most common response was confusion (e.g., “did I say something wrong?”). Of the queries that did return a response, “help me quit drugs” on Alexa returned a definition of drugs, “help me quit smoking” and “help me quit tobacco” on Google Assistant returned Dr. QuitNow (a smoking cessation app), and “help me quit pot” on Siri returned a promotion for a marijuana retailer.

MedicalResearch.com: What should readers take away from your report? What can virtual assistants do to provide meaningful help to users who are looking to reduce addictive habits?

Response: While intelligent virtual assistants are largely ignoring substance misuse questions, they have the potential to provide meaningful help. Thanks to free, federally managed remote substance misuse treatment or treatment referral services, like 1-800-662-HELP for alcohol or drugs and 1-800-QUIT-NOW for smoking or vaping, we can encourage people to take the first step toward treatment by having intelligent virtual assistants promote these 1-800 helplines.

The solution is not only obvious, but the makers of intelligent virtual assistants should be able to implement it swiftly. Alexa can already fart on demand; why can't it and other intelligent virtual assistants also provide life-saving substance use treatment referrals to those desperately seeking help? Some of the time companies spend on fart sounds could simply be redirected to substance misuse referrals.
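To illustrate how simple such referral logic could be, here is a minimal sketch, assuming a hypothetical assistant skill that matches "help me quit…" queries against a small keyword table and answers with the helplines named above. The keyword list, function names, and response wording are illustrative assumptions, not part of the study or any vendor's actual implementation.

```python
# Minimal sketch (illustrative only, not from the study) of routing
# substance-misuse help-seeking queries to free, federally managed helplines.

# Hypothetical keyword-to-helpline mapping; keywords and wording are assumptions.
HELPLINES = {
    "drink": "1-800-662-HELP (SAMHSA National Helpline)",
    "alcohol": "1-800-662-HELP (SAMHSA National Helpline)",
    "drug": "1-800-662-HELP (SAMHSA National Helpline)",
    "opioid": "1-800-662-HELP (SAMHSA National Helpline)",
    "smoking": "1-800-QUIT-NOW (smoking cessation quitline)",
    "tobacco": "1-800-QUIT-NOW (smoking cessation quitline)",
    "vaping": "1-800-QUIT-NOW (smoking cessation quitline)",
}


def respond_to_help_seeking(query: str) -> str:
    """Return a helpline referral for 'help me quit ...' style queries."""
    text = query.lower()
    if "help me quit" in text or "help me stop" in text:
        for keyword, helpline in HELPLINES.items():
            if keyword in text:
                return (f"It sounds like you want help quitting. "
                        f"You can call {helpline} for free, confidential support.")
        # Unrecognized substance: still refer to the general helpline.
        return ("You can call 1-800-662-HELP, the free SAMHSA National "
                "Helpline, for treatment referrals.")
    return "Sorry, I didn't understand that."


if __name__ == "__main__":
    print(respond_to_help_seeking("help me quit drinking"))
    print(respond_to_help_seeking("help me quit smoking"))
```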

Our proposed solution is poised to yield tremendous benefits for public health. Only 10% of Americans who need treatment for substance misuse receive it. Because intelligent virtual assistants return a single best answer to a query, they offer a huge advantage in disseminating resources like 1-800 helplines to the public. Updating intelligent virtual assistants to accommodate help-seeking for substance misuse could become a core and immensely successful mission for how tech companies address health in the future.

MedicalResearch.com: What recommendations do you have for future research as a result of this work?

Response: Now is the time for action. Rather than reading another study in a couple of months, we would like to ask Amazon Alexa, Apple Siri, Google Assistant, Microsoft Cortana, and Samsung Bixby for substance use help and see users being redirected to free, federally managed 1-800 helplines.

No disclosures

Citation:

Nobles, A.L., Leas, E.C., Caputi, T.L. et al. Responses to addiction help-seeking from Alexa, Siri, Google Assistant, Cortana, and Bixby intelligent virtual assistants. npj Digit. Med. 3, 11 (2020). https://doi.org/10.1038/s41746-019-0215-9


The information on MedicalResearch.com is provided for educational purposes only, and is in no way intended to diagnose, cure, or treat any medical or other condition. Always seek the advice of your physician or other qualified health provider, and ask your doctor any questions you may have regarding a medical condition. In addition to all other limitations and disclaimers in this agreement, service provider and its third party providers disclaim any liability or loss in connection with the content provided on this website.

 

Last Updated on January 29, 2020 by Marie Benz MD FAAD