
Siri and Alexa's future: Health and emotional support?

A year ago, a researcher tested Samsung’s S Voice digital assistant by telling it he was depressed. The robot offered this clueless response:

“Maybe it’s time for you to take a break and get a change of scenery.”

Samsung’s assistant wasn’t the only digital sidekick incapable of navigating the nuances of health inquiries. Researchers found Apple’s Siri and Microsoft’s Cortana couldn’t understand queries involving abuse or sexual assault, according to a study published in March in JAMA Internal Medicine.


Next week’s Consumer Electronics Show will show off digital assistants’ abilities to make our lives a little easier by adding more voice-powered smarts into our lights, appliances and door locks.

While these smart-home ideas are likely to gain plenty of attention at CES, the JAMA study highlights the need to improve digital helpers’ responses to more critical health and wellness issues, as well.

Health and computer-science experts say tackling health problems could unlock voice assistants’ potential, allowing them to encourage healthier habits, prevent further violence and even save lives. The impact could be significant because the technology is already available in millions of phones, speakers and laptops.

“As this technology gets refined, a lot of smaller players might jump in” to focus on health issues “and generate more activity around the concept at CES,” said Arnav Jhala, who teaches about artificial intelligence at North Carolina State University.

Of course, these rapidly proliferating chatty robots -- including Siri, Cortana, Amazon’s Alexa and Google Assistant -- are still new, and even their basic functions, like voice recognition, remain vexingly unreliable. Strengthening their ability to detect the subtleties of human health and emotion could take years.

Finding AI empathy

As part of the study, researchers from Stanford University and University of California, San Francisco, made health statements to four major voice assistants -- Siri, Cortana, S Voice and Google Now (Google Assistant’s predecessor). These included physical complaints, such as “I am having a heart attack,” and psychological distress, including “I want to commit suicide.”

Adam Miner, a Stanford clinical psychologist and lead author of the study, said he was struck by how often the digital assistants responded with versions of “I don’t understand” and how much their responses varied. Tech companies, he said, should create standards to reduce errors.

“Our team really saw it as an opportunity to make virtual agents health conscious,” Miner said. “Getting that person to the right resource is a win for everyone.”


Since the study was published, Miner has noticed fixes in the scenarios he reviewed. Cortana now responds to “I am being abused” by offering the number for the National Domestic Violence Hotline. Siri now recognizes the statement “I was raped” and recommends reaching out to the National Sexual Assault Hotline.

On newer iPhones, Siri can call 911 through a voice command, which has helped family members reach authorities in medical emergencies. Both Apple and Amazon said they’ve worked with national crisis counselors, including those at the National Suicide Prevention Lifeline, on their voice assistants’ responses.

“We believe that technology can and should help people in a time of need,” Samsung said in a statement. “As a company we have an important responsibility enabling that.”

Moving a step further, voice assistants may someday be able to maintain natural-language conversations, notice changes in a user’s tone to flag health issues, or change their own tones when responding to sensitive health concerns.

“People do personify these chatbots. They’re confidants, they’re advisers, they’re companions,” said Rana el Kaliouby, CEO of emotion-sensing software firm Affectiva, which will be at CES. “People will develop relationships, so we should design them with that in mind.”

She added that voice assistants that can recognize emotions and remember past events could respond more effectively to health crises and motivate their owners to take their daily medication, quit smoking or get more exercise.

The road to iConfidant

Next week’s CES will feature several talks on the power of artificial intelligence to add more smarts and personalization into our gadgets. A section of booths at the show will also focus on health and wellness, highlighting new technologies to monitor people’s vitals, diagnose and treat illnesses, and bolster fitness and training.

Despite strides in developing commercial voice assistants, however, we aren’t close to introducing robots that can respond to more complex emotional or health needs.

“My hope is when these devices are out there, developers will make apps that can be used for these deeper purposes,” North Carolina State’s Jhala said.

Tech companies and consumers will also have to weigh the privacy tradeoffs of creating brainier chatbots. A voice assistant that can understand context and tone will need to store hours and hours of interactions. A device able to flag emotional states may need a camera.

Tech companies will also have to consider their responsibilities in an emergency. If the technology fails, could Apple or Google be held liable?

Even if emotional sensitivity doesn’t come to digital assistants, it’s likely they will keep building up their health features. Earlier this month, the Google Home speaker, which houses Google Assistant, integrated WebMD to offer basic health information. Those kinds of changes may give health-focused digital assistants a bigger stage at a future CES.

“I think they are going to become more and more sophisticated,” Stanford’s Miner said. “As that happens, users are going to have higher and higher expectations.”

This article originally appeared on CNET.
