
Is it time for Alexa and Siri to have a "MeToo moment"?

United Nations -- More people will speak to a voice assistant than to their partners in the next five years, the U.N. says, so it matters what those assistants have to say. The numbers are eye-popping: 85% of Americans use at least one product with artificial intelligence (AI), and global use will reach 1.8 billion by 2021, so the impact of these "robot overlords" is unparalleled.

But AI voice assistants, including Apple's Siri, Amazon's Alexa, Microsoft's Cortana and Google's Assistant, are inflaming gender stereotypes and teaching sexism to a generation of millennials by presenting a model of "docile and eager-to-please helpers" that accept sexual harassment and verbal abuse, a new U.N. study says.

"I'd blush if I could"  

A 145-page report published this week by UNESCO, the U.N.'s educational, scientific and cultural organization, concludes that the voices we speak to are programmed to be submissive and to accept abuse as the norm.

The report is titled, "I'd blush if I could: Closing Gender Divides in Digital Skills Through Education."

The authors say the report takes its name from Siri's programmed response when a user says, "Hey Siri, you're a bi***." That response was revised in April, when the report was distributed in draft form.

The report reveals a pattern of "submissiveness in the face of gender abuse," with inappropriate responses that the authors say have remained largely unchanged in the eight years since the software hit the market.


Dr. Saniye Gülser Corat, director of the Division for Gender Equality at UNESCO, conceived and developed the report along with Norman Schraepel, a policy adviser at the German Agency for International Cooperation, under the EQUALS global partnership for gender equality in the digital age. EQUALS is a non-governmental coalition of corporate leaders, governments, businesses, academic institutions and community groups that partners with several U.N. agencies, including U.N. Women, to promote gender balance in the technology sector for both women and men.

Bias in the code

Both Alexa and Siri, the study says, fuel gender stereotyping: "Siri's 'female' obsequiousness — and the servility expressed by so many other digital assistants projected as young women — provides a powerful illustration of gender biases coded into technology products."

The study blames a number of factors but says programming is the main culprit, and it recommends changing who writes the code: "In the United States, the percentage of female computer and information science majors has dropped steadily over the past 30 years and today stands at just 18 percent, down from 37 percent in the mid-1980s."

"AI is not something mystical or magical. It's something that we produce and it is a reflection of the society that creates it," Gülser Corat told CBS News. "It will have both positive and negative aspects of that society."


Artificial intelligence "really has to reflect all voices in that society. And the voice that is missing in the development of AI at the moment is the voice of women and girls," she said. 

As CBS News sister site CNET reported last year, despite efforts to change the culture, Silicon Valley still suffers from a lack of diversity.

The UNESCO study aims to expose biases and make recommendations to begin "closing a digital skills gap that is, in most parts of the world, wide and growing."

The study makes several recommendations:

  • Stop making digital assistants female by default
  • Discourage gender-based insults and abusive language
  • Develop the advanced technical skills of women and girls so they can get in the game

"It is a 'Me Too' moment," Gülser Corat told CBS News. "We have to make sure that the AI we produce and that we use does pay attention to gender equality."

Docile digital assistants

Gülser Corat noted that female voices are not universal across voice assistants.

The UNESCO study found that a woman's voice was used intentionally for assistants that help with "household products," appliances and services, while driving and GPS services more often use male voices.

The study also showed that voice assistants in some countries, for example France, the Netherlands and some Arab nations, default to a male voice, leading the authors to hypothesize that these countries have "a history of male domestic servants in upper class families."

Examples of AI responses 

The study included samples of AI assistant responses:

[Chart: sample AI assistant responses. UNESCO/Adapted from Quartz]