More than two dozen human rights groups call on Zoom to halt emotion tracking software plans


Twenty-eight human rights organizations penned a letter to Zoom Wednesday, calling on the company to halt any plans it has for emotion tracking software aimed at assessing users' engagement and sentiment. The groups say the technology is discriminatory, manipulative, punitive, a data security risk and based on pseudoscience. 

"Adopting the junk science of emotion detection on the Zoom platform would be a huge mistake," Tracy Rosenberg, of Oakland Privacy, said in a statement Wednesday. "There is zero reliable evidence that a machine can accurately assess someone's emotional state and a lot of evidence that one-size fits-all assumptions about 'normality' don't mirror human diversity and punish out-groups for differences." 

The letter, addressed to Zoom's CEO Eric Yuan, comes in response to an article published last month by the technology publication Protocol, which reported Zoom was developing technology aimed at evaluating a user's sentiment or engagement level.

"Think about this being used at work, and someone misses out on a promotion because the computer says they are not paying attention in a meeting, or if a student is disciplined because they're not paying attention in school," Caitlin Seeley George, campaign director of Fight For the Future, told CBS News.

According to Zoom, the system, called Q for Sales, would track users' talk-time ratio, response-time lag, and frequency of speaker changes to gauge how engaged a person is. Based on those factors, Zoom would assign a score between zero and 100, with higher scores indicating greater engagement or sentiment. 
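The reporting describes the inputs and the zero-to-100 output, but not how the signals are combined. Purely as an illustration, and not Zoom's actual method, a score like that could be produced by normalizing each signal and averaging them; every function name, weight, and threshold in the sketch below is an assumption.

```python
# Hypothetical illustration only -- not Zoom's actual implementation.
# Combines the three signals described in the reporting (talk-time ratio,
# response-time lag, speaker-change frequency) into a 0-100 score.

def engagement_score(talk_time_ratio, avg_response_lag_s, speaker_changes_per_min,
                     max_lag_s=10.0, max_changes_per_min=6.0):
    """Return a 0-100 engagement score from three meeting signals.

    talk_time_ratio: fraction of the meeting the participant spoke (0-1).
    avg_response_lag_s: average seconds before the participant responds.
    speaker_changes_per_min: how often the conversation switches speakers.
    The caps and equal weighting are illustrative assumptions.
    """
    # Normalize each signal to the 0-1 range.
    talk = min(max(talk_time_ratio, 0.0), 1.0)
    lag = 1.0 - min(avg_response_lag_s, max_lag_s) / max_lag_s    # shorter lag -> higher
    churn = min(speaker_changes_per_min, max_changes_per_min) / max_changes_per_min

    # Equal weighting of the three normalized signals, purely as an example.
    return round(100 * (talk + lag + churn) / 3, 1)


print(engagement_score(0.35, 1.2, 4))   # -> 63.2
```

Any real system would also have to decide what counts as "engaged" behavior, which is exactly the normative judgment the rights groups object to.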

It's unclear whether the company intends to follow through with its plan, and the software has not yet been deployed. But groups including the American Civil Liberties Union, Jobs With Justice and Muslim Justice League are urging Zoom to "stop its exploration" of the technology entirely. 

"Our emotional states and our innermost thoughts should be free from surveillance," senior policy analyst Daniel Leufer of Access Now said in a statement. 

The groups claim that the technology paves the way for discrimination against people with disabilities or of certain ethnicities by assuming that everyone uses the same facial expressions, voice patterns and body language to communicate. The groups also say the software could be a potential data security risk for users, making their information vulnerable to "snooping government authorities and malicious hackers."

Facial recognition technology in general has been problematic and prone to misidentifying people, especially people of color, noted Seeley George.

"We've seen this with technology like facial recognition, when it's being used to identify people accused of committing crimes and it points to someone for a crime they didn't commit," she said.

"Millions of people have started using [Zoom] in their daily lives to connect with friends and family members, people are using it for work, for school, so the number of people who could be impacted by this are really massive."

The groups have asked Zoom to publicly respond to their request by the end of the month. 

"You can make it clear that this technology has no place in video communications," the letter to Yuan stated. 

A Zoom spokesperson told CBS News the company does not currently have a statement in response to the joint letter.

CBS News' Irina Ivanova contributed reporting.

