Researcher apologizes for Facebook study on emotional manipulation

Facebook responds to concerns over emotions experiment

A Facebook data scientist is apologizing for a study that angered many Facebook users by attempting to manipulate their emotions through the posts shown in their News Feeds.

In the study, published in March in the Proceedings of the National Academy of Sciences, the researchers said they changed the News Feed algorithm for almost 700,000 Facebook users for one week in January 2012 to see whether a mostly positive -- or a mostly negative -- News Feed would elicit different types of status updates.

The study concluded that "when positive expressions were reduced, people produced fewer positive posts and more negative posts; when negative expressions were reduced, the opposite pattern occurred. These results indicate that emotions expressed by others on Facebook influence our own emotions, constituting experimental evidence for massive-scale contagion via social networks."

However, Facebook soon faced a backlash from users who felt that the social network was improperly trying to manipulate their emotions -- or, at the very least, should have notified them that their News Feeds might have been modified for experimental purposes.

On Sunday, a Facebook researcher involved in the study, Adam D.I. Kramer, posted a detailed explanation on Facebook, and apologized to users.

"The reason we did this research is because we care about the emotional impact of Facebook and the people that use our product," Kramer wrote. "I can understand why some people have concerns about it, and my coauthors and I are very sorry for the way the paper described the research and any anxiety it caused."

The authors note in the study that use of the information "was consistent with Facebook's Data Use Policy, to which all users agree prior to creating an account on Facebook."

Indeed, the social media giant routinely tweaks its News Feed algorithm -- announcing one recent change in December 2013 to highlight "high-quality content" and bring users more "relevant news and what their friends have to say about it."

Kramer noted that the actual impact on people in the experiment was minimal: "the result was that people produced an average of one fewer emotional word, per thousand words, over the following week." A Facebook spokesperson told Forbes that the data used could not be attributed back to any single person.

Some users were still upset, and took to social media to criticize Facebook's psychology experiment.

According to the American Psychological Association, psychological studies that involve deception must inform participants of that deception as soon as feasible. The researchers did not mention any such debriefing in their final report.
