NEW YORK -- Facebook has tightened its research guidelines following uproar over its disclosure this summer that it allowed researchers to manipulate users' feeds to see if their moods could be changed.
At issue was a study in which Facebook allowed researchers to manipulate the content that appeared in the main section, or "news feed," of a small fraction of the social network's users. During the weeklong study in January 2012, data scientists sought evidence for their thesis that people's moods could spread like an "emotional contagion" depending on what they were reading.
"Although this subject matter was important to research, we were unprepared for the reaction the paper received when it was published and have taken to heart the comments and criticism," Mike Schroepfer, Facebook's chief technology officer, wrote in a blog post Thursday. "It is clear now that there are things we should have done differently."
In the past three months, Schroepfer said, Facebook has given researchers clearer guidelines on research procedures and has created an internal panel to review projects. But there will be no external review process, and Facebook will continue to encourage researchers to study how people use its site.
"We believe in research, because it helps us build a better Facebook," Schroepfer wrote. "Like most companies today, our products are built based on extensive research, experimentation and testing."
A Facebook researcher involved in the news feed study, Adam D.I. Kramer, posted a detailed explanation of the project on Facebook last June and issued an apology to users.
"The reason we did this research is because we care about the emotional impact of Facebook and the people that use our product," Kramer wrote. "I can understand why some people have concerns about it, and my coauthors and I are very sorry for the way the paper described the research and any anxiety it caused."
The authors note in the study that use of the information "was consistent with Facebook's Data Use Policy, to which all users agree prior to creating an account on Facebook."
Kramer noted that the experiment appeared to have minimal impact on the users whose feeds were manipulated: "The result was that people produced an average of one fewer emotional word, per thousand words, over the following week." A Facebook spokesperson told Forbes that the data used in the study could not be attributed back to any individual user.