Facebook admits "oversight" after leak reveals internal research on vulnerable children

Facebook is responding to a report that it conducted research on the emotional state of users as young as 14.


Facebook is doing damage control after an Australian media report suggested that the company guided an advertiser to specifically target emotionally vulnerable children on its platform. 

Citing a 23-page leaked document, The Australian newspaper reported that Facebook executives in Australia used algorithms to collect data on more than six million young people in Australia and New Zealand, "indicating moments when young people need a confidence boost." 

That data included Facebook posts, pictures and reactions from people as young as 14, the report said. 

The data analysis — marked "Confidential: Internal Only" — was intended to reveal when young people feel "worthless" or "insecure," thus creating a potential opening for specific marketing messages, according to The Australian. The newspaper said this case of data mining could violate Australia's legal standards for advertising and marketing to children. 

In a statement posted on Sunday, Facebook acknowledged it conducted the research and shared it with an advertiser.

But Facebook called the premise of the article "misleading" and insisted it does not help advertisers target any users based on their emotional states. It noted that the data was "anonymous and aggregated."

The company pledged to investigate the incident.  

"Facebook has an established process to review the research we perform," the statement continued. "This research did not follow that process, and we are reviewing the details to correct the oversight."

This is not the first time Facebook has come under fire for targeted advertising. 

Last year, a ProPublica investigation revealed that Facebook enabled advertisers to exclude specific racial groups from receiving certain ads using Facebook's "ethnic affinity" tag. The news prompted an outcry from policymakers and civil rights leaders, who argued that Facebook was making it easy for businesses to continue the long history of discrimination against racial minorities, particularly African-Americans. 

Facebook subsequently changed course, saying "Discriminatory advertising has no place on Facebook." The company removed the feature allowing advertisers to target users based on race in the areas of housing, employment and credit.