Facebook is scrambling, once again, to address the trustworthiness of news shared on its platform. A week after announcing that posts from news outlets and other brands will be deprioritized from users' news feeds, the social media site says that where news appears, items from "trusted sources" will be given preference over other content.
What defines a "trusted source," however, will be left to its users. In a statement posted to his Facebook page, CEO Mark Zuckerberg said: "We considered asking outside experts, which would take the decision out of our hands but would likely not solve the objectivity problem. Or we could ask you -- the community -- and have your feedback determine the ranking."
"We decided that having the community determine which sources are broadly trusted would be most objective," he said.
This appears to be a highly questionable tactic. Previously, Facebook enlisted unpaid third-party fact-checking partners to alert users to questionable news content on the platform. Despite that assistance, or perhaps because of it, these newly informed users clicked on flagged fake news more aggressively, prompting Facebook to abandon the flagging effort altogether. Now those very same users are being polled to define trustworthy news sources.
To say that the initiative has been met with cynicism by Facebook-watchers would be to dramatically understate things.
The move comes as marketing consultancy Edelman prepares to release its annual "Trust Barometer" report, an analysis of how much trust the public places in organizations, platforms and entities. Its 2017 report showed that trust in media had plummeted as fake news rose in prominence, and as the spread of that content, facilitated by Facebook and other platforms, was thought to have had an impact on the 2016 U.S. election. That link kept Facebook's treatment of news under constant scrutiny throughout 2017.
Facebook has historically shied away from making value judgments about the content shared on its service, insisting that it is neither a publisher nor a media company but a platform, its designation of choice. That stance has prompted commentators to accuse it of abdicating its responsibility.
Despite that, Zuckerberg acknowledged that once again his company won't be deciding which sources are trustworthy, nor pouring any financial or staffing resources into doing so. Facebook will again punt the ultimate responsibility to users to define what they see, polling them about their trust in a variety of news sources. Ultimately, said Zuckerberg, this will help the company determine which sources are broadly trustworthy and which aren't. It will also insulate Facebook from the responsibility of promoting certain outlets, and thus suppressing others.
Seeing Zuckerberg make the announcement on his own Facebook page is also significant. While he hasn't gone to Capitol Hill to publicly testify about how his service was abused, he has made it his goal to spend this year fixing problems like hate and abuse that have spread on his service. He said his goal is "making sure that time spent on Facebook is time well spent."
The company declined to make Zuckerberg available for an interview. A spokesman said that the goal is not to punish any one news organization (the company doesn't plan to release publishers' scores), but rather to show people more from their favorite sources, as well as trusted sources.
"My hope is that this update about trusted news and last week's update about meaningful interactions will help make time on Facebook time well spent," he wrote Friday.