Internal Facebook guidelines reveal how the social network relies on its editors' news judgment, on top of its algorithms, to determine the content of its influential "Trending Topics" section. Meanwhile, Facebook is strongly defending the list, calling it a "high-quality product, consistent with Facebook's deep commitment to being a platform for people of all viewpoints."
The guidelines, first obtained by The Guardian newspaper and then posted publicly by Facebook, are the latest development in a string of stories peeling back the mystery of Facebook's news operation.
The fight over what's "trending"
For many users, Facebook's "trending" list -- located on the desktop homepage, right next to the News Feed -- provides the first exposure to major breaking news or viral stories. With more than one billion daily active users, Facebook is the biggest distributor of information in the world.
It has been under close media scrutiny since Monday, when the tech news site Gizmodo published a story quoting an anonymous former curator on the trending stories team who said his colleagues showed a bias against conservative-interest stories. The source claimed conservative stories were sometimes "blacklisted," while curators chose to "inject" other stories into the list.
In response, Tom Stocky, Facebook's vice president of search, who oversees the trending news team, said the company was investigating but has "found no evidence that the anonymous allegations are true." He asserted there are rigorous guidelines in place that "do not permit the suppression of political perspectives."
Republican Senator John Thune, chairman of the U.S. Senate Committee on Commerce, Science and Transportation, jumped into the fight when he penned a letter to Mark Zuckerberg asking Facebook, a private company, to tell the public more about the internal process behind trending stories and to brief the committee's staff on the issue.
Guidelines for "trending" team
The Facebook document -- internal guidelines for members of the trending stories team on how to manage the section -- is notable for several reasons.
One: It reveals an exhaustive list of rules showing that Facebook cares more about maintaining a consistent editorial voice and following established standards than users might have thought. "Avoid puns, innuendos and cliches," the guidelines say. "Write for a general, PG-13 audience." "To the best of our ability, fact check to make sure our descriptions are accurate and not speculative. Avoid defamatory allegations." The document even weighs in on issues of style, noting, for example, that it's not "Coachella Valley Music and Arts Festival" but simply "Coachella."
Two: The document reveals Facebook's perception that mainstream news outlets carry authority, even in the fractured media landscape of 2016. Facebook leans heavily on ten news organizations to determine whether a news event is of major national or international significance, including the New York Times, the Wall Street Journal, and Fox News, among others. A trending topic can rise to the level of a "national story" if it's the lead on at least five of these news organizations' websites.
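The "national story" rule described above amounts to a simple threshold check. As a rough illustration (this is not Facebook's actual code; the function, data structure, and outlet names used here are assumptions for the sketch, while the "5 of 10" threshold comes from the guidelines as reported):

```python
# Illustrative sketch of the "national story" rule: a trending topic
# escalates if it is the lead story on at least 5 of the 10 reference
# news organizations' websites.

NATIONAL_THRESHOLD = 5  # threshold reported from the guidelines

def is_national_story(topic, lead_stories):
    """lead_stories maps each outlet's name to the topic of its current
    lead story; returns True if enough outlets are leading with `topic`."""
    leads = sum(1 for lead in lead_stories.values() if lead == topic)
    return leads >= NATIONAL_THRESHOLD

# Hypothetical example: five of ten outlets lead with the same topic.
leads = {
    "NYT": "Election", "WSJ": "Election", "Fox News": "Election",
    "Outlet4": "Election", "Outlet5": "Election",
    "Outlet6": "Sports", "Outlet7": "Weather", "Outlet8": "Tech",
    "Outlet9": "Markets", "Outlet10": "Local",
}
print(is_national_story("Election", leads))  # True
```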
Facebook also keeps a list of 1,000 news sources, including well-known conservative outlets like RedState, Breitbart, the Drudge Report and the Daily Caller, whose reporting can be used for corroboration purposes. Facebook released that list of news sources today, an apparent attempt to quiet critics who say the company does not adequately engage with conservative media.
Three: The document shows how Facebook editors can "blacklist" a topic from the trending list. This is supposed to be done for only two reasons -- if the trending phrase doesn't represent a real-world event, or to eliminate duplication.
Four: The document sheds light on Facebook's internal rules about the controversial practice of "injecting" stories into its highly visible trending section. Earlier this week, Gizmodo's anonymous source claimed Facebook has a record of "injecting" topics such as #BlackLivesMatter into the trending stories module. The guidelines say this may only be done for limited reasons: "The editorial team CAN inject a topic... to consolidate a story/clean up appearances.... The editorial team CAN inject a newsworthy topic that is not appearing in the review tool but is appearing in the demo tool (in the corresponding scope). The editorial team CANNOT inject a newsworthy topic if it is not appearing in the demo or review tools."
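The quoted CAN/CANNOT injection rules reduce to a small piece of boolean logic. A minimal sketch, assuming a hypothetical function and parameter names (the "review tool" and "demo tool" are named in the guidelines; everything else here is illustrative, and the separate "consolidate a story" case is omitted):

```python
# Hypothetical encoding of the quoted injection rules for a newsworthy
# topic: editors CAN inject it if it appears in the review tool or the
# demo tool, but CANNOT inject it if it appears in neither.

def may_inject(newsworthy, in_review_tool, in_demo_tool):
    """Return True if the editorial team may inject the topic."""
    return newsworthy and (in_review_tool or in_demo_tool)

# A newsworthy topic visible only in the demo tool may be injected...
print(may_inject(True, in_review_tool=False, in_demo_tool=True))   # True
# ...but one appearing in neither tool may not.
print(may_inject(True, in_review_tool=False, in_demo_tool=False))  # False
```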
Facebook opens up on "trending"
Before this week, Facebook provided bare-bones context on trending stories, simply saying that the section is "based on a number of factors including engagement, timeliness, Pages you've liked and your location." Today, the powerhouse social network revealed more details on how the trending stories sausage is made.
In a statement and blog post published today, vice president of global operations Justin Osofsky said trending topics are first surfaced by an algorithm that identifies topics that have recently spiked in Facebook mentions. The algorithm also monitors media coverage through an external RSS feed. From there, trending topics are personalized for each user based on several factors, including pages users have liked, feedback from users about previous trending topics, users' locations, and more.
Osofsky insisted that trending stories are powered primarily by algorithms, and only secondarily by people who sift through that content for quality control.
"The guidelines demonstrate that we have a series of checks and balances in place to help surface the most important popular stories, regardless of where they fall on the ideological spectrum," Osofsky stressed in a statement to CBS News. "Facebook does not allow or advise our reviewers to systematically discriminate against sources of any political origin, period. What these guidelines show is that we've approached this responsibly and with the goal of creating a high-quality product -- in the hopes of delivering a meaningful experience for the people who use our service."