
Does Facebook's News Feed control your world view?

A growing list of critics has attacked the over-personalization of the web, contending that website algorithms create a "filter bubble" that limits what you see on Facebook and other social media sites.

It's become an even more pressing issue as social media sites increasingly get into the news game. Internet activists like Eli Pariser have criticized the practice for essentially limiting our view of the world. They say that by guessing what information a user would like to see based on their profile, sites are shielding Internet users from opposing viewpoints - serving to isolate us in our own cultural and ideological bubbles.

Facebook, however, has challenged that notion. And now Facebook researchers are behind a study that for the first time attempts to quantify the company's role in the phenomenon.

Writing in the journal Science, several Facebook researchers suggest that users are, for the most part, seeing what they want to see. The authors argue we are largely to blame for filtering content ourselves - more so than the computer algorithm, which chooses content to surface in your News Feed after learning what you like. They also say that the "filter bubble" trend isn't as bad as some Internet activists would suggest.

"News Feed basically is servicing content that is, in fact, slightly more aligned with an individual's own ideology, but the friends that people are choosing and the content they are clicking on are really more important factors in terms of how much diverse content people are (encountering)," Solomon Messing, one of the authors and a data scientist with Facebook, told CBS News.

The study published Thursday used data from 10.1 million Facebook users who self-reported their political affiliation. It found that just under 29 percent of hard news that people see in their News Feed reflects opposing views. Less than 30 percent of content that users share cuts across ideological lines. On top of that, less than a quarter of users report having Facebook friends with an "ideological affiliation from a different party."

When it came to clicking on stories, individual choices played a greater role than algorithmic ranking in limiting exposure to ideologically opposing views. Interestingly, the study found conservatives are exposed to more content with opposing views than liberals - about 34 percent of self-described conservatives share divergent news on their News Feed, compared to 23 percent for liberals.

Messing said the findings show fears of the algorithm's impact are overblown and that people are being exposed to more diverse content than previously believed - even if it's still only a minority of what's in their News Feeds.

"People are seeing content and clicking on it. I'm frankly surprised people are clicking on content from the other side to the extent that the do," he said. "I think that's really important. I don't know if it's a good or bad thing but it's really important."

In a blog post summarizing the study, the researchers conclude, "who [people] friend and what content they click on are more consequential than the News Feed ranking."

Pariser, who wrote "The Filter Bubble" and was briefed on the study by Facebook, praised the company for doing the research. Still, he argued the findings show that Facebook's algorithm - combined with our own choices - plays a role in creating those barriers he finds so worrisome.

"The effects I was concerned about are clearly there and they are scientifically significant, if a little smaller on average than I might have guessed," Pariser said.

"The degree to which the algorithm narrows what content you consume more towards stuff you are interested in is almost as strong as your choices about what to click on, which, to me, is pretty strong," he continued. "It also shows when you use this combination of social and algorithm filters, you do get at every step of the way news that more likely validates your political beliefs."

In an article that accompanied the Science study, Northeastern University's David Lazer suggested the power of the algorithm is often overrated.

"Curation does ideologically filter what we see. However, this effect is modest relative to choices people make that filter information, including who their friends are and what they choose to read given the curation," Lazer, who is also part of Harvard Univesity's John F. Kennedy School of Government, wrote. "The deliberative sky is not yet falling, but the skies are not completely clear either."

Lazer warned it's important to remain watchful for signs that algorithms may be gaining greater influence. He noted that changes Facebook made in April to the curation of News Feed - including one in which you see more updates from "friends you care about" - could end up providing users with increasing amounts of like-minded content.

"This is an important finding and one that requires continued vigilance. A small effect today might become a large effect tomorrow, depending on changes in the algorithms and human behavior," he wrote.

Facebook's Messing refused to discuss the broader significance of the News Feed. But the fact that the company was digging into the numbers, Pariser said, shows the importance it is placing on the algorithm at a time when Facebook has been shown to have an increasing influence on such civic activities as voting.

"I do see this study as Facebook saying, yes, there are some real important questions of algorithmic ethics that need to be explored here," he said. "A big conversation around what effect these algorithms have is a good thing."
