News Release

Facebook users are the main filter of content

Peer-Reviewed Publication

American Association for the Advancement of Science (AAAS)


Do online social networks, such as Facebook, create "filter bubbles" around their users so that people only see what they want to see? Eytan Bakshy and colleagues analyzed the activity of more than 10 million Facebook users to find out -- and their results suggest that, even though the Web site does filter content for its users, those users are still their own biggest censors. Individuals' choices about what they clicked on (and what they didn't click on) limited their exposure to attitude-challenging ideas and perspectives much more than Facebook's algorithms did, according to the researchers.

The site's social algorithms, which try to anticipate content that users will like and present it to them, could potentially prove detrimental to democracy if they shield people from conflicting points of view. And since people increasingly rely on social media for news and civic information, it is important that researchers understand how such personalized Web sites and Web browsers shape users' views of the world compared to more traditional media sources.

Bakshy et al. studied Facebook users in the United States who publicly listed their political preferences on the social media site. The researchers considered the news that users posted online for friends -- noting whether it was liberal or conservative -- and then determined what kind of news, posted by the users' friends, actually reached users via the site's social algorithms. The researchers also analyzed the content that users ultimately clicked on and consumed.

Taken together, their findings indicate that Facebook filtered out about 15% of the news that may have challenged users' beliefs, but that users chose to ignore about 70% of the challenging posts that Facebook did present them with. So, while Bakshy and his colleagues stress that social media networks such as Facebook need continued study, they also suggest that users hold the key to challenging their own thoughts and ideas -- and that Facebook probably exposes individuals to more ideologically challenging ideas than most blogs and news wires do. A Perspective article by David Lazer discusses these findings in greater detail and highlights the need for a new field of study dedicated to the "rise of the social algorithm."
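To make the two percentages concrete, the comparison can be illustrated with a minimal sketch in Python -- this is not the authors' code, and the counts are hypothetical, chosen only so that the arithmetic reproduces the reported figures:

    # Hedged illustration of the two-stage comparison described in the study.
    # The counts below are hypothetical; only the ratios matter.

    def reduction(before: float, after: float) -> float:
        """Fractional reduction in cross-cutting exposure between two stages."""
        return 1 - after / before

    shared_by_friends = 1000   # cross-cutting stories a user's friends posted
    shown_by_algorithm = 850   # News Feed ranking filters out about 15%
    clicked_by_user = 255      # the user ignores about 70% of what is shown

    print(f"Filtered by the algorithm: {reduction(shared_by_friends, shown_by_algorithm):.0%}")
    print(f"Ignored by the user:       {reduction(shown_by_algorithm, clicked_by_user):.0%}")

Running this prints 15% and 70%, matching the study's central point: individual choice removes far more attitude-challenging content than algorithmic ranking does.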

###

Article #21: "Exposure to Ideologically Diverse News and Opinion on Facebook," by E. Bakshy; S. Messing; L. Adamic at Facebook in Menlo Park, CA; L. Adamic at University of Michigan in Ann Arbor, MI.

