One in three people believe they don’t have to seek the news from traditional outlets like newspapers and television. Instead, they think the “news will find me” (NFM), relying on algorithms and social networks to get their information. A research team led by Penn State scholars recently found that these individuals often consider their online networks to be as trustworthy as professional editors and journalists.
This mindset may make people more vulnerable to believing and sharing misinformation, according to the researchers, who published their findings in the journal Social Media + Society.
To understand news consumption behavior, the researchers designed an experiment that allowed them to observe how individuals with different levels of NFM engage with news. The researchers found users with higher NFM considered news recommended by algorithms or shared by others in their social network to be just as credible as news recommended by editors and reporters.
However, mid- and low-NFM individuals more critically evaluated news sources and placed higher value on stories from editors and reporters.
“The good news is that, overall, professionals are still valued,” said corresponding author S. Shyam Sundar, Evan Pugh University Professor and James P. Jimirro Professor of Media Effects at Penn State. “But people with this tendency to rely on news coming to them — which is becoming more and more people — are trusting algorithms and social media friends to be their news sources.”
When readers grant algorithms and social networks the same authority as journalists, the researchers said, it becomes far easier for bad actors to manipulate those digital spaces than it would be to imitate a trusted news outlet.
“The underlying psychological mechanism was not parsed out in previous studies,” said first author Mengqi Liao, assistant professor at the University of Georgia who completed her doctoral studies with Sundar at Penn State. “We did this experiment to understand and explain why respondents evaluate the recommended news the way they do.”
The web-based experiment included 244 participants. Each participant completed a pre-questionnaire that measured their NFM level using a standardized survey scale. Participants were then randomly assigned to one of three simulated news feeds, in which content was recommended by a news editor, by social media friends or by an algorithm.
The content stayed the same across news feeds; only the source of the recommendation — algorithm, friends or editors — changed. This allowed the researchers to examine how each source prompted participants to rely on different heuristics: “mental shortcuts,” or rules of thumb that people use to make quick judgments.
For example, when a news article is recommended by an editor, this activates the authority heuristic, prompting readers to trust the information because it comes from professional journalists.
When content is recommended by an algorithm, it triggers the machine heuristic, the belief that machines are objective and free of bias. Articles recommended by social media friends activate the homophily heuristic, meaning people are more likely to trust information shared by individuals they see as similar to themselves.
“For some people, the algorithm now carries the same weight as a journalist,” said co-author Homero Gil de Zúñiga, distinguished professor of media studies at Penn State. “We’re seeing a flattening of authority so that algorithms and social media feeds are being trusted like professional journalism.”
Sundar said this leaves people with high NFM more vulnerable to misinformation and less informed overall, a problem that grows more serious as more people adopt an NFM approach to news and information.
Liao added that it “would be a really big problem” if social media friends and algorithms are recommending very biased or even false information.
“Subscriptions are going down; people are not actually seeking news,” Sundar said. “Machine as a source is now becoming predominant, undermining the more traditional professional sources, and that’s worrisome.”
Sundar suggested possible strategies for combating the phenomenon, such as targeting high-NFM people with customized media literacy interventions. These interventions could show readers where a piece of information originated, as well as the steps journalists took to uncover it.
Yuan Sun, assistant professor at the University of Florida who earned a doctorate in mass communication and media studies from Penn State in 2023, and Timilehin Durotoye, doctoral candidate in the Donald P. Bellisario College of Communications at Penn State, were authors on the paper as well.
Journal
Social Media + Society
Article Title
When We Think “News Will Find Me”: Relative Credibility of Social-Media Friends, Algorithms, and Editors
Article Publication Date
3-Apr-2026