News Release

Small digital frictions can slow the spread of misinformation

New research from the University of Copenhagen and Indiana University Bloomington points to a simple yet effective method for combating misinformation on social media: make it slightly harder to share content.

Peer-Reviewed Publication

University of Copenhagen

Social media platforms like Facebook, Instagram, and X have made it incredibly easy to share content with friends and acquaintances through like and share buttons.

But we don’t just share cat videos and cake recipes—we also share content that turns out to be fake news and misinformation. Research has shown that such content is particularly attractive and spreads faster on social media than reliable information—partly because platform algorithms prioritize sensational posts that are widely shared.

But what if sharing content became a bit more difficult? That’s the idea proposed by researchers from the University of Copenhagen in a new article published in the Nature Portfolio journal npj Complexity.

“Our idea is to introduce a small pause in the sharing process to make people reflect on what they’re sharing before clicking the button,” says PhD student Laura Jahn, lead author of the study alongside Professor Vincent F. Hendricks. She elaborates:

“We developed and tested a computer model that simulates how information spreads on social media platforms like X, Bluesky, and Mastodon. It shows that a small digital friction—such as a pop-up message—can effectively reduce content sharing.”

Learning improves quality

The researchers’ model shows that frictions can be an effective tool to reduce the number of shares. However, it also shows that frictions alone don’t necessarily improve the quality of the content being shared.

To address this, they added a learning element to the model that users encounter when attempting to share a post:

“It could be a pop-up with a short quiz asking questions like: How is misinformation defined, and what does this social media platform do to limit fake news? The idea is that this learning element will prompt users to reflect on their behavior and share fewer problematic posts,” explains Vincent F. Hendricks, concluding:

“And we can see from the model that when friction is combined with learning, the average quality of shared posts increases significantly.”
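To make the idea concrete, the toy simulation below illustrates the mechanism described above. It is a minimal sketch, not the authors’ published model: the parameter names, values, and the specific way friction and learning are represented are assumptions chosen purely for illustration. Friction is modelled as a fixed fraction of share attempts being abandoned, and learning as a gradual increase in each simulated user’s tendency to favour higher-quality posts.

    import random

    # Illustrative sketch only (not the published npj Complexity model).
    # All parameters below are assumed values for demonstration purposes.
    random.seed(42)

    N_AGENTS = 1000          # number of simulated users
    N_ROUNDS = 50            # simulation rounds
    BASE_SHARE_PROB = 0.30   # chance an agent reshares a post with no friction
    FRICTION_DROP = 0.50     # share attempts abandoned when a pause/pop-up appears
    LEARNING_GAIN = 0.02     # per-exposure increase in an agent's discernment

    def simulate(friction=False, learning=False):
        """Return (total shares, mean quality of shared posts) for one run."""
        discernment = [0.0] * N_AGENTS   # how strongly an agent favours quality
        shares, quality_sum = 0, 0.0
        for _ in range(N_ROUNDS):
            for agent in range(N_AGENTS):
                quality = random.random()        # quality of the post seen, in [0, 1]
                p = BASE_SHARE_PROB
                if friction:
                    p *= (1 - FRICTION_DROP)     # the pause deters some shares outright
                if learning:
                    # agents who have seen the quiz skew sharing toward better posts
                    p *= (1 - discernment[agent]) + discernment[agent] * quality
                    discernment[agent] = min(1.0, discernment[agent] + LEARNING_GAIN)
                if random.random() < p:
                    shares += 1
                    quality_sum += quality
        return shares, (quality_sum / shares if shares else 0.0)

    for friction, learning in [(False, False), (True, False), (True, True)]:
        shares, avg_q = simulate(friction, learning)
        print(f"friction={friction!s:5} learning={learning!s:5} "
              f"shares={shares:6d} avg_quality={avg_q:.3f}")

Run under these assumptions, the sketch reproduces the qualitative pattern reported above: friction alone cuts the number of shares but leaves the average quality of shared posts roughly unchanged, while friction combined with learning both reduces sharing and raises average quality.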

Field study ahead

The next step for the researchers is to test whether the strategy of introducing digital frictions with learning elements has the same positive effects in real-world settings.

“We hope our proposal will inspire tech giants to think innovatively in the fight against misinformation. They could help us test the promising computer model to see whether engagement with low-quality content decreases and whether users become better at recognizing misinformation in real situations,” say Laura Jahn and Vincent F. Hendricks.

If collaboration with a major social media platform isn’t possible, the researchers will use simulated platforms available for research purposes.

Read the article “A perspective on friction interventions to curb the spread of misinformation” in the journal npj Complexity.

The research was conducted at the Center for Information and Bubble Studies at the University of Copenhagen.

