
Team debunks research showing Facebook's news-feed algorithm curbs election misinformation

An interdisciplinary team of researchers led by the University of Massachusetts Amherst has published work in the journal Science calling into question the conclusions of a widely reported study, published in Science in 2023, which found that Facebook's algorithms successfully filtered out untrustworthy news surrounding the 2020 election and were not major drivers of misinformation.

The UMass Amherst-led team's work shows that the earlier research was conducted during a short period when Meta had temporarily replaced its standard news-feed algorithm with a new, more rigorous one, and that the previous researchers did not account for this algorithmic change. The oversight helped create the misperception, widely reported by the media, that Facebook and Instagram's news feeds are largely reliable sources of trustworthy news.

"The first thing that rang alarm bells for us" says lead author Chhandak Bagchi, a graduate student in the Manning College of Information and Computer Science at UMass Amherst, "was when we realized that the previous researchers," Guess and colleagues, "conducted a randomized control experiment during the same time that Facebook had made a systemic, short-term change to their news algorithm."

Around the start of November 2020, Meta introduced 63 "break glass" changes to Facebook's news feed that were expressly designed to diminish the visibility of untrustworthy news surrounding the 2020 U.S. presidential election. These changes were successful.

"We applaud Facebook for implementing the more stringent news feed algorithm," says Przemek Grabowicz, the paper's senior author, who recently joined University College Dublin but conducted this research at UMass Amherst's Manning College of Information and Computer Science.
