Is big tech harming society? To find out, we need research, but it's being manipulated by big tech itself

For almost a decade, researchers have been gathering evidence that the social media platform Facebook disproportionately amplifies low-quality content and misinformation.

So it was something of a surprise when in 2023 the journal Science published a study that found Facebook's algorithms were not major drivers of misinformation during the 2020 United States election.

This study was funded by Facebook's parent company, Meta. Several Meta employees were also part of the authorship team. It attracted extensive media coverage. It was also celebrated by Meta's president of global affairs, Nick Clegg, who said it showed the company's algorithms have "no detectable impact on polarization, political attitudes or beliefs."

But the findings have recently been thrown into doubt by a team of researchers led by Chhandak Bagchi from the University of Massachusetts Amherst. In an eLetter also published in Science, they argue the results were likely due to Facebook tinkering with its algorithm while the study was being conducted.

In a response eLetter, the authors of the original study acknowledge their results "might have been different" if Facebook had changed its algorithm in a different way. But they insist their results still hold true.
