
Facebook is going after misinformation superspreaders

Facebook says it will penalize individuals who repeatedly share misinformation. The company introduced new warnings that will notify users that repeatedly sharing false claims could result in “their posts moved lower down in News Feed so other people are less likely to see them.”

“Starting today, we will reduce the distribution of all posts in News Feed from an individual’s Facebook account if they repeatedly share content that has been rated false by one of our fact-checking partners,” the company wrote in a press release. While demonstrably false posts are already demoted in News Feed rankings, users who regularly share misinformation will now see all of their content pushed lower in the feed’s endless scroll.

The update comes after a year in which Facebook struggled to control viral misinformation about the coronavirus pandemic, the presidential election and COVID-19 vaccines. “Whether it’s false or misleading content about COVID-19 and vaccines, climate change, elections or other topics, we’re making sure fewer people see misinformation on our apps,” the company wrote in a blog post.

Facebook didn’t indicate how many flagged posts it would take to trigger the reduction in News Feed reach, but the company has used a similar “strike” system for pages that share misinformation.

Researchers who study misinformation have pointed out that it’s often the same individuals behind the most viral false claims. For example, a recent report from the Center for Countering Digital Hate found that the vast majority of anti-vaccine misinformation was linked to just 12 individuals.

