Accuracy in Media

A study released by Yale University found that Facebook’s attempt to combat fake news is not working, and in some cases has helped spread fake stories.

Facebook’s third-party fact-checking program launched in December, according to Politico, after efforts to have its user base flag fake news proved unsuccessful. Now the site uses an algorithm backed by five fact-checking sources: ABC News, the Associated Press, FactCheck.org, PolitiFact and Snopes.

“These results suggest that the currently deployed approaches are not nearly enough to undermine belief in fake news,” the study’s abstract read. “New (empirically supported) strategies are needed.”

The study showed that marking stories as “disputed by third-party fact-checkers” had little impact on readers’ perception of them.

Because of the volume of stories flagged as fake news, Facebook has not been able to verify each one directly. As a result, stories with bad information have slipped through, been marked as reputable and been shared across the platform. In some cases those stories went viral, spreading far beyond their original reach.

Facebook has historically been protective of its data. Critics say this protection has prevented fact-checkers from prioritizing the most popular fake news stories, allowing those subjects to slip through the cracks.
