Accuracy in Media

Social media behemoth Facebook, which has been plagued by fake news content posted by its users, is stepping up its efforts to identify and remove misleading posts faster.

Facebook announced in a blog post that it is piloting a program that will employ what it is calling “community reviewers” to research questionable posts and share their findings with the third-party fact-checkers the company uses to help review content posted on Facebook. These reviewers will be contractors hired through one of Facebook’s partners and will not be company employees.

Here’s how it will work: 

  • Our machine learning model identifies potential misinformation using a variety of signals. These include comments on the post that express disbelief, and whether a post is being shared by a Page that has spread misinformation in the past.
  • If there is an indication that a post may be misinformation, it will be sent to a diverse group of community reviewers.
  • These community reviewers will be asked to identify the main claim in the post. They will then conduct research to find other sources that either support or refute that claim, similar to the way a person using Facebook may search for other news articles to assess whether they believe the main claim in a post. Fact-checking partners will then be able to see the collective assessment of community reviewers as a signal in selecting which stories to review and rate.
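Facebook has not published the internals of this pipeline, so the sketch below is only an illustration of the kind of signal scoring and routing the blog post describes. The signal names, thresholds, and data structures are hypothetical stand-ins, not the company’s actual models or criteria.

```python
from dataclasses import dataclass
from typing import List

# All names, signals, and thresholds here are hypothetical illustrations of the
# process described in Facebook's announcement; the real system is not public.

DISBELIEF_MARKERS = ("fake", "hoax", "not true", "false", "debunked")


@dataclass
class Post:
    text: str
    comments: List[str]
    page_has_spread_misinfo: bool  # prior behavior of the sharing Page


def misinformation_score(post: Post) -> float:
    """Crude stand-in for the machine-learning signals named in the post:
    disbelief expressed in comments and the sharing Page's history."""
    if post.comments:
        disbelief = sum(
            any(marker in c.lower() for marker in DISBELIEF_MARKERS)
            for c in post.comments
        )
        disbelief_ratio = disbelief / len(post.comments)
    else:
        disbelief_ratio = 0.0
    history_penalty = 0.4 if post.page_has_spread_misinfo else 0.0
    return min(1.0, disbelief_ratio + history_penalty)


@dataclass
class ReviewerAssessment:
    reviewer_id: str
    claim: str        # the main claim the reviewer identified
    supported: bool   # whether outside sources supported the claim


def collective_assessment(assessments: List[ReviewerAssessment]) -> float:
    """Share of community reviewers who found the claim unsupported,
    surfaced to fact-checking partners as one triage signal."""
    if not assessments:
        return 0.0
    return sum(not a.supported for a in assessments) / len(assessments)


if __name__ == "__main__":
    post = Post(
        text="Miracle cure announced!",
        comments=["this is fake", "source?", "not true at all"],
        page_has_spread_misinfo=True,
    )
    if misinformation_score(post) > 0.5:  # hypothetical routing threshold
        reviews = [
            ReviewerAssessment("r1", "a miracle cure exists", supported=False),
            ReviewerAssessment("r2", "a miracle cure exists", supported=False),
            ReviewerAssessment("r3", "a miracle cure exists", supported=True),
        ]
        print(f"Signal for fact-checkers: {collective_assessment(reviews):.2f}")
```

In this toy version, a post flagged by the comment and Page-history signals is routed to several reviewers, and the fraction of them who could not substantiate the claim is passed along as a single number for fact-checkers to use when choosing what to review.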

Facebook is partnering with YouGov, a global public opinion and data company, to ensure that the community reviewer pool is representative of the Facebook community in the U.S. and reflects the diverse viewpoints, including political ideology, of Facebook users.

The company said that by combining the expertise of third-party fact-checkers with a group of community-based reviewers, it believes it can evaluate misinformation faster and make even more progress in reducing its prevalence on Facebook.



