Facebook has begun to rate its users in an attempt to stem the flow of fake news posted on the site.
The social media giant is assigning users a reputation score, predicting their trustworthiness on a scale from zero to one.
The network took this action after users began falsely reporting news items as untrue. Those false flags further complicated its effort to rid the site of fake news.
It’s “not uncommon for people to tell us something is false simply because they disagree with the premise of a story or they’re intentionally trying to target a particular publisher,” Tessa Lyons, a Facebook product manager, said in an interview with the Washington Post.
Facebook does not intend to use the trustworthiness score as an absolute indicator of credibility, Lyons said.
There is no single unified reputation score, she said. Rather, the score is one measurement among thousands of behavioral clues Facebook weighs to assess risk.
Facebook also monitors which users tend to flag content published by others, and which publishers users consider trustworthy.
But it isn’t clear what other criteria Facebook uses to create the score, or whether every user has one. Nor is there any guarantee that rating users will do much to combat fake news, as Facebook’s previous efforts have shown.