- Accuracy in Media - https://www.aim.org -

Outlet inadvertently reveals the danger of fact-checking Facebook

It is about time that folks made up their minds about fact-checking and the moderation of dangerous material on Facebook. The debate has moved on from removing falsehoods and hate speech to pushback against removing people whom news outlets agree with. But banning hate is different from banning people we don’t like while allowing people we do.

Sadly, this is the way the whole media debate is going. As soon as someone gets to decide what can appear, they will allow what they like and remove what they don’t. It won’t be censorship by the government, but censorship by fact-checkers.

The example in front of us today is at The Intercept [1]. The complaint is that Facebook aggressively moderates – on Instagram as well as on its main pages – the output of a paper called the Tamil Guardian. This matters because in that part of the world, to a large extent, Facebook is the internet.

The actual complaint, though, is really that the Tamil Guardian isn’t as bad as all that. Or even that the reporters think it’s just fine, and therefore so should Facebook. And yet Facebook has come under great pressure precisely because it didn’t moderate pages in the next country over, Burma. In both cases, the underlying subject is two racial/political groupings sharing one unitary state. In one, Sri Lanka, it led to a years-long war. In the other, it led to the violent expulsion of the Rohingya. Logic would suggest that Facebook should simply heavily moderate any content dealing with racial differences or groupings in such places.

The suggestion here, in fact, is that Facebook should be nuanced, delicate even, in its decisions. But of course, calls to be delicate and nuanced are only ever made about groups broadly agreed with. Which is where the danger lies. If the global conversation is to be monitored, moderated, and at times banned, then it cannot be on the basis of whom the moderators like or agree with. It must be on the basis of simple and well-understood rules.

The Intercept was set up with a donation from the founder of eBay to be an entirely independent journalistic foundation. It gains some 4 million views a month. As an entirely independent and well-funded organization, it’s seen as something of a blueprint for the future.

As such, it might be worth their thinking a little more about their demands. Moderation of the internet needs to be done to all on the same basis, or it needs not to be done at all. As for being partial to those we agree with – the problem becomes what happens when it’s not us moderating our friends, but our enemies doing the moderating? The protection against that is hard and fast rules, not discrimination in their application.