Accuracy in Media

Vice is falling into the journalistic trap of believing the campaigners on an issue rather than reporting reality. The issue in this case is the racism of AI systems.

Well, the perceived racism of AI systems. Vice writes “according to the company’s researchers, the system has the same problem as its predecessors: It’s extremely bad at avoiding results that reinforce racist and sexist stereotypes.” This leads to the demand that AIs must be changed to represent the world as the activists would prefer it to be.

The problem is that those who want AI to change misunderstand the basics of what an AI is. It’s not intelligence, it’s artificial intelligence. It’s pattern recognition.

Feed it lots of information, and it will start to recognize patterns. These patterns can be used to predict things about new information offered.

Vice continues: “But large language models have been repeatedly criticized for encoding biases into machine-learning systems, and Facebook’s model seems to be no different—or even worse—than the tools that preceded it.”

The thing being criticized is inherent in the idea of artificial intelligence. If we create a model using real-world information, it will contain the same biases as the real world. If we create it using information that doesn’t reflect the real world – without those biases of human beings – then it will not predict anything useful about the real world.
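The point can be made concrete with a minimal sketch. The toy corpus below is a hypothetical, deliberately skewed example (not from any real dataset): a “model” that does nothing but count patterns in its training text will reproduce whatever skew that text contains.

```python
from collections import Counter

# Hypothetical toy corpus (an assumption for illustration): the
# "real world" text we feed the model, with a built-in skew in
# which pronoun follows which occupation.
corpus = [
    "the nurse said she would help",
    "the nurse said she was busy",
    "the nurse said he was busy",
    "the engineer said he would help",
    "the engineer said he was busy",
]

def pronoun_pattern(corpus, occupation):
    """Count which pronoun follows '<occupation> said' in the corpus."""
    counts = Counter()
    for sentence in corpus:
        tokens = sentence.split()
        for i in range(len(tokens) - 2):
            if tokens[i] == occupation and tokens[i + 1] == "said":
                counts[tokens[i + 2]] += 1
    return counts

# The "prediction" is nothing more than the statistics of the input:
print(pronoun_pattern(corpus, "nurse"))     # Counter({'she': 2, 'he': 1})
print(pronoun_pattern(corpus, "engineer"))  # Counter({'he': 2})
```

Change the corpus and the counts change with it: the pattern recognizer has no opinion of its own, so scrubbing the skew from the training text also removes its ability to predict how the skewed real world actually talks.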

There is no way out of this, which is something that tech journalism needs to understand and accept. The activists demanding that AI be – for example – non-racist may be pure of heart, but they are disconnected from reality. Journalism is, after all, the portrayal of reality, not dreams.

Vice is a major part of the modern media landscape. The magazine has a distribution of some 900,000 copies, the cable channel reaches 60 million American homes, the website gets 25 million visits a month. A lot of people get their news from Vice.

Tech journalism should be explaining the background issues, not simply repeating a particular line or taking sides. Taking one side is also known as bias.


https://www.vice.com/en/article/epxeka/facebooks-new-ai-system-has-a-high-propensity-for-racism-and-bias




