The Verge just found out how algorithms work for social media recommendations — and they are not happy about it.
Which is odd, as The Verge pitches itself as an outlet that knows everything tech – surely they knew this by now? The algorithms and recommendations on YouTube and the like are designed to uncover more material that the reader/watcher might like to look at. By looking at the choices made among the offered alternatives, the algorithm refines what it thinks the individual is interested in.
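That feedback loop – offer items, watch what gets picked, shift future offers toward what was picked – can be sketched very roughly in a few lines. To be clear, the class and topic names below are invented for illustration; this is a toy model of the general idea, not anything YouTube actually runs.

```python
from collections import Counter
import random

class NaiveRecommender:
    """Toy sketch of a recommendation feedback loop (illustrative only)."""

    def __init__(self, topics):
        # Start with a flat prior: every topic equally likely.
        self.clicks = Counter({t: 1 for t in topics})

    def recommend(self, k=3):
        # Offer topics in proportion to observed interest so far.
        topics = list(self.clicks)
        weights = [self.clicks[t] for t in topics]
        return random.choices(topics, weights=weights, k=k)

    def observe_click(self, topic):
        # Each choice among the offered alternatives refines the model.
        self.clicks[topic] += 1

rec = NaiveRecommender(["country music", "cooking", "politics"])
for _ in range(50):
    rec.observe_click("country music")  # the viewer keeps picking one topic
# By now "country music" carries most of the weight, so it dominates
# the recommendations – exactly the behavior described above.
print(rec.recommend(k=5))
```

The point of the sketch is simply that the system has no opinion about the topics themselves; it amplifies whatever the viewer demonstrably chooses.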
The Verge has conniptions about this: “YouTube’s recommendations pushed election denial content to election deniers.” Well, yes, that’s what it’s supposed to do.
“Skeptics got more recommendations for election fraud videos, a study found.” Entirely the point of the exercise. “YouTube’s recommendation algorithm pushed more videos about election fraud to people who were already skeptical about the 2020 election’s legitimacy, according to a new study.” The system’s doing exactly what it is designed to do – showing people more of what they want. Sounds good to us; if only all new technologies worked as advertised.
The reason for the conniptions is, of course, that The Verge regards all of this as bad – both the election skepticism and the videos about election fraud. So, bad people get more of something bad, and that’s bad. That in itself is a certain bias. If, for example, searching for election fraud videos had brought up Baghdad Bob insisting there’s nothing happening here, we’re sure they would be praising the algorithm. Critiquing a computing system by its outcome, not its process, is indeed a form of bias.
But we can go further here. All of this it’s-bad, terribly-bad material is in the first part of the piece. As we’ve noted before, there’s that journalistic trick of putting the important information well down the piece – so that it’s still possible to say it’s been mentioned, but most people, who don’t read that far, never see it.
So, down in paragraph 12 we get this from the author of that original report:
Crucially, Bisbee sees YouTube’s algorithm as neither good nor bad but recommending content to the people most likely to respond to it. “If I’m a country music fan, and I want to find new country music, an algorithm that suggests content to me that it thinks I’ll be interested in is a good thing,” he says.
So, yes, The Verge has just found out how recommendation algorithms work and it doesn’t like it. Because, you know, bad people get more bad stuff – the definition of “bad” being things The Verge doesn’t agree with. And something flagged as “crucially” important should almost certainly feature rather closer to the beginning, don’t you think?
The Verge actually ranks No. 45 among U.S. news media outlets, gaining some 45 million visits a month from that position, and it sits in the top 500 of all US internet sites. We do think it’s about time they understood how recommendations work.
The entire point of these algorithms is that people get offered more of what they’re interested in. The shock, the horror here, is simply that The Verge thinks that people shouldn’t be interested in these things.