Reporters on Tuesday continued to examine two reports submitted to the Senate that contend social media companies helped Russians plant misleading information favoring President Trump during the 2016 presidential election.
Media outlets have embraced the two reports – submitted to the Senate Intelligence Committee but neither endorsed nor even scheduled for hearings by the committee – as feeding their narrative that Russia colluded with social media executives known to have opposed Trump’s candidacy to in fact aid it: not only by providing pro-Trump content to supporters, but by sowing confusion among opponents about when or how to vote, stoking Bernie Sanders’s disputes with Hillary Clinton over delegates, and otherwise misleading the public.
“It should be well known by now that Russian operatives made memes and fake activist pages to try to sway the 2016 presidential campaign for Donald Trump,” wrote Slate’s April Glaser under “Congress Now Has A Very Full, Very Ugly Picture of How Russia Targeted Black Americans” – subhead: “Will lawmakers finally do something about it?”
“A new report that was put together by the Oxford University Computational Propaganda Project and a network analysis firm for the Senate Intelligence Committee reveals the scope of the Russia disinformation campaign was larger than previously known and sought to directly benefit Republicans and President Trump,” wrote Nicole Lafond for Talking Points Memo under “Report Prepared for Senate Shows Scale of Russia Disinformation Larger Than Expected.”
“Russia’s Bid to Help Trump Revealed as Much Wider Than Once Known,” read the headline on a Bloomberg story by Steven Dennis, Ben Brody and Sarah Frier. “Fake posts led to hundreds of millions of views, reports say,” read one subhead.
In a New York Times story headlined “Social Media’s Forever War,” reporter Kevin Roose wrote: “One takeaway from these reports might be that the Russian influence campaign of 2016 was a freak occurrence enabled by a perfect storm of vulnerabilities: growth-obsessed social media companies, unsuspecting intelligence agencies and an election featuring two hyper-polarizing candidates, one of whom had a Russian blind spot and an army of supporters willing to believe convenient lies and half-truths.”
Tuesday’s fresh news – sculpted in part to respond to a statement by a Facebook executive that most Russian spending on social media ads took place after the election – is that indeed these companies continue to help Russians promote Trump and undermine his opponents.
The Washington Post, which broke the story of the reports on Monday, followed up on Tuesday with “Russian disinformation teams targeted Robert S. Mueller III, says report prepared for Senate.”
The lead, by reporters Craig Timberg, Tony Romm and Elizabeth Dwoskin, read: “Months after President Trump took office, Russia’s disinformation teams trained their sights on a new target: special counsel Robert S. Mueller III. Having worked to help get Trump into the White House, they now worked to neutralize the biggest threat to his staying there.”
“The effort started earlier than commonly understood and lasted longer while relying on the strengths of different sites to manipulate distinct slices of the electorate,” Timberg, Romm and Dwoskin wrote, citing the reports.
Mother Jones reported that Russians seeking to influence American politics made 2,611 posts on Instagram in 2016 but 5,956 posts in 2017.
Bloomberg wrote that “Russia’s plot to wield social media sites to divide Americans and aid Donald Trump in the 2016 election was even more massive and sophisticated than previously understood, and efforts to disseminate disruptive messages continue.”
It pointed to efforts aimed at discouraging African-American voters. Researchers, it wrote, “found a cross-platform effort to target black Americans, often with memes about police brutality, and later feeding them voter suppression messages.”
The New York Times seemed to suggest more censorship would help. “Any new social network competing with Facebook, Instagram and Twitter will need to consider, from Day 1, how propaganda can be kept at bay,” the Times wrote. “It is no longer enough to build a platform, attract millions or billions of users, and then deal with the consequences.”