The New York Times’ Farhad Manjoo reported on the ecosystem that allows disinformation to spread through Twitter and into mainstream and right-wing media, including a conspiracy theory surrounding the murder of a Democratic National Committee (DNC) staffer that was cooked up on message boards and eventually promoted by Fox News host Sean Hannity.
Though Facebook has enabled a fake news ecosystem that empowers total lies, anonymous Twitter bots have also played a part in undermining discourse and truth. Former FBI agent Clint Watts testified before the Senate Intelligence Committee in March about “how Russians used armies of Twitter bots to spread fake news using accounts that seem to be Midwestern swing-voter Republicans,” as NPR described it. And according to a McClatchy report, the FBI is investigating Russian operatives and far-right news websites for their use of bots to spread misinformation.
In a May 31 report, the Times’ Manjoo detailed the role of Twitter in spreading disinformation, fake news, and conspiracy theories, using the story of murdered DNC staffer Seth Rich as one example. The report noted that though “Hannity pushed the theory the loudest, … it was groups on Twitter — or, more specifically, bots on Twitter — that were first to the story and helped make it huge”:
[T]he biggest problem with Twitter’s place in the news is its role in the production and dissemination of propaganda and misinformation. It keeps pushing conspiracy theories — and because lots of people in the media, not to mention many news consumers, don’t quite understand how it works, the precise mechanism is worth digging into.
We recently saw the mechanism in action when another baseless conspiracy theory rose to the top of the news: The idea that the murder last year of Seth Rich, a staff member at the Democratic National Committee, was linked, somehow, to the leaking of Clinton campaign emails. The Fox News host Sean Hannity pushed the theory the loudest, but it was groups on Twitter — or, more specifically, bots on Twitter — that were first to the story and helped make it huge.
Hannity’s obsession with Rich’s murder is a strong example of how this ecosystem shapes media narratives. Hannity has come under fire and lost a number of his advertisers for his relentless promotion of a conspiracy theory alleging that Rich was murdered in 2016 because he had leaked DNC emails to WikiLeaks. Hannity continued promoting the theory even after Fox News retracted an online story making similar claims and after Rich’s family asked him to stop pushing the story. Hannity has a long history of pushing conspiracy theories.
Manjoo quoted an expert on internet propaganda who said Twitter bots amplify groups’ messages and allow them to “use Twitter as a megaphone” and eventually “manufactur[e] consensus” for ideas. Manjoo then explained how that works for conspiracy theories. First, groups take to message boards like Reddit or 4chan, or to Facebook groups, to “decide on a particular message to push.” Then bots “flood the network, tweeting and retweeting thousands or hundreds of thousands of messages in support of the story.” These tweets often include a “branding hashtag” such as #sethrich. The report noted that “the initial aim isn’t to convince or persuade, but simply to overwhelm,” and that as stories become Trending Topics, reporters are forced to respond, thereby aiding the propagandists’ spread of the story even as media outlets debunk it. Figures like Hannity then pick up the weaponized disinformation, attempt to legitimize it, and give it a larger platform, such as Hannity’s prime-time Fox News show and radio program:
“Bots allow groups to speak much more loudly than they would be able to on any other social media platforms — it lets them use Twitter as a megaphone,” said Samuel Woolley, the director for research at Oxford University’s Computational Propaganda Project. “It’s doing something that I call ‘manufacturing consensus,’ or building the illusion of popularity for a candidate or a particular idea.”
How this works for conspiracy theories is relatively straightforward. Outside of Twitter — in message boards or Facebook groups — a group will decide on a particular message to push. Then the deluge begins. Bots flood the network, tweeting and retweeting thousands or hundreds of thousands of messages in support of the story, often accompanied by a branding hashtag — #pizzagate, or, a few weeks ago, #sethrich.
The initial aim isn’t to convince or persuade, but simply to overwhelm — to so completely saturate the network that it seems as if people are talking about a particular story. The biggest prize is to get on Twitter’s Trending Topics list, which is often used as an assignment sheet for the rest of the internet.
I witnessed this in mid-May, just after the Fox affiliate in Washington reported that a private investigator for Mr. Rich’s family had bombshell evidence in the case. The story later fell apart, but that night, Twitter bots went with it.
Hundreds of accounts with few or no followers began tweeting links to the story. By the next morning, #SethRich was trending nationally on Twitter — and the conspiracy theory was getting wide coverage across the right, including, in time, Mr. Hannity. [...]
Because they operate unseen, bots catalyze the news: They speed up the process of discovery and dissemination of particular stories, turning an unknown hashtag into the next big thing. A trending hashtag creates a trap for journalists who cover the internet: Even if they cover a conspiracy theory only to debunk it, they’re most likely playing into what the propagandists want.