How to investigate — and understand — disinformation by foreign adversaries.

5 takeaways:

The Russians have figured out how to drive wedges between Americans, fueling the extreme levels of partisanship now common. Researchers analyzed a dataset of 70 million tweets from Jan. 1 to May 31, 2020, to probe disinformation, said William Marcellino, a behavioral scientist at the RAND Corporation. They found the Russians “were working both sides of the aisle” to inflame both conservative and liberal Americans. For example, the Jeffrey Epstein case was deployed to incense the right and the left alike. The Russians used both trolls (fake personas that purport to be American but aren’t) and “super-connectors” (networks of accounts that stayed just below Twitter’s limits on friend-follower ratios to avoid being identified as bots). “I can’t say for sure, but they look, talk and act a lot like a botnet — a network controlled by a computer program that wakes up, spreads some information and then goes quiet again,” Marcellino said.
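The friend-follower heuristic described above can be sketched in a few lines. This is an illustrative example only: the ratio cap, the margin, and the sample accounts below are all hypothetical, not Twitter's actual limits or real data.

```python
# Hypothetical sketch: flag accounts whose friends/followers ratio sits
# just under an assumed platform cap -- the "super-connector" pattern the
# researchers described. ratio_cap and margin are illustrative values.

def looks_like_super_connector(friends: int, followers: int,
                               ratio_cap: float = 1.1,
                               margin: float = 0.05) -> bool:
    """Return True if the friends/followers ratio is suspiciously close
    to, but still under, the assumed platform cap."""
    if followers == 0:
        return False
    ratio = friends / followers
    return (ratio_cap - margin) <= ratio < ratio_cap

# Hypothetical accounts for illustration.
accounts = [
    {"name": "acct_a", "friends": 5300, "followers": 5000},  # ratio 1.06
    {"name": "acct_b", "friends": 300, "followers": 9000},   # ratio ~0.03
    {"name": "acct_c", "friends": 5490, "followers": 5000},  # ratio 1.098
]
flagged = [a["name"] for a in accounts
           if looks_like_super_connector(a["friends"], a["followers"])]
print(flagged)  # → ['acct_a', 'acct_c']
```

A real investigation would pull follower and friend counts from the platform's API and treat a flag like this only as a starting point for manual review, not proof of coordination.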

Researchers identified 11 tactics that journalists can investigate. They include tailoring disinformation to pit different groups against each other; amplifying conspiracy narratives; conducting hack-and-leak operations, such as WikiLeaks releases; and supporting secessionist ideas and movements. The tactics are usually used in concert, so don’t focus on any single one. The goal is to evoke emotional reactions in individuals, inflame hyper-partisanship, and prevent national consensus, said Marek Posard, a military sociologist who has studied foreign interference in U.S. elections.

Journalists can use four types of tools to investigate suspicious online activity. Marcellino recommends Google Images and TinEye for reverse image searches to vet photographs posted on social media. “It might very well be that the inflammatory picture you see is actually not from a Black Lives Matter protest last year — it’s from Venezuela four years ago,” he said. To detect bots and spam, he suggests Botometer, which “gives you a pretty decent” rating of the likelihood that an account is a bot; the site will also show you the followers of any account you enter. To check who is spreading what kind of information across which platforms, journalists can use the CrowdTangle Link Checker extension for the Chrome browser, or WhoPostedWhat. “What’s really cool is, if you are logged into [Facebook, Reddit or Twitter] you can very quickly take, say, a suspicious or an interesting link and see who shared it, how often, and what platforms,” Marcellino said. If you are suspicious of media bias and want to check the credibility of a source, try Media Bias Fact Check. (Twitter is also trying to crowdsource misinformation tracking with its Birdwatch program, currently open to selected users only.)
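Bot-scoring tools like Botometer report a likelihood on a 0-to-1 scale, and a reporter still has to decide what to do with each score. A minimal triage sketch is below; the cutoffs are assumptions chosen for illustration, not thresholds recommended by Botometer.

```python
# Hypothetical triage of a 0.0-1.0 bot-likelihood score, the scale
# Botometer-style tools report. The 0.8 and 0.5 cutoffs are assumed
# values for illustration, not official guidance.

def triage(score: float) -> str:
    """Bucket a bot-likelihood score for a reporter's worksheet."""
    if not 0.0 <= score <= 1.0:
        raise ValueError("score must be between 0 and 1")
    if score >= 0.8:
        return "likely automated"
    if score >= 0.5:
        return "suspicious - verify manually"
    return "likely human"

# Hypothetical handles and scores for illustration.
for handle, score in [("@acct_x", 0.92), ("@acct_y", 0.55), ("@acct_z", 0.12)]:
    print(handle, triage(score))
```

In practice a score is one signal among many; a high score warrants checking the account's history, posting cadence and follower network by hand before reporting it as a bot.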

Russian methods are evolving, including COVID-19 and vaccine disinformation campaigns aimed at boosting the Sputnik vaccine. “They are doing a lot to promote disinformation on vaccinations to weaken people’s support for U.S.-made vaccines, I think, ideally, maybe to improve support for the Sputnik vaccine,” said Todd Helmus of the RAND Corporation. Other tactics include developing proxy web pages, proxy news organizations and a “terribly executed campaign to get influencers in Europe to start promoting some of their anti-vax content.” But the key technique of appealing to people’s angriest emotions has not changed, Marcellino said. “Americans have an appetite for this kind of stuff. Many of us are willing to believe that that conservative over there is the devil, and we should hate them, not our neighbor who we disagree with,” Marcellino said. As long as Americans want to share it, it will be effective — and other countries, including China, are learning to play the same game.

Nobody wants to be played by a foreign power. In one study, researchers interviewed people who self-identified as strongly Republican or Democratic and as hating the other side, Posard said. Then the participants were informed that what they were seeing was Russian-made content. “And it was interesting, no matter how strong your beliefs about Republicans or Democrats, or Hillary Clinton or Joe Biden or Donald J. Trump, nobody likes a foreign adversary showing up to Thanksgiving dinner and picking fights between family members. And that’s exactly what they’re doing,” Posard said. Partisans “will gladly hit people on the other side of the aisle if they already hate people on the other side of the aisle. They just do not want Russia or China to tell them to hate.”

You may also like: Cybersecurity Villains and Superheroes


Speakers: 

Todd Helmus, Senior Behavioral Scientist, RAND Corporation

William Marcellino, Senior Behavioral & Social Scientist, RAND Corporation

Marek Posard, Social Scientist and Project Leader, RAND Corporation


This program was funded by Microsoft, the RAND Corporation and donations to the National Press Foundation. NPF is solely responsible for the content.

Covering Cyberconflict 2021
Three RAND researchers explain Russian disinformation