The Afterlife of Wagner's Yevgeny Prigozhin

Matt Burgess, WIRED

Posts praising the Wagner Group boss following his death in a mysterious plane crash last month indicate he was still in control of his "troll farm," researchers claim.

YEVGENY PRIGOZHIN’S wartime atrocities propelled the brutal mercenary into the limelight. But Prigozhin, once a small-time criminal who became Russian president Vladimir Putin’s chef, was also one of the world’s biggest disinformation peddlers. For years, he operated the notorious Internet Research Agency, a Russian troll farm that meddled in US elections and beyond.

When Prigozhin suddenly died in a mysterious plane crash on August 23, around two months after he led his Wagner Group mercenaries in a failed mutiny against Putin, the trolls didn’t stop posting. Instead, according to a new analysis shared with WIRED, some continued to show their support for him.

In the days immediately after his death, a coordinated network of pro-Prigozhin accounts on X (formerly known as Twitter) pushed messages portraying the warlord as a hero who was good for Russia, despite the Wagner Group’s failed June rebellion against Putin. These messages also blamed the West for the plane crash and said that the Wagner Group would continue operating in Africa.

“It was not profitable for Putin to kill Prigozhin. PMC [private military company] carry a lot of weight in Africa, and Prigozhin skillfully managed it, despite his ‘character quirks,’” one account posted on X. “Prigozhin served for the good of Russia, remained faithful to his military oath, and was killed by saboteurs, or terrorists mined the plane,” another speculated. “In short, he just ditched his phone and disappeared into the sunset, just like in a typical action movie,” a third posted.

The organized accounts were all identified and shared with WIRED by Antibot4Navalny, an anonymous group of volunteers who track Russian-language influence operations on X. A person behind the group, whom WIRED granted anonymity due to safety concerns, says they started inspecting the posts of suspected X accounts after the crash when they “noticed that Prigozhin is surprisingly covered in an exclusively positive light.” The group found 30 accounts pushing pro-Prigozhin narratives, they say.

The activity could be a sign that Prigozhin remained in control of the Internet Research Agency troll factory until he died, the group claims, adding that it echoes patterns they had seen before. Reports have said that after the attempted June uprising, Prigozhin-owned news websites and the troll factory were being shut down or were looking for new owners. “Domestically, there was a lot of debate whether or not Prigozhin lost his control over the troll factory as one of the immediate aftermaths of the mutiny,” the Antibot4Navalny member says.

While the posts on X are only a tiny snapshot of social media activity, they highlight how Russian-linked propaganda has changed since the Internet Research Agency interfered in US politics in 2016, experts say. The Russian misinformation and disinformation industry has evolved into a rich ecosystem of state-backed media, massive Telegram channels, and more conventional social media posts. Millions of people follow so-called military bloggers and war journalists on Telegram—some of these channels are linked to the Russian state, while others are aligned with Prigozhin and the Wagner Group. But all can muddy the waters or repeat set lines.

“Confusion in the information space is one of the aims of the Kremlin information operations—to make everything equally unbelievable so people’s trust in all kinds of sources is undermined,” says Eto Buziashvili, a disinformation and influence operations researcher with a focus on Russia at the Atlantic Council’s Digital Forensic Research Lab. Since the start of its full-scale war in Ukraine in February 2022, Russia has blocked and censored social media websites, banned independent news media, and pushed reams of disinformation.

Kyle Walter, head of research at misinformation and disinformation research company Logically, reviewed the posts shared by Antibot4Navalny and says they show “signs of being inauthentic.” The X accounts were largely created earlier this year, post little original content, mostly retweet or reply to other accounts, and in some cases follow one another, Walter says. The themes the accounts posted about around the plane crash also match what Logically has seen from monitoring Telegram channels linked to the Wagner Group, he says. Walter adds, however, that linking the accounts directly to the Internet Research Agency is harder to do.
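
Neither Logically nor Antibot4Navalny has published code, but the signals Walter lists can be sketched as a simple scoring pass over account metadata. The sketch below is purely illustrative: the `Account` fields, date cutoff, and ratio thresholds are hypothetical stand-ins, not anything drawn from X’s API or Logically’s tooling.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class Account:
    """Hypothetical account metadata; field names are illustrative."""
    handle: str
    created: date
    original_posts: int
    retweets: int
    replies: int
    follows: set[str] = field(default_factory=set)

def inauthenticity_signals(acct: Account, cohort: list[Account]) -> list[str]:
    """Flag the signals Walter describes: recent creation, little
    original content, mostly amplification, and mutual follows."""
    signals = []
    total = acct.original_posts + acct.retweets + acct.replies
    if acct.created >= date(2023, 1, 1):  # "largely created earlier this year"
        signals.append("recently created")
    if total and acct.original_posts / total < 0.1:  # low original output
        signals.append("low volume of original posts")
    if total and (acct.retweets + acct.replies) / total > 0.8:  # mostly amplifies
        signals.append("mostly retweets and replies")
    if any(o.handle != acct.handle and acct.handle in o.follows
           and o.handle in acct.follows for o in cohort):  # accounts follow each other
        signals.append("mutual follows within the cohort")
    return signals
```

No single signal is conclusive on its own; it is the combination across a cohort of accounts that researchers treat as suggestive of coordination.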

The Antibot4Navalny researcher says that based on their previous research, they believe that the pro-Prigozhin trolls operate in similar ways. They “primarily serve” the interests of Putin, but they also push pro-Prigozhin narratives when it doesn’t “hurt” the Russian president, the researcher says. The approach “still worked in the plane-crash episode: Cover Putin as strongly as possible, but also, it is a nice opportunity to praise Prigozhin,” they say. The researcher says they are reporting the accounts to X.

As well as the posts around the plane crash, the Antibot4Navalny group also shared previous research and analysis with WIRED. In one instance, the group reported more than 7,000 suspected accounts to X. We tested dozens of these accounts and found that they have all been removed from the Elon Musk-owned social media company. Antibot4Navalny says the “troll” accounts are often active in groups, pushing the “same set of talking points” and mostly replying to tweets about news related to Russia and Ukraine or pro-Ukrainian channels. X did not immediately respond to WIRED’s request for comment.
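
One crude way to surface that kind of coordination, sketched here purely for illustration (Antibot4Navalny has not described its method in this detail), is to compare post texts pairwise and flag account pairs that keep publishing near-identical wording. The 0.7 similarity cutoff below is an arbitrary assumption.

```python
from itertools import combinations

def jaccard(a: str, b: str) -> float:
    """Token-overlap similarity between two post texts."""
    ta, tb = set(a.lower().split()), set(b.lower().split())
    return len(ta & tb) / len(ta | tb) if ta | tb else 0.0

def coordinated_pairs(posts: dict[str, list[str]], threshold: float = 0.7):
    """Yield pairs of accounts that posted near-identical talking points.

    `posts` maps an account handle to its post texts; the threshold
    is an arbitrary illustrative cutoff, not a published parameter.
    """
    for (acct_a, texts_a), (acct_b, texts_b) in combinations(posts.items(), 2):
        matches = sum(1 for x in texts_a for y in texts_b
                      if jaccard(x, y) >= threshold)
        if matches:
            yield acct_a, acct_b, matches
```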

On July 14, the Antibot4Navalny researcher says, some of the accounts they have tracked replied to posts discussing comments from Putin, who said that the Wagner Group “does not exist” and that there is no legal basis for the group. The accounts, the researcher says, sent messages saying that Wagner operated legally and referenced Concord, the catering company owned by Prigozhin. The Antibot4Navalny researcher claims that the points were not included in any Kremlin-controlled media and that mentions of the company “served interests of the troll factory/its owner—rather than interests of the Kremlin.”

Buziashvili, the Atlantic Council researcher, says she believes the troll factory is still operating. “Part of them might be still supporting Prigozhin,” she says. “For most of the people who were working there, they would just continue their work regardless of who is their current boss.”

Following the plane crash, Buziashvili says, Russian officials and state media pushed multiple “theories” simultaneously. On one TV show, both the UK and NATO were blamed for the crash, she says. Other accounts blamed Ukraine or claimed that Prigozhin was not killed at all. Pro-Wagner Telegram channels, meanwhile, pushed claims that the plane was shot down by Russian aviation, Buziashvili says, and that they wanted “revenge.” Nobody has formally claimed responsibility for the explosion; both Putin and Ukraine have denied involvement.

For all the changes in the information ecosystem, the volume of Russian disinformation on social media remains colossal. During the first year after Russia launched the war in Ukraine, its disinformation reached an audience of “at least” 165 million and generated 16 billion views across Facebook, Instagram, Twitter/X, YouTube, TikTok, and Telegram, according to a European Commission study of Russia-linked activity published last week. Subscribers to pro-Kremlin Telegram channels have “more than tripled” since the start of the war, the report says. “Preliminary analysis suggests that the reach and influence of Kremlin-backed accounts has grown further in the first half of 2023, driven in particular by the dismantling of Twitter’s safety standards.”

“What we typically see these days is narratives formulated on Telegram,” says Logically’s Walter. The company has recently found pro-Russian channels pushing disinformation about Niger’s military coup, and it has also linked a Russian fact-checking website and Telegram account to a presenter on Russia’s “biggest” propaganda TV show. “You have Western influencers who are sympathetic to Russian causes that will translate those narratives and then share them on mainstream platforms. And they circulate more broadly,” he says.

Walter says that over time, and largely because of the war, it has become easier for Russian propaganda and disinformation to attack the West. “From a tactics perspective, there’s a lot less direct involvement that we can attribute from the Russian state itself, and it is more these proxies,” he says. “Russian disinformation efforts are gradually adapting to any sort of Western countering that gets put in place.”
