An ambitious online disinformation campaign that impersonated major media outlets, used fake Twitter accounts to spread false articles, and targeted real journalists is likely linked to Iran, according to researchers who tracked it for close to two years.
Since early 2016, the operation published 135 fabricated articles on websites designed to mimic outlets such as The Guardian, Bloomberg, Al Jazeera, The Independent, The Atlantic, and Politico. In one example, a fake article claiming that six Arab nations called on FIFA to strip Qatar of its role as host of the 2022 World Cup was covered by Reuters, which caused other outlets to spread the disinformation. Reuters retracted its story once it realized the information originated on a website that had impersonated a real Swiss news outlet, but some of the false stories from other outlets remain online.
“It just underscores how difficult it is to verify information when you have bad actors spreading malicious content deliberately, and can disguise their motivations,” said Ron Deibert, director of the Citizen Lab at the University of Toronto’s Munk School, the research group that conducted the analysis and published it in a new report.
Deibert told BuzzFeed News this operation shows how “social media is being treated as a disinformation laboratory by a number of state and non-state actors.”
Citizen Lab concluded with “moderate confidence” that the network is tied to actors in Iran after analyzing domains, articles, and Twitter accounts among other information, and combining it with findings from investigations published by Facebook and FireEye. Citizen Lab did not find evidence linking it to the Iranian government, but did conclude that the operation was focused on spreading anti-Saudi narratives.
It remains unknown who led and executed the disinformation operation, which Citizen Lab dubbed “Endless Mayfly.”
The report provides essential new information about the rash of spoofed news websites and fake articles that generated scrutiny and media attention since 2016. In one well-known example, a website impersonating Belgian newspaper Le Soir published a false story claiming then-candidate Emmanuel Macron’s campaign for the French presidency was financed by Saudi Arabia. In another example, BuzzFeed News investigated a false article from a site that impersonated The Guardian and found that it had quickly gained traction in Russian media.
Deibert said Citizen Lab published its findings because the operation used new techniques to spread disinformation, and researchers, journalists, and the public need to be aware of how quickly these operations are evolving.
“It’s important to remind ourselves that Russian disinformation isn’t the only game in town,” he said. “This illustrates how difficult it is to have a healthy public sphere when we have an ecosystem set up to promote the opposite … it’s a perfect environment for the spread of disinformation.”
Citizen Lab began researching what became Endless Mayfly in April of 2017 after an article hosted on the spoofed domain independent.co.ukuk was posted to Reddit. The story appeared on a website that copied the design of the real Independent, a UK news outlet. The story falsely claimed that “Theresa May attempt to get away with Brexit consequences by ‘kissing up to Arab regimes’ in vein.” Along with analyzing details about the domain name, the lab identified social media accounts used to spread the story.
Over time, the researchers tracked 11 fake Twitter personas used in the operation. “The personas created by Endless Mayfly were typically thin, with limited depth beyond a Twitter bio and a history of tweeting on a narrow band of topics. Personas included fake students, journalists, and activists,” the report says.
These personas would tweet the false articles and in some cases contact legitimate journalists via direct message to try to get them to further amplify the content.
“Endless Mayfly Twitter personas repeatedly tweeted out links to the inauthentic articles, made strategic use of Twitter mentions targeting established journalists and activists, posted screenshots of the inauthentic articles, and sent private direct messages to journalists and activists,” the researchers write.
They also noticed that the fake articles on spoofed websites were often deleted after they gained traction on social media.
“Typically, after the inauthentic articles were posted to Twitter, amplified by third parties, or covered by mainstream media, Endless Mayfly deleted the content and redirected visitors to the legitimate media outlets that they were impersonating,” the report said.
Deibert said this tactic might create the impression for some that the fake story had originally appeared on the real site.
“Part of the characteristics of social media is short attention span and people focusing on high-level details,” he said. “It struck us as an innovation in disinformation tactics.”
The report dubs this technique “ephemeral disinformation” because “the message remains even though the evidence is ephemeral.”
Endless Mayfly adapted its approach over time. In the summer of 2017 it shifted away from creating spoofed domains of legitimate media outlets to host false articles. Instead, it used fake Twitter personas to publish articles on websites that allow members of the public to post content, such as Medium and BuzzFeed.
In one case, it placed a false story in the Community section of buzzfeed.com. That section allows members of the public to post their own content and is separate from the buzzfeednews.com domain where this story appears. The article in question falsely claimed that six Arab nations had told FIFA they object to Qatar hosting the 2022 World Cup. The post was removed after BuzzFeed News learned of its existence through an advance copy of the Citizen Lab report. It received a total of 17 views prior to being removed, according to analytics.
“While the Community section is a great place for BuzzFeed’s audience to share positive, original content, we have zero tolerance for posts that violate our guidelines — which prohibit ‘deceptive’ and ‘fraudulent’ posts,” Matt Mittenthall, a company spokesperson, said in an emailed statement. “We removed this piece as soon as it was brought to our attention, and remain vigilant about keeping BuzzFeed free of the kind of fake news and disinformation that has proliferated elsewhere on the internet.”
The false article uploaded to BuzzFeed Community claimed that Reuters and The Local, a Swiss online publication, were the source of the claim that Arab states were opposing Qatar’s World Cup. In fact, Reuters was fooled by a fake article published on a website masquerading as The Local. As part of its strategy of posting to third-party sites, Endless Mayfly continued to exploit the error even after Reuters retracted its story.
Deibert said the lab had found information linking Endless Mayfly to Iran during its investigation, but important new evidence arrived in August 2018. That month Facebook and cybersecurity firm FireEye announced they had identified a network of social media accounts and websites that were part of an Iranian information operation. Many of the accounts and websites identified by those companies had also been used to help amplify Endless Mayfly content, according to Citizen Lab. Soon after, Google and Twitter took action against the same network, citing “state-sponsored activity” and “coordinated manipulation.”
Deibert said this provided important outside confirmation of what his team had found. But he also emphasized that attribution is difficult when actors with different motivations can converge on the same narrative for different reasons.
“This is a very messy ecosystem — that’s precisely why you have this type of operation going on, hoping to push out a narrative and amplify it,” he said.
“Those who are looking to run information operations for whatever reason are seeing this time as one of experimentation in what works, what doesn’t, and what is going to have the greatest impact for the lowest effort.”
Ultimately, the report said it’s difficult to determine how much impact Endless Mayfly had, aside from its occasional success at fooling media outlets.
“While there is evidence that there was some interaction with the inauthentic articles and personas based on the number of clicks, retweets, and coverage from mainstream media, it is unclear to what extent the operations swayed public opinion.”