One month before the recent midterm elections in the Philippines, a political Facebook page called Malacañang Catering Events and Services shared an article that said Chinese tourists were largely responsible for lawless behavior on the resort island of Boracay.
The page called the tourists “dog eaters,” and referred to Chinese people as “ching chong.” It invited its followers to submit memes mocking them. The same page also posted a photo of what appeared to be a Chinese child defecating on a public street and called on its many followers to “fight together” against Philippine President Rodrigo Duterte and his close relationship with China.
Three years after Duterte’s 2016 campaign rode a wave of false stories, paid trolling, and the resulting Facebook engagement to victory, opposition candidates who once lambasted the president and his legions of digital disinformation agents have adopted some of the same tactics. The result is a political environment even more polluted by trolling, fake accounts, impostor news brands, and information operations, according to a new study.
Alarmingly, this uptick occurred in spite of Facebook investing in third-party fact-checking and acting to remove pages and accounts that violated its policies — including the takedown of a network belonging to a key Duterte social media adviser. The goal was to prevent the same outbreak of falsehoods, harassment, and digital manipulation that characterized the 2016 campaign.
Instead, these tactics became even more widespread among digital campaigns and were adopted by those who once condemned them, according to Jonathan Corpus Ong, an associate professor at the University of Massachusetts Amherst and a coauthor of the research. He told BuzzFeed News that the worsening situation in the Philippines’ digital environment is a warning sign to the US for 2020.
“I would stress that the Philippines could preview disinformation innovations in other countries,” he said.
That was the case in early 2016, when Duterte supporters’ successful weaponization of Facebook was followed by similar tactics in the online battles that preceded Brexit and Donald Trump’s election victory.
“[Duterte’s win] was the beginning because a month later it was Brexit and then Trump got the nomination and then you had the US election,” said Katie Harbath, Facebook’s public policy director for global elections, in a speech last year.
She referred to the Philippines as “patient zero” when it comes to the weaponization of digital platforms during elections.
By that measure, there is reason for global concern. The research by Ong and two colleagues paints a bleak picture of the ability of platforms such as Facebook to counter bad actors, and details the increasing professionalization of trolling and information operations within political campaigns.
Ong worked with Ross Tapsell of Australian National University and Nicole Curato of the University of Canberra to monitor social media and interview roughly 20 political consultants and digital workers engaged in trolling, spreading disinformation, and otherwise acting as paid digital touts for candidates during the midterms. A copy of their report was provided to BuzzFeed News in advance of its release later this week.
They write that “social media and disinformation have become more central and entrenched in the conduct of Philippine political campaigns” while “disinformation producers are becoming more insidious and evasive.”
Ong and his colleagues show that new election rules aimed at bringing more transparency to digital campaigns, as well as efforts by Facebook to support fact-checking and execute takedowns, didn’t beat back the tide of digital disinformation. The official and unofficial digital campaigns supporting Duterte and the opposition simply adapted to new rules and instituted countermeasures to avoid being flagged by fact-checkers or the platforms. And the PR firms executing these tactics still face little or no accountability, Ong said.
The result was that the use of trolls, false news, misleading memes, microtargeting, and other so-called black ops was even more widespread in 2019 than in 2016. Budgets for social media campaigns also increased, according to Ong and his colleagues.
“This is practiced by both Duterte’s allies and the opposition, and even by politicians who previously decried the rise of disinformation practices, showing they felt they had to adapt, rather than continue to oppose, these new forms of digital campaigning,” they write.
“Facebook has become a very big business here,” one person running a social media campaign in support of Duterte told them.
Nathaniel Gleicher, Facebook’s head of cybersecurity policy, said in a statement to BuzzFeed News that the company uses a variety of methods to identify and root out bad actors, and is committed to disrupting those trying to manipulate public debate.
“This includes automation tools that have become extremely effective in catching fake accounts, spam and other types of abuse, and our team of expert investigators who are focused on uncovering the most sophisticated information operations,” he said in an emailed statement. “We’ve disrupted two of these operations in the Philippines this year.”
One key theme in the new digital tactics of 2019 was what the researchers call “micro-media manipulation.” This involved seeding “political propaganda aimed at discreet groups of potential voters” and took the form of recruiting ostensibly nonpolitical influencers on Facebook, Twitter, and Instagram to spread messages.
One such example is Senyora, a pop culture Facebook page with more than 3.6 million fans that’s named after a character from a Mexican soap opera. The page promoted Senator Nancy Binay during the election, and “Senyora” even coauthored a self-help book with the politician.
Other, much smaller accounts were also recruited to post supportive messages for politicians, help hashtags trend, and attack the other side. This was a departure from 2016, when the focus was on overtly political influencers with large followings, such as the “queen of fake news,” pop star and model Mocha Uson.
The micro approach also took the form of targeting closed Facebook groups, which receive far less moderation than other parts of the platform. Campaigns also made a bigger investment in seeding political content on Facebook pages dedicated to local news.
The researchers said these and other tactics in 2019 were “undercover operations aimed at hacking attention and manipulating conversations at the level of small communities and private groups.”
The shift by the opposition Liberal Party to adopt some of Duterte’s playbook, rather than condemn it, was also a major change from 2016. But it was unsuccessful at the ballot box, as none of its slate of senatorial candidates were elected. Duterte’s candidates won all 12 Senate seats up for grabs.
Ong said the standard practice is for someone at arm’s length from a candidate to hire a black hat PR firm and ensure there is a level of plausible deniability. These operations supplement the official digital campaigns of candidates.
“Sometimes [the candidate] is not so knowledgeable of all the social media executions happening on their behalf,” Ong said. “So sometimes there could be a businessman who wants a particular politician to win. So what this [businessman] might do would be to contract a digital operator on behalf of that politician.”
Tapsell, his coauthor, said there’s a significant difference between the official campaign’s activities and these shadow efforts. He said a candidate’s digital consultants “attend to the official social media pages of the candidate, as well as the ‘positive messages’ on social media that are policy-oriented and related to good governance that the candidate will undertake. In contrast to black campaigning which is libelous and informal.”
Thanks to the lessons learned from 2016, which increased domestic demand for manipulation services this election cycle, trolling operations are more professionalized than ever in the Philippines. The country is a global center for outsourced digital labor, which has helped create a large population of tech-savvy, English-speaking young workers. These workers, who might otherwise work in content moderation, digital marketing, or call centers for Western companies, can now also find contract work as professional trolls when campaign season heats up.
The existence of these professionally run operations is a concern for some disinformation researchers, who say they could be hired to try to influence campaigns in the US or other English-speaking countries.
“This is what disinformation will look like in the U.S. in 2020,” said Camille François, the chief innovation officer of network analysis company Graphika, in a recent Washington Post story focused on the political troll farm industry in the Philippines.
Evading Facebook and YouTube
Leading up to Election Day, Facebook invested significant resources to remove accounts and pages in violation of its policies. It announced takedowns of pages and accounts engaged in what it described as “coordinated inauthentic activity.” It attributed one network to Nic Gabunada, who helped lead Duterte’s social media efforts in 2016. This put digital campaign leaders on notice that they could lose their networks and be called out publicly. The company also launched a digital literacy effort in the Philippines.
Ong gave Facebook credit for the takedowns and for engaging with the Philippine elections commission and election watchdog, unlike other big tech platforms. But he was concerned that Facebook chose to publicize some takedowns and not others, creating the perception of an unequal level of transparency.
“Some senators’ fake accounts were called out here while others were taken down discreetly without press hoopla,” he said.
Facebook said in response that it announces takedowns related to coordinated inauthentic activity, but does not issue public notices when it removes accounts, pages, or groups for other types of policy violations.
One encouraging change from 2016 was that in 2019 there were fewer outright fabrications pushed out by prominent Duterte-aligned influencers, according to Ellen Tordesillas, who runs the Vera Files fact-checking website, one of Facebook’s partners in the Philippines. But this led to another, perhaps even more troubling, problem: They shifted to harder-to-check material that served the same purpose.
“The disinformation is more subtle and maybe more effective because it’s easier for the audience to absorb it. It’s also more difficult to fact-check. You can’t level it outright as ‘false,’” she said.
As an example, Tordesillas pointed to a since-removed Facebook video that featured a song with the lyric, “The days of poverty and hunger are over.”
A shift away from big false claims to more general ones was a calculated strategy to avoid having the material downranked by Facebook, according to Ong. Content rated false by fact-checkers sees its distribution on Facebook drop dramatically. So rather than make false claims — such as a notorious fake video that targeted a woman politician in 2016 — digital operations walked up to the line of falsehood but didn’t cross it.
Ong said the focus on targeting niche audiences using private Facebook groups and other types of “micro-media manipulation” also helped campaigns avoid scrutiny. In the case of a Filipino Flat Earth group, the researchers saw the group’s moderators share “pro-Duterte, pro-Marcos, anti-vaccine, and anti-opposition posts.” Groups for overseas Filipinos were also a common target for political messaging.
Not every tactic was successful. In one case, a “thirst-trap” Instagrammer who typically posts images of his physique suddenly shared multiple images of an opposition senatorial candidate without any context or disclosure that the posts were paid. Ong called it “clumsy and ineffective” because “it didn’t feel authentic.” The posts were later deleted and the Instagrammer did not reply to a request for comment from BuzzFeed News.
Deleting content to erase the digital trail was also a tactic used by what the researchers called impostor news accounts, which sought to create confusion with legitimate news brands. One YouTube channel put the logo of a top Philippines TV newscast in its videos to falsely imply the content had originated there. The channel also altered text to avoid detection by YouTube’s moderation systems, and mass-deleted its partisan content once the election was over.
“For example, the channel deliberately converts some letters into numbers when using explicit language,” the researchers wrote. “In the aftermath of the elections, the channel purged thousands of hyper-partisan videos, replacing much of its content spreading disinformation with lifestyle videos.”
Ong said the tactics will continue to evolve, but the core concern is that digital disinformation campaigns have so quickly become entrenched in the political process.
“It is a new normal, and we haven’t really held the right people accountable,” he said. ●