
Foreign Influence Operations After 2020

Emerson Brooking
Wednesday, October 28, 2020, 11:26 AM

The past four years have seen extraordinary growth in the study of foreign influence and social media manipulation. Over the next four years, the field will need to move toward sustainability and equitability.

People on their phones. (https://pxhere.com/en/photo/1559045; CC0 1.0, https://creativecommons.org/publicdomain/zero/1.0/deed.en)


In partnership with the Stanford Internet Observatory, Lawfare is publishing a series assessing the threat of foreign influence operations targeting the United States. Contributors debate whether the threat of such operations is overblown, voicing a range of views on the issue.

After 2016, the issue of foreign influence and social media manipulation seized the public’s attention. Broadcast media have aired hundreds of segments on the dangers of shadowy Russian bots and trolls. Meanwhile, a vast new research community—according to the Partnership for Countering Influence Operations, more than 400 distinct organizations and initiatives—has sprouted to study every dimension of the problem.

The result has been a perpetual national conversation about information operations originating from abroad. My team at the Digital Forensic Research Lab has tracked 80 distinct allegations of foreign interference since the 2018 midterms, of widely varying credibility and transparency. Articles about these claims have received a cumulative 26 million social media shares and engagements.

It’s a lot. But is it too much? Has the threat of interstate influence operations been overblown?

Since Josh Goldstein and Renee DiResta began this series, the issue has continued to rise in visibility. Recent days have seen what may or may not be another foreign hack-and-release operation targeting former Vice President Joe Biden, along with what was almost certainly an Iranian influence operation targeting Florida voters. With less than a week remaining before the 2020 election, it’s useful to tally the arguments of series contributors and draw out broader trends. It is also important to consider what might come next for the field of disinformation studies.

Laura Rosenberger and Lindsay Gorman don’t think that the threat of foreign influence operations is overblown. Instead, they argue, it represents one facet of the broader information conflict between democracies and autocracies: a struggle that will define geopolitics in the decades ahead. “This is a contest not just to wield digital tools but also to shape information realities,” they write. To spend too much time parsing the particulars of Russian troll behavior is “missing the forest for the trees.”

Darren Linvill and Patrick Warren similarly argue that the threat is not overblown. “To be effective,” they write, “foreign operations don’t have to be successful in whatever overt goal they may have—they simply have to exist.” They draw parallels to the fear that even a foiled terrorist attack provokes. If the national conversation is derailed by the specter of Russian bots, then actors like the Russian Internet Research Agency have done their job—and at a very low cost.

On the other side of the debate, Claire Wardle asserts that coverage of foreign influence has now veered into overcorrection. The result has been the rise of a “disinformation industrial complex,” powered by a “zealous belief that Russians were under every bed.” This obsession has imposed a grave opportunity cost, drawing away resources that might have been better spent tracking and confronting domestic disinformation in the United States.

Yochai Benkler also believes that the threat has been overblown. Too often, he argues, the efforts of a few Russian trolls have been given disproportionate attention and credited with outsize influence. “[W]ithout pickup from more influential media … these Russian efforts [would] languish unnoticed,” he notes. This speaks to a broader media failure. “To appear powerful and dangerous, all Russian actors need to do is make sure they are described as powerful and dangerous by credible sources in the United States. And for now … they are succeeding in doing just that.”

Finally, Josh Tucker echoes the “overblown” position, focusing on the paucity of evidence that Russian bots and social media personas ever actually influenced voter behavior in the 2016 election. He notes the small pool of “persuadable” voters and the trivial reach of the Russian effort as compared to that of the presidential candidates and partisan influencers. “Coordinated campaigns … on social media are likely neither necessary nor sufficient to signify serious foreign threats to electoral integrity,” he concludes.

Interestingly, there is quite a lot of overlap between the two sides of this debate. Even as he questions the effectiveness of foreign trolls, Tucker acknowledges that the Russian hack-and-release operation laundered through WikiLeaks was potentially pivotal in deciding the 2016 election. Meanwhile, the skeptical Benkler nonetheless contends that influence operations can pose real dangers in more constrained information environments, as with the Russian campaign against the White Helmets in Syria. On the other side of the issue, Linvill and Warren readily admit that the effect of individual operations has been “sensationalized and overhyped.” Among this distinguished group, the facts are not really in contention.

Instead, whether someone believes the threat of foreign influence operations has been overblown depends a lot on where that person sits. Former policymakers like Rosenberger and Gorman, considering the future of global information conflict, see nothing “overblown” in publicizing these early experiments by Russia and other adversaries. For a communications and media scholar like Wardle, foreign influence operations are overblown, but mostly for the oxygen they steal from those trying to fight disinformation at home. For a political scientist like Tucker, they are similarly overblown because such overemphasis warps Americans’ understanding of modern electoral politics.

If there is a point of general agreement among all contributors, it is regarding the domestic polarization that has made the nation susceptible to foreign influence operations in the first place. As Benkler writes and others echo, “[T]he origins of this disorientation are not foreign. They are part of a much broader decline in trust in institutions in America, and that decline is itself the result of four decades of broad-based economic insecurity and cynical manipulation of genuine pain by domestic political and media elites.” After all, Russian propagandists did not kill George Floyd or cast 63 million votes for Donald Trump in 2016.

Critical questions remain about how to conceptualize foreign interference. One consideration is the extent to which overheated U.S. rhetoric around foreign influence can drive repressive actions elsewhere in the world. It is an unhappy truth that rhetoric around “fake news”—a term coined by journalists but quickly co-opted by President Trump—has been used to justify draconian speech codes in Thailand, Egypt, Singapore and elsewhere. As Gabrielle Lim writes for the Centre for International Governance Innovation, “[O]ur fears and concerns that foreign actors are somehow interfering with democracy … are, counterintuitively, allowing for the further erosion of democracy.”

One also wonders how the broader “disinformation industrial complex” might contract over the next few years. Put simply, it was a good time to be a self-declared counterterrorism expert in Washington, D.C., in 2002; less so by 2011. Since 2016, it has been a very good time to declare oneself an expert in foreign interference and social media manipulation. But what will happen when the spigot turns off again? How many donors, for instance, have funded counter-disinformation initiatives due mostly to their hostility to Trump? How might their funding decisions change under a different administration? There is a real danger that research standards may slip as organizations turn to more salacious claims of foreign interference to seize the attention of a diminishing pool of donors.

There also remains the difficult question of just how much foreign influence can be prevented—and how much should be. Russian interference in 2016 was clearly unacceptable. However, was it similarly unacceptable when a global network of K-pop fans brigaded the registration form of a Trump campaign event this summer? What of pro-Trump internet celebrities, based in Colombia, who have spread misinformation throughout Latino communities in the United States? There is a point where concerns about foreign influence collide with the fundamental reality of seamless global communication, in which most internet users have acquired a basic familiarity with American culture and politics. By and large, researchers have yet to think seriously about where these lines should be drawn.

The threat of interstate influence operations may or may not be overblown in the U.S.—but it is certainly not overblown in many other nations of the world. In August 2020, Facebook removed a U.S.-based network that targeted Bolivia and sought to spread disinformation in support of the military-backed government; in April, it removed a network of Egyptian and UAE-based accounts that sought to influence the civil wars in Libya and Yemen. In June, my colleagues at the Digital Forensic Research Lab identified a Tunisian operation that had reached millions of people across nearly a dozen African nations. Wherever central governance is weak, foreign influence operations have more power. And outside the English-speaking West, there are fewer people with the right connections and knowledge to catch these operations early.

The past four years have seen extraordinary growth in the study of foreign influence and social media manipulation. Over the next four years, the field will need to move toward sustainability and equitability—a task that will require maintaining high research standards, avoiding exaggeration and hyperbole, and extending to elections around the world the same vigilance that has been afforded to the United States. The stakes were high in 2016, and they remain high now.


Emerson T. Brooking is a Resident Fellow at the Digital Forensic Research Lab of the Atlantic Council, focused on disinformation, technology policy, and electoral security. He is the coauthor of LikeWar: The Weaponization of Social Media (Houghton Mifflin Harcourt, 2018) and has advised numerous U.S. military commands and government agencies regarding the evolution of information warfare and public diplomacy. Previously, Brooking was a Research Fellow at the Council on Foreign Relations, where he studied U.S. defense policy. He holds a B.A. in Political Science and Classical Studies from the University of Pennsylvania.
