Foreign Relations & International Law

How the Kremlin Uses Agenda Setting to Paint Democracy in Panic

Lindsay Hundley
Thursday, February 11, 2021, 8:01 AM

The Kremlin’s old media tactics illustrate the persistent challenges posed by influence operations.

The Kremlin at dusk. (Pavel Kazackhov, https://tinyurl.com/1e0l0kjp; CC BY 2.0, https://creativecommons.org/licenses/by/2.0/deed.en)

Published by The Lawfare Institute
in Cooperation With
Brookings

Since November 2020, the world has watched the presidential transition in the United States with unease. After a violent mob of Trump supporters stormed the U.S. Capitol on Jan. 6 in an effort to overturn Joe Biden’s election, headlines around the world questioned, for the first time, whether a democratic transfer of power would occur as expected. These reports also highlighted the well-documented risk of violence at President Biden’s inauguration.

Thankfully, the threat from right-wing extremists did not materialize at the inauguration. Instead, many world leaders have celebrated the occasion as “a demonstration of the resilience of American democracy.”

But Russian media tell a different story. By flooding the front pages of its state-led outlets with headlines of continued unrest, opposition criticism and government suppression, the Kremlin has pulled out an old playbook in its efforts to sway global opinion against the promise of Western liberalism. And compared to the shadowy bots and trolls we’ve come to associate with Russian influence operations, these tactics may prove even tougher to counter.

The Kremlin’s main method of influencing opinion rests on a seemingly benign tool: the topics that its state-led media report. Reading through Russia Today, you’d be left with the distinct impression that the presidential transition did more to undermine American democracy than to signify its resilience. Headlines abound with stories highlighting Republicans’ allegations that liberals and Big Tech are seeking to stifle free speech. Dozens of articles are dedicated to questioning the objectivity of U.S. media in their reporting on the Biden administration and to documenting Americans’ declining trust in U.S. institutions. And the striking images of violent unrest—not only at the Capitol but also from local protests and riots—are hard to miss.

Here’s the thing: None of these stories is false, and much of the reporting refrains from overt editorializing. But these stories are designed to leave readers and viewers with particular beliefs about the attractiveness of the U.S. political system.

The Russian media, and the Kremlin by proxy, are using what scholars of political communication refer to as agenda setting in order to sway public opinion. By choosing what to report on, news outlets also influence what the public tends to think about. For instance, if the media run numerous stories on criminal activity, readers tend to report higher anxiety about crime and are more supportive of increased policing—regardless of actual crime levels in the area.

Of course, all media outlets engage in agenda setting when selecting which stories to run, but governments that exert control over their media can use agenda setting for political ends. A recent study documented how Russian state-led media run an excess of stories about the United States in periods of Russian economic downturn to distract domestic audiences from the Kremlin’s policy failures. Subtle strategies of media manipulation like this are especially useful to authoritarian governments like Russia’s that rely to some degree on popular support, rather than ruling purely through coercion.

The strategic use of agenda setting features prominently in the Russian government’s efforts to undermine democracy globally. By selecting stories that portray liberal states as chaotic and dysfunctional, Russian President Vladimir Putin seeks to minimize the risk that Russians will take to the streets to demand democratic reform at home—as they have in the protests that erupted after the Jan. 17 arrest of opposition leader Alexei Navalny.

Many commentators have noted that Putin’s information campaigns against the West are rooted in his fears of “color revolutions”—a term used to refer to a series of pro-democratic uprisings that led to the removal of authoritarian leaders in Georgia, Ukraine and Kyrgyzstan after fraudulent elections in the 2000s. But much less attention has been given to an important question: What can the U.S. learn from the Kremlin’s efforts to sway public opinion on the value of these movements?

My research shows that the Kremlin’s tactics for countering the color revolutions are remarkably similar to those being used in reporting on the U.S. presidential transition today. Instead of peddling deliberately false or highly editorialized stories, Russian state media increased their reporting on protests and opposition allegations that newly elected regimes were engaged in undemocratic behavior. Prior to the color revolutions, state media typically reserved coverage of voices critical of incumbent regimes for opposition to particular policies that the Russian government viewed unfavorably—like cooperation with the United States in the Iraq War.

In the case of the color revolutions, these simple tactics were also shockingly successful. In July 2005, only between 3 and 6 percent of Russians believed that the government transitions had improved the lives of citizens in Georgia, Ukraine and Kyrgyzstan. By comparison, around the same time, 40 percent of Ukrainians indicated that they continued to support the Orange Revolution that had concluded earlier that year.

Since Russia’s interference in the 2016 U.S. elections, analysts have shared a fascination with “fake news” and sophisticated methods of influence operations—be it increasingly tailored cyberattacks, the use of automated bots to amplify content on social media, or even the potential deployment of deepfakes. To be sure, these developments are all worrisome. But policymakers should not lose sight of the bread-and-butter media tactics that can make propaganda powerful in the first place.

Subtle manipulation strategies will likely remain significant tools of influence for the foreseeable future. While computers can be trained to detect inauthentic behavior or digitally manipulated media, bias in agenda setting is unlikely to have a clear technical solution. Like the challenge of “fake news,” combating agenda-setting bias veers quickly into thorny issues of regulating speech and media independence. But unlike “fake news,” there is little room for more reliable outlets to debunk state-led media since many stories do not rely on false claims. Likewise, while humans are not particularly skilled at identifying false news stories, recognizing agenda-setting bias is even more challenging; it requires evaluating the quantity of reporting on an issue and having a clear sense of what the appropriate amount of coverage should be.

How should democratic societies combat this trend? One promising strategy, adopted by both Twitter and Facebook last year, is to label accounts and content from state-controlled media on their platforms; a recent study found that such labels reduced engagement with Chinese state media on Twitter by more than 20 percent.

More broadly, curation and technological solutions will go only so far in solving a fundamentally social problem. Many commentators have argued that America’s best defense against influence operations is to get its own house in order. But even if the Biden administration lives up to that monumental task, dissent is still a crucial part of a healthy, functioning democracy. And that means there will always be room for others to paint democracy in panic.


Dr. Lindsay Hundley leads Meta’s policies on state media and supports the company’s efforts to counter covert influence operations. She has a PhD in Political Science from Stanford University and was previously a research fellow at Harvard Kennedy School's Belfer Center for Science and International Affairs and at Stanford’s Center for International Security and Cooperation.
