Senate Hearing on Social Media and Foreign Influence Operations: Progress, But There’s A Long Way to Go

Evelyn Douek
Thursday, September 6, 2018, 2:38 PM

Sheryl Sandberg testifies before the Senate intelligence committee. (Photo: US Senate)

Published by The Lawfare Institute
in Cooperation With
Brookings

Wednesday’s hearing before the Senate intelligence committee on “Foreign Influence Operations’ Use of Social Media Platforms” was, as Chairman Richard Burr called it, the “capstone” of the committee’s inquiry into Russian interference in the 2016 election. Committee members focused on what had been learned and what had changed over the course of the investigation, presaging future action in broad terms only. As a result, the hearing was much less dramatic than any of the committee’s previous three hearings on the matter—there were no stunning revelations about the extent of activity of foreign actors on the platforms, and the bulk of the day’s media coverage on the Hill focused on confirmation hearings for Judge Brett Kavanaugh’s nomination to the Supreme Court. But the lack of fireworks does not mean the hearing was not productive. It provided useful—if largely sound-bite-free—insight into how the lawmakers and tech companies are coming to grips with the new challenges that online platforms pose for democracies.

In contrast to the combative tone of previous sessions, lawmakers had largely congenial exchanges with Facebook Chief Operating Officer Sheryl Sandberg and Twitter CEO Jack Dorsey. The committee praised the extensive efforts taken by social media companies since 2016 to clean up their platforms, and senators congratulated themselves on how much they have learned in the past 18 months. Sandberg and Dorsey appeared largely contrite about the harm their companies’ platforms have caused and sincere about their efforts to do better. The committee reserved its harshest words for Google—or, rather, the empty chair where a representative from Google would have sat had it sent someone senior enough to satisfy the committee. (Google offered to send its top lawyer, Kent Walker, but turned down the committee’s requests for Alphabet CEO Larry Page or Google CEO Sundar Pichai.)

Members of the committee presented a bipartisan front in underlining the importance of tackling the threats posed by bad actors exploiting social media. Senators seemed to accept that everyone—tech companies, government and civil society alike—was caught flat-footed in the 2016 election and that there remains work to be done to clean up what Vice Chairman Mark Warner called “the dark underbelly of the entire online ecosystem.”

The Senate hearing was largely free of the partisan, political broadsides that surfaced during Dorsey’s afternoon testimony before the House Energy and Commerce Committee, in which one Democratic representative called the hearing “a load of crap” and Republican representatives cited “general agreement” that Twitter discriminated against conservatives. The most sensational aspect of the Senate hearing, by contrast, was Dorsey’s decision to live-tweet his testimony as he gave it.

Although less exciting, the Senate hearing was far more constructive. Lawmakers pressed the companies on a wide range of issues beyond the headline topic of foreign interference, including transparency around bots, sales of opioids, hate speech and harassment. While the discussion ranged broadly, three key themes are worth particular attention.

1. Despite optimism, regulation is still in lawmakers’ sights

In contrast to earlier hearings where lawmakers seemed awed by the scale and complexity of the issues, Burr opened the hearing by declaring that although “technology always moves faster than regulation … there are no unsolvable problems.” Lawmakers repeatedly affirmed the importance of collaboration between companies, between the public and private sector, and across the political aisle. Nevertheless, it appears unlikely that this collaboration will remain entirely voluntary. As Warner stated, “Congress is going to have to act … The era of the wild west of social media is coming to an end.”

Nevertheless, it is still unclear what form this regulation will take. No lawmakers explicitly endorsed the proposals outlined in Warner’s recent white paper on social media regulation, for example. As Sen. James Risch put it, “The problem has been really well laid out … [but] I’m still not hearing what will be done after this.” Burr closed by agreeing that “there is no clear path forward.”

This may not sound very encouraging. But it is still striking how far the conversation has come in such a short period of time. When Sen. Joe Manchin brought up the topic of Section 230 of the Communications Decency Act—a law sometimes called the “Magna Carta” of the internet, which provides immunity to tech companies for content posted on their platforms—Dorsey said he was “open to dialogue” about the provision. Sandberg pushed back, saying that the safe harbor provided by Section 230 has been critical in enabling companies to take proactive measures to clean up their platforms. Yet the fact that the exchange even occurred is remarkable. As Danielle Citron said on a recent podcast, “If you had told me ten years ago that we would ever even have a conversation about Section 230 and that lawmakers would be open to revising it, I would’ve told you you were out of your mind.”

Another stunning concession came when Sandberg agreed with Warner that Facebook had a moral and legal obligation to take down accounts that incentivize violence in countries such as Myanmar. As I have examined previously, Facebook’s legal obligations are murky at best and unenforceable at worst in these foreign jurisdictions beset with ethnic tensions exacerbated by social media. Warner seemed so surprised by Facebook’s acceptance of a legal duty that he repeated twice how happy he was to hear that Facebook acknowledged this obligation. It’s important that he did so—as he noted in closing, it is likely that ever more manipulation will take place that directly incentivizes violence. Recent in-depth reporting has shed light on incidents in India, Sri Lanka, the Philippines and Libya, amongst other examples.

2. Foreign interference is not about foreignness

Although the Senate hearing ostensibly focused on foreign influence operations using social media platforms, it became clear that the companies themselves do not focus on foreignness—that is, whether posts or profiles originate from outside a particular country’s borders. This was clearest in an exchange with Sen. Roy Blunt, who asked how the companies differentiate between foreign and domestic influence operations. Sandberg said that this was not how Facebook determined takedown decisions—instead, she said, “our focus is on authenticity,” meaning content or profiles that are “deliberately misleading people about their identities and intentions.” In other words, the question for Facebook is not where the content originates, but whether that origin is transparent.

Similarly, Dorsey stated that Twitter does not focus on whether political content originates abroad in determining how to treat it. Because Twitter, unlike Facebook, has no “real name” policy, Twitter cannot prioritize authenticity. Dorsey instead described Twitter as focusing on the use of artificial intelligence and machine learning to detect “behavioural patterns” that suggest coordination between accounts or gaming the system. In a sense, this is also a proxy for a lack of authenticity, but on a systematic rather than an individual scale. Twitter’s focus, according to Dorsey, is on how people game the system in the “shared spaces [on Twitter] where anyone can interject themselves,” rather than the characteristics of profiles that users choose to follow.

It may seem counterintuitive to use proxies to take down foreign influence operations rather than just directly looking at whether content originating from abroad is seeking to influence domestic politics. But both companies admitted several times that they cannot always tell where an account is—bad actors are able to mask their location, and they are getting better at doing so. As Sandberg said when asked whether Facebook could even track sources of money, “there are a lot of ways to game the system.” Dorsey further explained that the intention of an actor is not a viable signal for determining if content should be taken down—intention is not something that Twitter has information about or can always infer. When it comes to intention, Dorsey said, the company relies on law enforcement, which will have information that private companies do not have access to.

But more fundamentally, focusing on foreignness and where content originates is at odds with the companies’ missions. Throughout the hearing, Dorsey repeatedly called Twitter a “public sphere,” which would “default to free expression.” Sandberg told the committee that Facebook’s mission is to “connect the world.” These statements highlight a basic tension: the companies acknowledge their platforms’ vulnerability to “foreign influence operations,” yet struggle to identify criteria for mitigating that vulnerability that are both technically operationalizable and theoretically consistent with their missions. Choosing the right criteria can also help companies steer clear of fraught issues. As Sandberg put it, “On a lot of issues that we face, like hate speech, there’s broad debate. [But] when it comes to things that are inauthentic ... they’re hard to find, but once we find them, we know what they are.”

3. Technology is not neutral

Although questions of political bias were confined to the afternoon hearing before the House committee, the morning Senate session also acknowledged that platforms cannot remain entirely neutral because choices of platform design are choices about the values platforms prioritize. As Sen. Marco Rubio put it, the decisions made in this area “define what your companies are.”

Both Sandberg and Dorsey accepted that the designs of their platforms create particular incentives for users and thus shape public discourse. Both also accepted that their companies need to do better in aligning those user incentives with the underlying values of their platforms.

But what specifically are those values? Sens. Rubio and Tom Cotton unsuccessfully sought to extract commitments to explicitly American ideals. Dorsey stated that Twitter is “an American company,” but also talked about consistency in applying its terms of service across different countries and the importance of complying with legal processes. Perhaps the biggest concession came from Sandberg, who said that Facebook would only operate in countries in which it could do so according to its values, and agreed that Facebook would not work in China until this was the case. But still, neither company fully bought into the idea that it was responsible for spreading an American conception of democracy around the world.

The value of free speech remained front and center, but the witnesses were much readier to acknowledge the limits of this principle. While Facebook still professes the fundamental view that “bad speech can be countered by good speech,” Sandberg was also forceful in denouncing hate speech, saying it had no place on Facebook’s platform. Dorsey agreed, speaking about Twitter’s new focus on “what healthy conversation looks like” and its role in amplifying conversation.

***

In some ways, this week’s hearing was a testament to how far both lawmakers and tech companies have come in understanding, defining and starting to combat the problems with social media that were highlighted by the 2016 election. But it also underlined how far there is to go. There is agreement on many of the problems, and no shortage of ideas for solutions. Implementation, however, is another issue. As Warner put it, “getting into how we actually spell all that out will be a challenge.” The two-year investigation so far is just the start.


Evelyn Douek is an Assistant Professor of Law at Stanford Law School and Senior Research Fellow at the Knight First Amendment Institute at Columbia University. She holds a doctorate from Harvard Law School on the topic of private and public regulation of online speech. Prior to attending HLS, Evelyn was an Associate (clerk) to the Honourable Chief Justice Susan Kiefel of the High Court of Australia. She received her LL.B. from UNSW Sydney, where she was Executive Editor of the UNSW Law Journal.