Terrorism & Extremism

De-platforming and the Online Extremist’s Dilemma

Bennett Clifford, Helen Christy Powell
Thursday, June 6, 2019, 2:09 PM

Telegram Messenger Desktop (Source: Wikimedia)

During the past several years, platforms like Twitter, YouTube and Facebook have used a combination of automated detection and human review to identify and remove extremist accounts and content from their sites—in effect “de-platforming” extremists from mainstream social media. Now, under the recent Christchurch Call initiative, a chorus of world governments is pushing major tech companies to expand takedown policies even further, with the goal of “eliminat[ing] terrorist and violent extremist content online.” Indeed, some platforms are more aggressively removing a broader range of extremist groups and hateful content. But how effective is de-platforming in combating extremism?

At the George Washington University Program on Extremism, we recently conducted a study of 636 English-language, pro-Islamic State groups and channels on the online instant messaging service Telegram. Our results point to an “online extremist’s dilemma” in using digital communications tools in the wake of de-platforming—an offshoot of Jacob Shapiro’s “terrorist’s dilemma,” which elucidates the challenges that all terrorist groups face in managing organizational priorities. The online version of the dilemma describes how extremists must balance public outreach against operational security when choosing which digital tools to use.

Recruiting potential new supporters and maintaining operational security are both essential to the continued survival of an online extremist milieu. Yet the two goals are inherently in conflict. Supporters of terrorist groups can try to reach a mass audience using mainstream social media, but doing so increases the risk of detection, infiltration by law enforcement or content removal. One study found that approximately 90 percent of a sample of U.S.-based extremists who used social media were arrested before carrying out their plots.

Alternatively, online extremists can choose to prioritize operational security: limiting all communication to closed platforms, developing security protocols and weeding out “spies.” The downside of this approach is that it makes recruiting nearly impossible. Closed forums keep out the general public, and administrators have difficulty distinguishing potential recruits from detractors.

This is where Telegram comes in. In response to takedowns on major platforms, extremists often migrate to lesser-known or protected online forums—among them Telegram, which offers communication in the form of secret chats, channels and chat groups. While a variety of extremists use the platform, perhaps no group has established a more prominent foothold than supporters of the Islamic State. After major social media companies increased efforts to remove Islamic State content in 2015, the organization’s global network of supporters adopted Telegram. Currently, the group’s online supporters use the platform to communicate internally, share pro-Islamic State propaganda, and distribute operational instructions across the world. Compared to platforms like Twitter and Facebook, Telegram affords Islamic State supporters a greater degree of anonymity and protection because of its encryption offerings and the company’s promise to “disclose 0 bytes of user data to third parties, including governments.”

Telegram offers both outreach and operational security, but it is limited in both arenas. The application offers broadcasting options for large audiences (e.g., channels with an unlimited number of members) and end-to-end encrypted messaging options (e.g., secret chats) designed to hide users’ activities. In our study sample, Islamic State supporters took extensive steps to ensure operational security. Seventy percent of the channels and groups we analyzed used Telegram’s “private” setting, which makes the streams unsearchable both on Telegram and on the public web and requires members to join through a unique URL key. Users in the sample frequently distributed instructions on privacy-maximizing services, including encryption, virtual private networks (VPNs), secure browsers and mobile security applications. They also reminded followers that no online platform—including Telegram—was safe, and that all Islamic State supporters should refrain from posting personal information, communicating directly with other supporters, or posting operational details in channels or groups.

Simultaneously, however, Islamic State supporters’ need to proselytize sometimes undercut their operational security protocols. Twenty-nine percent of the sample consisted of public channels. Public channels and groups are necessary to reach a broader audience on Telegram: they are the only type of content searchable within the platform, so the only way an uninitiated potential recruit could “stumble upon” terrorist content there is by accessing a public channel or group. However, public messaging streams on Telegram are also subject to removal, do not offer end-to-end encryption, and can be joined by anyone who finds them—including the Islamic State’s adversaries.

Moreover, even on these public channels, supporters broadcast to a far smaller audience than they could reach on mainstream social media platforms. For instance, Twitter reported 321 million monthly users in February 2019, far more than the 200 million Telegram last reported in 2018. And unlike material on most major social media platforms, even material posted in Telegram’s public channels is not searchable on the public web. To bridge this gap, Islamic State supporters in our sample posted more than 40,000 links to content on mainstream social media and file-sharing sites. In doing so, however, they risked inadvertently releasing details—like their IP addresses or account information—that could compromise their operational security and leave them vulnerable to detection by law enforcement.

As Islamic State supporters struggle to balance operational security and public outreach, those wishing to combat the organization online must weigh the potential benefits and costs of de-platforming it from specific applications. De-platforming can reduce the organization’s ability to inject its message into public discourse and to recruit, but it can also push supporters onto obscure and opaque platforms where it is substantially more difficult for law enforcement to monitor their activities. A marginalization strategy is therefore preferable to the unachievable goal of removing all terrorist content from the internet. In this strategy, a “swathe of empowered actors” attempts to cordon off extremists on platforms where it is both “1) difficult for extremist ideas to reach the public, and 2) possible for law enforcement to detect, monitor, and investigate extremist activity on the platform.”

Identifying which platforms meet the criteria of the marginalization strategy is the first step; less technologically sophisticated platforms that reach only a narrow portion of the general public are excellent candidates. In effect, the goal of marginalization is to force extremists into the online extremist’s dilemma: the choice between broad-based messaging and internal security. Those tasked with online counterterrorism must ask whether the benefits extremists gain from a particular online platform outweigh the potential negative effects of de-platforming them from it. Counterterrorism policies built on the principles of marginalization can complement takedown efforts and keep extremist narratives on the periphery by denying them virality, reach and impact, and by compelling supporters of terrorist groups to confront the online extremist’s dilemma.


Bennett Clifford is a senior research fellow at the Program on Extremism at George Washington University. He studies violent extremist movements and organizations in the United States, as well as in the Caucasus, Central Asia, and the Balkans.
Helen Christy Powell is a Presidential Fellow at the George Washington University’s Program on Extremism. She studies Islamic State social media and U.S. counterterrorism policy.
