Nonconsensual Pornography, Political Scandals and a Warning for 2020

Quinta Jurecic
Thursday, October 24, 2019, 1:28 PM

Amid the hubbub of L’Affaire Ukrainienne, you could be forgiven for overlooking another story that has emerged out of Congress over the past week. It’s a grubby, unpleasant story—so much so that it feels ugly to draw attention to it. But the times are ugly, after all, and the story is a concerning harbinger of what might be to come in the lead-up to 2020.

Rep. Katie Hill at the 2019 California Democratic Party State Convention. (Source: Flickr/Gage Skidmore, CC BY-SA 2.0)

Published by The Lawfare Institute in cooperation with Brookings

On Oct. 18, the right-wing publication RedState released what appears to be an explicit photograph of California Rep. Katie Hill, along with allegations of romantic relationships between Hill and two of her staffers. The publication also indicated that it had obtained other “intimate photographs,” though these remain unpublished. Days later, Hill released a statement denying one of the alleged relationships and pointing to the role of her estranged husband in the release of the photograph; the Capitol Police, she indicated, are investigating the photograph’s publication. On Oct. 23, Hill acknowledged the second relationship, which is depicted in the photograph. The House Ethics Committee has announced that it is investigating the allegations of Hill’s conduct.

It may be tempting to dismiss the incident as nasty but not particularly noteworthy. Negative stories about politicians, including from politically unfriendly publications, are nothing new. But as one more ingredient in the toxic stew of hyperpolarization and mis- and disinformation that make up American politics, the editorial decision to release an explicit photograph of an opposition politician without her consent is a worrying signal about where things are headed.

To be clear, none of this is to weigh in on the merits or lack thereof of the allegations against Hill. At a bare minimum, her admitted relationship with a campaign staffer speaks to extremely poor judgment, particularly in someone seeking public office. The underlying conduct, though, is separate from the decision to publish the photograph itself.

The photograph appears to be an example of nonconsensual pornography—that is, an intimate or explicit image published without the consent of the person photographed, sometimes called “revenge porn.” It’s not surprising that the Capitol Police are involved. Since 2014, the publication of nonconsensual pornography has been illegal in the District of Columbia, though the D.C. statute includes an exception for the publication of images “in the public interest.” (The same is true in Hill’s home state of California.) The phenomenon is closely related to sextortion and deepfakes, issues Lawfare has previously covered in depth. In the case of sextortion, perpetrators use the threat of distributing intimate images and video to blackmail victims for further material; in the case of deepfakes, the sexually explicit content posted online is not real but is engineered by machine learning. In all three cases, the internet and developing forms of technology enable the rapid and widespread distribution, or the threat of distribution, of intimate images without the consent of the person depicted.

Publication of an explicit image without consent might seem ephemeral, but it is a profound violation, as Danielle Citron and Mary Anne Franks have written. While anyone can be a victim of nonconsensual pornography, the practice is often used to harass and control women; Citron and Franks cite a study indicating that 90 percent of victims are female. The same study showed that more than 80 percent of victims “experience severe emotional distress and anxiety.” Multiple victims described being afraid to leave their houses. Some were fired after their images were circulated. These cases describe instances of nonconsensual pornography targeting private individuals, rather than public figures; but nonconsensual pornography sadly fits well into the larger picture of online harassment of public officials, particularly female public officials.

As part of that pattern of harassment, the publication of Hill’s photograph may not seem new. There is a long history in the United States of politically inclined writers and publications releasing information damaging to politicians of the opposing party affiliation, even sexually compromising information: In 1802, for example, James Thomson Callender reported in a series of articles on then-President Thomas Jefferson’s sexual relationship with Sally Hemings, an enslaved woman on his plantation. There is also a long history of blackmailing public figures. Alexander Hamilton famously released an account of his own affair in an effort to end a blackmail scheme by the husband of the woman involved.

Likewise, in more recent times, members of Congress have been harassed before over the internet—just this June, a former Democratic staffer was sentenced to four years in prison after posting the home addresses and phone numbers of multiple Republican senators on Wikipedia, a practice known as doxing. This isn’t the first time a member of Congress has been the apparent victim of nonconsensual pornography, either. In 2017, an anonymous account on Twitter published an explicit selfie taken by Texas Rep. Joe Barton. The Washington Post later reported on a recording of a phone call between Barton and a woman with whom he had a consensual affair, in which Barton expressed worry that the woman was “in a position to use [photos of Barton] in a way that would negatively affect my career.” The Capitol Police investigated the matter, though no charges resulted. And, of course, in 2011 Rep. Anthony Weiner accidentally made public an explicit photograph of himself; right-wing blogger Andrew Breitbart subsequently published more photos of Weiner, though none were explicit.

But as far as I am aware, the Katie Hill photograph is the first instance in which a politically aligned publication—or, indeed, any publication—has released nonconsensual pornography depicting a politician of the opposing party affiliation. The Barton photo was posted on Twitter, not as a result of an editorial decision. And though publications have long released damaging material on politicians, it is one thing to report on such allegations and another thing entirely to publish the underlying photograph.

This is an ugly line to have crossed. The United States has not historically had a culture in which political media outlets publish nude photographs of opposition politicians for sport. It’s also a disappointing irony that this is taking place in a period in which legislatures are increasingly recognizing the harm of nonconsensual pornography—as of October 2019, 46 states, as well as D.C. and Guam, have criminalized the activity.

This would be unfortunate in any circumstances, but it’s particularly noteworthy given the unpleasantness of the present moment, in which the initial promise of the internet has curdled into something ugly and dangerous. Rising concern about the dangers of sextortion, deepfakes and nonconsensual pornography goes hand in hand with an environment of near-constant worry over the usefulness of the internet as a tool to cause harm and distort truth for political gain. The common thread is power: who uses it, against whom and at what scale. Just months ago, Amazon CEO and Washington Post owner Jeff Bezos announced that the National Enquirer’s parent company had attempted to sextort him into releasing a statement disclaiming any concern about political bias in the Enquirer’s coverage. On the matter of deepfakes, a recent study indicates that 96 percent of fakes involve manipulating images to create nonconsensual pornography—but Bobby Chesney and Danielle Citron have also written about the harms to democracy and national security that could result from use of the technology to create fake news.

The photograph of Hill is not, by Hill’s own account, a deepfake. But it does represent a worrying overlap between the use of the internet to engineer invasions of sexual privacy and a hyperpartisan online environment that rewards the publication of shocking material about political opponents. There are a lot of other bad things that could dwell in that overlap. Writing in Lawfare earlier this year, Alex Stamos imagined a hypothetical election interference effort focused on hacking and releasing compromising photos of a leading presidential candidate’s family member. Hypothesizing further, it’s all too easy to imagine the emergence of a sexually explicit deepfake, passing itself off as real, of a promising political candidate—perhaps timed strategically to disrupt an upcoming election.

Without minimizing the Hill case, it’s possible to see it as a test run of sorts: Are American political antibodies strong enough to resist the temptation to circulate salacious and politically damaging material? That is, despite RedState’s decision to post the image of Hill without her consent, is the norm against publishing such photographs generally strong enough that a deepfake or calculated leak of explicit photos wouldn’t make its way into the press and thus the political conversation?

Of course, it’s impossible to know if the RedState story would have received more or less engagement in the absence of the photograph. But the ability of nonconsensual pornography to draw clicks, given the particular harm of such a photograph, is worth attention.

So far, the results are not encouraging. The mainstream press has behaved responsibly, reporting on the allegations only in the context of Hill’s denial and referencing the existence of the photograph without linking to it. Among right-leaning media, though, the story is different. Many outlets reported on the story shortly after RedState published it, basing their coverage solely on RedState’s article. Breitbart and the Daily Wire highlighted the existence of the photo, providing links for curious readers. The Washington Examiner’s article on the subject embeds the photo into the article itself, so the reader unavoidably sees the image while scrolling down—something even RedState, which linked the photograph for readers to click through, didn’t do. Fox, by contrast, reported on the matter only after Hill’s denial and gave the photo relatively little attention, though it did provide a link back to RedState.

On Twitter, prominent right-wing accounts with large followings have tweeted about the photo and made jokes about Hill’s sexual attractiveness. Twitter bans the publication of nonconsensual pornography, but the photograph still appears in tweets from users critical of Hill, including on the account of Chuck Woolery, a prominent commentator. Many users have voiced frustration that mainstream media has not more aggressively covered the story, while others are spamming Hill’s Twitter account with comments about the photo. “[W]ho knew naked pictures of attractive women were so bad for ratings and clicks,” groused a Republican elections blog, linking to the photo and complaining that more mainstream outlets were not covering the story.

The norm against publication of these kinds of images is not entirely gone, but it is not entirely healthy either. If Hill’s case is an example of how the press and the online political ecosystem would respond to a deepfake, so far it’s a very mixed bag.

In recent years, major social media platforms have been pushed to moderate misinformation and damaging materials more aggressively. The Hill case, though, raises the question of what happens when publications break the norms that platforms are trying to enforce. This is not an entirely new problem—Yochai Benkler, Robert Faris and Hal Roberts wrote in 2018 about the role of far-right “radicalized” media, rather than social media alone, in “creating the current crisis of disinformation and misinformation”—but this instance drives home how thorny the problem really is. Ironically, many accounts that have reposted the photo of Hill from RedState have been suspended for violating Twitter’s rules, leading to complaints of politically motivated censorship by the platform.

Discussions of harms enabled by new technology—like deepfakes—often focus on the technology itself as the matter of concern. But the harm such technology can cause is enabled in significant part by the erosion of guardrails in an information environment already hyperpolarized and flooded with misleading material. If American political and internet culture were more resilient, RedState’s decision to publish the Katie Hill photograph might be nothing more than an unpleasant footnote. Yet given the eagerness of some corners of the right-leaning press and internet to make use of the photograph, it would be a mistake to ignore the warning it represents—especially as the U.S. heads deeper into a contentious election cycle already beset by misinformation and dirty tricks.

The guardrails aren’t gone. But they are wobbly.

Update: Hours after this piece was published, the Daily Mail published an article including multiple additional explicit photos of Hill, embedded into the article; the Mail also published the name of the campaign staffer with whom Hill was in a relationship. The New York Post quickly picked up the story.


Quinta Jurecic is a fellow in Governance Studies at the Brookings Institution and a senior editor at Lawfare. She previously served as Lawfare's managing editor and as an editorial writer for the Washington Post.