Deepfake Porn and Deepnude
The technique known as Deepfake came to public consciousness, I think, around the 2016 election in the US. We began to see and hear videos and audio clips that faked the voice of a politician. Initially, they were a bit creaky and not very convincing. But they’ve quickly improved. For example, a BBC report showcases a video created at the University of Washington in 2017. In it, we see a “photorealistic” version of Obama talking about the advantages of the Affordable Care Act. But the footage and the speech are entirely fake. The term Deepfake itself is a combination of “deep learning” and “fake.” It is a computer technique that uses Artificial Intelligence to synthesize human voices and figures and overlay them onto existing footage. The AI studies thousands of examples in order to render a lifelike recreation. It can study thousands of hours of speeches. It can study tens of thousands of images and video clips. It then uses that data to render lifelike images and sounds. We have arrived at a moment in which faked videos are indiscernible from non-faked ones.
This technique obviously has enormous ramifications for political life. In an age when politicians regularly cry “fake news,” in an age when the voting public already has a deep distrust of the news — unless delivered by their own politically and ideologically compatible news network — what does the creation and dissemination of utterly realistic digitally created clips mean for political discourse? For the notion of “the truth”? One could easily imagine a video in which Donald Trump — quite convincingly — says, yes, I did grope those women. How might that affect politicians’ and the public’s perceptions of Trump? One could easily imagine a video in which Elizabeth Warren says she would immediately convert all undocumented residents into citizens on her first day in office. What might that do to potential voters? What might that do to the whole election process? The political possibilities are truly the stuff of science fiction dystopias — we really should have been paying more attention to SF!
While these consequences are real and really scary, I want to look at two other recent trends in deepfake: Deepfake porn and Deepnude. Both employ the very same AI techniques; both are designed to render lifelike nudes. Deepfake porn was, initially, focused on celebrities. The deepfake technology renders the likeness of a celebrity into a pornographic video. Et voilà. A porn clip with your favorite movie star. A quick internet search will bring you to dozens of sites dedicated to Deepfake porn. One fan favorite seems to be Emma Watson, the child star of the Harry Potter films and later a UN Women Goodwill Ambassador. Deepfake AI technology has studied all the footage of Emma Watson and rendered porn videos with her as the star.
What are the ramifications of this? Some have argued that the practice is creepy. Not much of a counterargument to make there. Pornographic videos of a beloved former child actor are creepy. Some who have been exploited in deepfake videos have argued that the practice is a violation of privacy, that it must run afoul of some privacy or pornography statute. In 1964, the Supreme Court infamously punted on defining obscenity in Jacobellis v. Ohio, with Justice Potter Stewart claiming that he could not define hard-core pornography, but “I know it when I see it.” Since then, courts have ruled on, and rendered illegal, several forms of pornography, including child pornography and bestiality. Both were ruled illegal because the participants cannot consent to their participation. It might seem, then, that deepfake porn would be covered by the consent rule: celebrities such as Ms. Watson have not consented to appear in these videos. However, the Supreme Court also carved out an exception to child pornography law for entirely digitally created imagery. Purveyors of porn can create an image or a video on a computer, and that video can feature one or more children, but it does not fall under the consent rule because consent cannot apply to a digitally created image; no one’s consent has been violated. So, Deepfake porn creators might argue that the person in the deepfake video is entirely digitally created, that it is not Emma Watson in the video but a digital representation that resembles Watson, and that consent is not necessary. Under current interpretations of the law, I suspect that Deepfake porn videos are wholly legal — which is not to say they are justifiable or ethical.
Deepnude is a similar technique. The Washington Post reported (28 June 2019) that the app’s developer, who goes by the handle “Alberto,” created it to render any photo of any woman as a nude image. An image of Watson, an image of Elizabeth Warren, an image of a high school classmate can all be rendered into a nude photo. Alberto justifies the app by arguing A) that he is not a pervert, and B) that he’s just a tech geek. He wanted to see what he could do with the deepfake technique. The Deepnude app has scanned thousands and thousands of images of nude women. Alberto says that he did not intend the gender bias — the simple reality is that far more images of nude women appear on the web than images of nude men. The AI has studied the shapes and sizes and appearance of all these nude women, and it uses that “deep learning” to render an image of a clothed woman as she would appear nude.
Why would someone use these techniques? Alberto says he’s just a tech geek, though that hardly seems a justification. Deep learning could be used for — literally — thousands of other projects. That he chose to denude women’s photos must be significant. The technique might also be used for political ends. The edited video of Nancy Pelosi, in which she appears to slur her words, had a purely political aim — to discredit Pelosi. And, as we saw above, the deepfake technique could be used to create a speech out of whole cloth. Another aim might be revenge porn, which is the dissemination of nude or suggestive images or videos in order to exact revenge on someone — an ex, a co-worker, a neighbor, etc. Strategically placed nude images or videos could irreparably damage someone’s reputation or employment, not to mention their self-esteem and self-image.
Indeed, it’s hard to imagine a use for Deepnude that is not nefarious.
Following the online backlash to Deepnude, Alberto pulled the app from his site, though copies of it still circulate on the web. But it’s too late. In both of these cases, Deepfake porn and Deepnude, the technologies have left the building, and it’s too late to close the door behind them.
I suspect it will take years for the laws and the courts to catch up to technologies like these. The courts are still struggling to adjust to cell phones and cell phone data and metadata. By the time the courts address deepfake technologies, something new will have emerged, further complicating the legal and ethical issues of property, privacy, and piracy. The prevalence of women in these videos is also telling. We continue to view women as objects of attraction and desire, and as objects to manipulate and control. Psychologists remind us that sexual assault, sexual violence, and rape are acts of power and control. Someone feels powerless and exerts power over someone else. I suggest that these two techniques operate in similar ways. Someone can control a celebrity or an ex or a colleague in ways they never could without deepfake technology. Perhaps such acts can be considered and treated in the same way, as criminal acts.
Ritch Calvin is an Associate Professor of Women’s, Gender, and Sexuality Studies at SUNY Stony Brook. He is the author of a book on feminist science fiction and editor of a collection of essays on Gilmore Girls.