Over the past few years, there has been a massive increase in deepfake pornography, a new form of sexual harassment where the perpetrator is anonymous.

Deepfakes are problematic in general, as they offer dangerous opportunities to spread disinformation. Many deepfake videos have been created that manipulate politicians or public figures into appearing to say things they never said. Deepfake porn is an especially invasive and evil form of disinformation. Advanced software, like the Windows program “FakeApp,” allows any user to superimpose photos of a person’s face onto a pornographic video. As of 2021, it was estimated that between 90 and 95 percent of all deepfake videos are nonconsensual porn, and 90 percent feature women. While deepfake videos can vary vastly in quality, the psychological effect on their victims is the same. In addition to stealing and manipulating people’s identities, deepfake porn can affect employment, harm interpersonal and romantic relationships, cause body dysmorphia in victims, and subject people to stalking or harassment.

The porn industry has already immensely profited from the exploitation and abuse of women. For example, in June of 2021, 34 women sued PornHub for having profited from “nonconsensual content involving rape, child sexual abuse, and human trafficking.” Deepfakes continue to be uploaded to porn sites each month. Deepfakes inherently ignore consent, allowing people to create and distribute manipulated and compromising images of others anonymously. Some defenders of deepfake imaging argue that it is part of the “natural progression of technology” and that people can do what they’d like with pictures of others. However, if made realistic enough, deepfakes threaten to influence others’ perceptions of their subjects. Regardless of the quality of the deepfake, they strip individuals of the agency to determine how they are presented. Additionally, deepfakes jeopardize movements to empower sex workers and protect their sexual agency and consent.

Another harmful dimension of deepfakes is their ability to sexualize anyone. Deepfake technology has been used as a tool to sexually abuse children by imposing their faces on explicit imagery and video, a disgusting and illegal act that endangers children.

Fortunately, there have been efforts to create software to detect deepfakes. Unfortunately, the very method a program uses to detect a deepfake can be used to “train” new deepfake creation algorithms. Recently, researchers from Watson College here at Binghamton University worked with Intel Corp. to develop an online tool called “FakeCatcher,” which can detect whether a video has been deepfaked by analyzing subtle changes in skin color caused by the blood flow of a heartbeat — the technology is 90 percent accurate. Hopefully, these technologies can be used by pornography websites to ensure that they are not posting deepfake videos.

So, what protections do people have against being taken advantage of by deepfake technology? In the United States, the Stop Hacks and Improve Electronic Data Security Act (“SHIELD Act”) amendment to the Violence Against Women Reauthorization Act of 2021 provides privacy protections against deepfake pornography and its pernicious effects on employment and educational opportunity, as well as the harassment it often causes. Additionally, New York State Senate Bill S5959D provides a private right of action for victims of “unlawful dissemination or publication of a sexually explicit depiction of an individual.” Forty-six U.S. states also ban revenge porn, though only Virginia and California’s bans specifically enumerate deepfaked media. While some may argue that regulating deepfakes violates First Amendment free speech rights, the Supreme Court has held that the free speech guarantee does not extend to “the lewd and obscene, the profane, the libelous and insulting words or ‘fighting’ words.” Other legal claims that victims of deepfake pornography may be able to pursue are defamation, violation of privacy, appropriation of personality or copyright infringement. Additionally, child pornography is criminalized on a federal level. However, the difficulty of bringing a deepfake case to court is that the perpetrator is usually anonymous.

While awareness and legal protections surrounding deepfake pornography are increasing, it is important that these efforts continue to target those perpetrating these heinous crimes. Additionally, we should begin to break down the broad stigma surrounding pornography, so that support can be given to those who are victims of deepfake technology. Technology is only going to advance, so it is up to the law to advance with it.