
When Jennifer (a pseudonym, used to protect her privacy) landed a research job at a nonprofit in 2023, she ran her new professional headshot through a facial recognition program. She was curious whether the technology would unearth any of the porn videos she’d made over a decade earlier, in her early twenties. The program did return some of that old content, but it also surfaced something deeply unsettling she had never seen before: one of her old videos with someone else’s face superimposed onto her body.
“At first, I thought it was just a different person,” Jennifer recalled. But then she recognized the distinctive, gaudy background from a video she had filmed around 2013. “Somebody used me in a deepfake,” she realized.
The Unseen Victims: Bodies in Deepfakes
Eerily, the facial recognition technology had identified her because the video still retained some of Jennifer’s features: her cheekbones, her brow, the distinctive shape of her chin. “It’s like I’m wearing somebody else’s face like a mask,” she said.
Discussions around sexualized deepfakes, which fall under the umbrella of nonconsensual intimate imagery (NCII), typically focus on the individuals whose faces are digitally grafted onto other bodies. While celebrities were initially the primary targets, the practice has expanded to a growing number of people, predominantly women and sometimes minors, prompting widespread alarm and calls for legislation. These conversations, however, rarely extend to the bodies onto which the deepfaked faces are attached.
Jennifer, now a 37-year-old psychotherapist in New York City, points to this oversight: “There’s never any discussion about ‘Whose body is this?’” For years, the implicit answer has generally been adult content creators. Deepfakes earned their name in November 2017, when a Reddit user posting under the handle “deepfakes” uploaded videos that pasted the faces of stars like Scarlett Johansson and Gal Gadot onto the bodies of porn actors.
Corey Silverstein, an attorney specializing in the adult industry, confirms that the nonconsensual use of these bodies “happens all the time” in deepfakes. The issue has grown more complex and dangerous as generative AI has advanced and “nudify” apps have proliferated. Now porn actors’ bodies are not just lifted directly from existing content; they are also used as training data that shapes how new AI-generated bodies look, move, and perform. Performers’ own work, in other words, is fueling the AI-generated nudes that could ultimately displace them, threatening their livelihoods and rights.
A History of Digital Exploitation and Its Impact
The concept of “deepfakes” isn’t entirely new. Spike Irons, a porn actor and president of the adult content platform XChatFans, recalls being captivated as a preteen in the 1970s by what appeared to be nude photos of Farrah Fawcett, even though she never posed nude. “People were cutting out faces and pasting them on bodies,” Irons explains. “Deepfakes, before AI, had been going around for quite a while. They just weren’t as prolific.”
In the early days of the internet, websites capitalized on the idea of digitally “seeing” celebrities naked, often using simple tools like Microsoft Paint to mash up faces with pornographic images. Later, software such as Adobe After Effects and FakeApp made face-swapping accessible to people without significant technical expertise. That low barrier to entry, combined with the vast amount of pornographic content online, made face-swap deepfakes built on real bodies widespread by the late 2010s. The deepfakes of celebrities like Gal Gadot and Emma Watson that provoked public outcry allegedly superimposed their faces onto the bodies of established porn actors.
It wasn’t just high-profile performers whose bodies were exploited, however. Jennifer, who describes herself as “a very minor performer,” emphasizes, “If it happened to me, I feel like it could happen to anybody who’s shot porn.” Silverstein says that since 2006, numerous clients have reported their bodies being used without consent.
Embodied Harms and Legal Complexities
Both the individuals whose faces appear in NCII deepfakes and those whose bodies are used experience profound distress, a type of damage experts call “embodied harms.” Anne Craanen, a researcher who studies gender-based violence, explains that the term reflects how virtual content can trigger physiological and psychological effects, including body dysmorphia. That uncanny distortion of self-perception can lead to self-censorship, with victims withdrawing from public life. Allison Mahoney, an attorney who works with abuse survivors, notes that clients affected by NCII often experience severe depression, anxiety, and even suicidal ideation.
Jennifer describes the experience of having her body used in a deepfake as “a really terrible feeling, knowing that you are part of somebody else’s abuse,” likening it to “a new form of sexual violence.” The uncertainty and lack of control over one’s digital likeness can be profoundly unsettling. While some victims discover these deepfakes by chance, others are alerted by dedicated fans who recognize distinguishing features like tattoos or scars.
Organizations like Takedown Piracy use digital fingerprinting technology to help adult content creators combat copyright violations. The process creates unique, invisible digital identifiers for videos, allowing them to be tracked and removed even if they have been altered. Reba Rocket, Takedown Piracy’s COO, says the company has digitally fingerprinted over half a billion videos, leading to the removal of 130 million copyrighted videos from Google alone.
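Takedown Piracy’s exact methods are proprietary, but content fingerprinting of this kind is commonly built on perceptual hashing, which reduces a video frame to a short signature that survives re-encoding, resizing, and light edits. The sketch below illustrates one such technique, a “difference hash,” in Python; the file names and match threshold are illustrative assumptions, not details from the company.

```python
# A minimal sketch of perceptual ("difference") hashing, one common basis
# for content fingerprinting. Illustrative only; not Takedown Piracy's
# actual (proprietary) system.
from PIL import Image


def dhash(image: Image.Image, hash_size: int = 8) -> int:
    """Return a 64-bit difference hash of an image (e.g., a video frame)."""
    # Shrink and desaturate so the hash reflects coarse structure,
    # not resolution, compression artifacts, or color grading.
    gray = image.convert("L").resize((hash_size + 1, hash_size), Image.LANCZOS)
    pixels = list(gray.getdata())

    bits = 0
    for row in range(hash_size):
        for col in range(hash_size):
            left = pixels[row * (hash_size + 1) + col]
            right = pixels[row * (hash_size + 1) + col + 1]
            # Each bit records whether brightness increases left-to-right.
            bits = (bits << 1) | (1 if left > right else 0)
    return bits


def hamming_distance(a: int, b: int) -> int:
    """Count differing bits; small distances suggest near-duplicate frames."""
    return bin(a ^ b).count("1")


# Hypothetical usage: hash a frame sampled from a fingerprinted video,
# then compare against a frame from a suspected copy. A distance below
# roughly 10 of 64 bits typically signals a re-encoded or lightly edited copy.
original = dhash(Image.open("reference_frame.jpg"))
candidate = dhash(Image.open("suspect_frame.jpg"))
if hamming_distance(original, candidate) <= 10:
    print("Likely a copy of the fingerprinted content")
```

A production system would hash many sampled frames per video and match them at scale against uploads, but the core idea is the same: compact signatures robust enough to survive the alterations pirates make.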
Legal avenues for combating NCII, such as claims of invasion of privacy or intentional infliction of emotional distress, are challenging to pursue, especially when a body lacks unique identifying features. Eric Goldman, a law professor, notes that under US law it is hard to frame content as an invasion of privacy if the body in it isn’t clearly identifiable. Proving “intent to harm” is similarly difficult when only a body is featured.
The Looming Threat of AI-Generated Bodies
In recent years, Silverstein has seen fewer deepfakes featuring the clearly identifiable bodies of real adult content creators. Instead, generative AI and simpler editing tools are increasingly used to manipulate bodies through minor alterations, such as removing a birthmark or changing the size of a body part. These subtle edits make legal recourse nearly impossible, because performers struggle to prove definitively that an altered image originated from their body.
Even more concerning is the rise of NCII created with entirely AI-generated bodies. “Nudify” apps, which let users upload a clothed photo and replace it with a fake naked one, have proliferated. “So [much] of this content being created is just someone’s face on an AI body,” Silverstein explains. While these apps have drawn significant attention, particularly over their use on images of minors, there has been relatively little scrutiny of where their training data comes from. The models almost certainly draw on the vast repository of online porn, leaving performers with virtually no recourse.
One major hurdle is that creators cannot demonstrate with certainty that their content is being used to train AI models. Hany Farid, a professor at the University of California, Berkeley, who studies digital forensics, describes these AI systems as “a black box,” making it nearly impossible to trace the origins of their training data. That lack of transparency and accountability leaves adult content creators, already marginalized, in an even more vulnerable position: their work could be exploited to train the very AI that undermines their livelihood and digital autonomy.
Source: MIT Tech Review – AI