Jennifer discovered the problem in 2023 after running a new professional headshot through facial recognition software. The search surfaced some of the porn videos she had made more than a decade earlier, but it also turned up one of her old videos with another person’s face superimposed on her body. She recognized the set from a shoot around 2013 and realized her body had been used in a deepfake. The result still carried enough of her physical features that the software identified her, leaving her with the disturbing sense that her body had been turned into a vehicle for someone else’s fabricated image.
Debates around nonconsensual intimate imagery usually focus on the person whose face appears in the fake content, often celebrities or other public figures. The adult performers whose bodies are used have received far less attention, even though experts and lawyers say their bodies have been appropriated this way for years. Deepfakes took off publicly in November 2017, when videos appeared online showing celebrities’ faces pasted onto porn actors’ bodies, but attorneys in the adult industry say the practice long predates that moment. Researchers describe the fallout as “embodied harms,” reflecting how digital violations can produce real psychological effects, including anxiety, self-censorship, body dysmorphia, and a profound loss of control over one’s image and labor.
The threat has expanded as generative artificial intelligence (AI) tools and nudify apps have become easier to use. Some fakes still rely on real performers’ bodies, sometimes edited just enough to avoid clear identification. Others use fully AI-generated nude bodies, likely informed by massive quantities of online porn. One expert says these systems almost certainly draw on more than 10,000 terabytes of online porn; another says it is a reasonable assumption that adult material is being used in training, even if the process remains a black box. For performers, that means their past work may help create synthetic sexual content that competes with them, imitates acts they never agreed to perform, or fuels scams targeting their fans.
Enforcement remains uneven and often ineffective. Takedown Piracy says it has digitally fingerprinted more than half a billion videos and has had 130 million copyrighted videos removed from Google alone. Even so, lawyers say it is increasingly difficult to identify who created or distributed a deepfake, especially when pirate sites operate anonymously or outside the US. Existing legal tools such as copyright, invasion-of-privacy, and emotional-distress claims can help in some cases, but they often depend on proving that a body is clearly identifiable. The Take It Down Act requires websites to remove nonconsensual intimate imagery within 48 hours of a request, yet critics warn it could also be used to target lawful, consensual porn and further undermine performers’ livelihoods.
Some adult creators have tried to protect themselves by signing deals with platforms that host official AI replicas, hoping contracts will establish ownership of their synthetic likenesses. But even that offers limited security when companies fold or when monitoring infringement across the internet becomes unmanageable. Jennifer says the risks surrounding recorded sexual content now feel too unpredictable. She no longer sees the issue as a narrow question of fake images online, but as a widening system that repurposes performers’ bodies, labor, and identities while leaving them exposed to trauma, lost income, and vanishing control over how they are seen.
