Deepfake porn’s hidden victims

Nonconsensual sexual deepfakes harm not only the people whose faces are inserted into explicit content, but also the adult performers whose bodies and likenesses are repurposed without consent. As generative artificial intelligence (AI) tools spread, performers face growing psychological, legal, and financial risks with little protection.

Jennifer discovered the problem in 2023 after running a new professional headshot through facial recognition software. The search surfaced some of the porn videos she had made more than 10 years before, but it also revealed one of her old videos with another person’s face placed on her body. She recognized the set from a shoot around 2013 and realized her body had been used in a deepfake. The result still carried enough of her physical features that the software identified her, leaving her with the disturbing sense that her body had been turned into a vehicle for someone else’s fabricated image.
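Facial recognition search of the kind Jennifer used typically works by comparing embedding vectors produced by a model: if a query image's vector is close enough to a candidate's, the system reports a match, which is how a face-swapped video can still surface when other embedded cues remain similar. A minimal sketch of that comparison, with mock vectors and a made-up threshold standing in for any real model's output:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def is_probable_match(query_embedding, candidate_embedding, threshold=0.6):
    """Flag a candidate as a probable match; the threshold is illustrative."""
    return cosine_similarity(query_embedding, candidate_embedding) >= threshold

# Mock embeddings standing in for real model output.
query = [0.9, 0.1, 0.4]
candidate = [0.85, 0.15, 0.38]
print(is_probable_match(query, candidate))  # similar vectors -> True
```

Real systems embed faces into much higher-dimensional vectors and tune the threshold empirically; the principle of nearest-neighbor comparison is the same.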

Debates around nonconsensual intimate imagery usually focus on the person whose face appears in the fake content, often celebrities or other public figures. Adult performers whose bodies are used have received far less attention, even though experts and lawyers say this has been happening for years. Deepfakes took off publicly in November 2017, when videos appeared online showing celebrities’ faces pasted onto porn actors’ bodies, but attorneys in the adult industry say the practice long predates that moment. Researchers describe the fallout as “embodied harms,” reflecting how digital violations can produce real psychological effects, including anxiety, self-censorship, body dysmorphia, and a profound loss of control over one’s image and labor.

The threat has expanded as generative AI tools and nudify apps have become easier to use. Some fakes still rely on real performers’ bodies, sometimes edited just enough to avoid clear identification. Others use fully AI-generated nude bodies, likely trained on massive quantities of online porn. One expert says these systems almost certainly draw on more than 10,000 terabytes of online porn, while another says it is a reasonable assumption that adult material is used in training even though the process remains a black box. For performers, that means their past work may help create synthetic sexual content that competes with them, imitates acts they never agreed to perform, or fuels scams targeting fans.

Enforcement remains uneven and often ineffective. Takedown Piracy says it has digitally fingerprinted more than half a billion videos and has had 130 million copyrighted videos removed from Google alone. Even so, lawyers say it is increasingly difficult to identify who created or distributed a deepfake, especially when pirate sites operate anonymously or outside the US. Existing legal tools such as copyright, invasion-of-privacy, and emotional-distress claims can help in some cases, but they often depend on proving that a body is clearly identifiable. The Take It Down Act requires websites to remove nonconsensual intimate imagery within 48 hours, yet critics warn it could also be used to target lawful, consensual porn and further undermine performers’ livelihoods.
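Digital fingerprinting of the kind Takedown Piracy describes is commonly built on perceptual hashes, which stay nearly identical when a video is re-encoded or lightly edited, so copies can be matched by comparing hashes rather than raw files. A minimal sketch using a difference hash over tiny mock grayscale frames (the frame data, sizes, and threshold are illustrative, not the organization's actual method):

```python
def dhash_bits(gray, hash_w=8):
    """Difference hash: one bit per pixel, set when a pixel is
    brighter than its right-hand neighbor."""
    bits = []
    for row in gray:
        for x in range(hash_w):
            bits.append(1 if row[x] > row[x + 1] else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return sum(x != y for x, y in zip(a, b))

# Two nearly identical 4x9 mock "frames" (width hash_w + 1).
frame_a = [[(x * 7 + y * 3) % 256 for x in range(9)] for y in range(4)]
frame_b = [row[:] for row in frame_a]
frame_b[0][0] = 10  # tiny re-encoding artifact in one pixel

h_a = dhash_bits(frame_a)
h_b = dhash_bits(frame_b)
print(hamming(h_a, h_b) <= 4)  # small distance -> likely the same video
```

Production systems hash many downscaled frames per video and search the resulting fingerprints at scale; the small-Hamming-distance test above is the core matching idea.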

Some adult creators have tried to protect themselves by signing deals with platforms that host official AI replicas, hoping contracts will establish ownership over their synthetic likenesses. But even that offers limited security when companies fold or when monitoring infringement across the internet becomes unmanageable. Jennifer says the risks surrounding recorded sexual content now feel too unpredictable. She no longer sees the issue as a narrow question of fake images online, but as a widening system that repurposes performers’ bodies, labor, and identities while leaving them exposed to trauma, lost income, and vanishing control over how they are seen.

Impact Score: 58

Deepfake porn and chatbot privacy breaches

Nonconsensual deepfake pornography is harming not only people whose faces are inserted into explicit media, but also adult creators whose bodies and likenesses are reused without permission. Generative AI chatbots are also exposing private phone numbers, making personal information easier to retrieve and harder to control.

EU AI Act raises layered compliance demands for finance

Banks, insurers, and financial intermediaries face a more complex compliance environment as the EU AI Act overlays existing financial regulation and the GDPR. Proposed changes in the Digital Omnibus Package may delay some obligations, but the core challenge remains managing overlapping rules, roles, and regulators.

Europe and US discuss biometric data-sharing framework

European Union and US officials are negotiating a border security arrangement that could enable continuous biometric data exchanges on EU citizens. The UK says the US has also requested access to fingerprint records as part of Visa Waiver Program discussions.

Apple plans Intel 18A-P for M7 and 14A for A21

Apple is expected to use Intel’s 18A-P process for M7 chips in MacBook models and Intel’s 14A process for A21 chips in iPhones. The shift points to a broader supplier strategy as Apple moves beyond TSMC for parts of its future silicon roadmap.
