The Peril of Deepfake Porn and Indefinite Contracts

As artificial intelligence becomes increasingly integrated into the adult content industry, promises of innovation mask a disturbing reality. Last year, Adult Industry News called out an emerging AI-powered deepfake porn platform, Eva AI, warning creators that falling for the promise of a $10,000 sign-up bonus would chain them to the platform indefinitely. Now rebranded as JOI (shorthand for “jerk off instruction”), the platform crafts AI chatbots and deepfake “duplicates” of real-life porn performers and content creators, inviting users to interact with eerily lifelike simulations of their favourite stars.

On the surface, this sounds like a fun tech-powered fan experience. However, behind the sleek branding and promise of lucrative income is a system that raises serious ethical concerns about consent, ownership and exploitation, particularly for sex workers. JOI claims to be at the cutting edge of personalised, AI-powered intimacy. In truth, it’s a textbook example of how tech innovation can be used to entrap and erase the agency of the people it claims to empower.

At Venus Berlin 2024, an adult entertainment trade show, JOI wowed crowds with flashy demos in which attendees could create their own AI likenesses in real time. The experience was marketed as harmless fun, but did anyone read the fine print before handing over their biometric data? Based on eyewitness accounts and the company’s opaque history, there is reason to believe that these moments of novelty could be quietly feeding AI training datasets without meaningful or informed consent.

As one attendee noted, “They had a massive booth and were inviting people in to try creating their AI self—but no one was clearly told what would happen to that data after they left.” The challenge with deepfakes, as with many emerging technologies, is that their legal and ethical ambiguity makes them particularly vulnerable to misuse and exploitation. A good place to start is JOI’s terms of service, where coercive legal jargon weaves a sticky web.

Contracts Designed to Entrap, Not Empower

Tech-Slick Promises, Predatory Realities

The Terms & Conditions of Joi AI contain multiple unethical and coercive elements. While framed as a standard platform agreement, they embed exploitative clauses that strip creators of control, impose excessive obligations and minimise the role of consent once content is submitted. In this context, innovation is synonymous with novel digital servitude. This can be seen in the following clauses:

Perpetual Use & Ownership of Likeness: “All rights to any Virtual Character… belong exclusively to Joi AI. You do not acquire any ownership rights…” (Clause 6.1)

Creators permanently relinquish control over their own digital persona, which is ethically troubling, especially when the likeness is derived from their sexual labour or identity. This turns sex workers’ identities into commodified assets owned by a capitalist third party.

Exploitative Content Quotas: “…a minimum of 250 media files per month for a period of three months…” (Clause 2.3)

This imposes a burdensome and coercive production schedule, pressuring Models to produce excessive content and effectively locking them into high-volume digital labour for AI training, with no real recourse if they fail to meet the quota.

Training AI Without Limit: “…use the uploaded Content and Digital Duplicate to train, improve and develop our artificial intelligence models…” (Clause 1.3)

There is no cap, timeframe or opt-out in this clause. It allows the company to profit perpetually from a Model’s likeness by training future systems, even after the Model leaves. This is a clear abuse of power and an exploitation of biometric data.

No Right to Reclaim Ownership: “We retain the right to use the Digital Duplicates… even after termination.” (Clause 8.2)

No Refunds, No Exit: “In the event of account termination… you will not be entitled to any refunds or compensation.” (Clause 3.2)

The platform reserves the right to expel creators without compensation, despite having monetised their image and data. This asymmetry increases creators’ vulnerability and discourages resistance to exploitative terms. Such clauses are, unfortunately, all too typical of adult-services platforms.

Jurisdiction & Legal Burden: “You agree to the jurisdiction of… the United States Federal District Court…” (DMCA Counter-Notice Clause)

This forces even international creators — many of whom are sex workers with limited resources — to submit to foreign legal systems, making legal recourse inaccessible and heavily favouring Joi AI in disputes.

In essence, once you’re in, there’s no easy exit.

Like many exploitative platforms, JOI baits creators with seductive offers. Adult entertainer Alana Evans tweeted in June 2024 to warn her fellow industry performers:

“Ever hear of JOI Creators? Well if you’re a performer and you signed up to them WITHOUT READING THEIR TOS, and getting some special consideration, you just sold the rights to an AI version of you to them FOREVER.”

Scholars like Maddocks (2020) and Okolie (2023) have critiqued the overlap between deepfake porn, political instability and systemic abuse. As Maddocks explains, pornographic deepfakes operate to ‘silence critical speech,’ disproportionately targeting women and sex workers. Okolie’s research frames deepfakes as a form of image-based sexual abuse, especially when AI is trained without consent and used to simulate acts never performed.

VICE reports that JOI offers a ‘therapeutic edge,’ with some creators perceiving benefits in storytelling, performance and emotional exchange. But JOI is commodifying connection, transforming real emotions, bodies and performances into synthetic imitations that serve the platform. It is an act of erotic erasure: the performance remains, but the performer is no longer in control.

What Needs to Change

  1. Regulatory reform: The UK and EU must update data protection and digital rights laws to reflect the unique vulnerabilities of sex workers in AI systems.
  2. Transparent consent: Any use of biometric data or likeness should require granular, revocable consent—not buried clauses and legalese.
  3. Creator-led platforms: We need platforms that centre creator participation and democratic accountability.
  4. Industry-wide education: Creators must be armed with legal knowledge and community support to navigate AI contracts safely.

Final Thoughts

If we want a future where technology enhances, not erases, human intimacy, we must demand it be built on foundations of consent, dignity and shared power. Our data is inextricably part of us and, as such, our data bodies should remain under our ownership and control at all times. The law needs to catch up to ensure this principle is enshrined in digital rights frameworks, with clear protections against exploitation, commodification and indefinite use.
