Clothoff’s Global Deepfake Porn Expansion Sparks Legal and Ethical Outrage

San Francisco, CA — Clothoff, one of the most notorious AI-powered “nudify” apps, is aggressively expanding its operations worldwide—despite ongoing lawsuits and mounting backlash over its role in generating non-consensual deepfake pornography.

Key Revelations from a Whistleblower

A former employee with “access to internal company information” told Der Spiegel that Clothoff’s operators:

  • Own at least 10 other nudify apps, collectively attracting millions of monthly views.
  • Have an annual budget of $3.5 million, largely spent on Telegram, Reddit, and 4chan ads targeting men aged 16 to 35.
  • Plan global marketing campaigns using fake nudes of celebrities (without consent) to lure users in Germany, the UK, France, and Spain.

Celebrities and influencers named in Clothoff’s plans told Der Spiegel they never consented and may take legal action.

Legal Battles and Failed Shutdown Attempts

  • San Francisco City Attorney David Chiu sued Clothoff in 2024, but the app remains operational.
  • Two rival sites (porngen.art and undresser.ai) shut down in settlements, but Clothoff’s operators—believed to be based in Eastern Europe—have evaded legal service.
  • A New Jersey high schooler is suing a classmate who used Clothoff to create and share a fake nude of her at age 14. She seeks $150,000 per shared image.

Clothoff’s Dangerous Reach

  • Used in schools: Boys are generating fake nudes of classmates, leading to suspensions and criminal charges.
  • Weak age filters: A user admitted bypassing Clothoff’s “underage” detection to create a fake nude of a young-looking singer.
  • New “video deepfake” feature: Recently marketed to over a million users.

Clothoff’s Denials vs. Reality

A spokesperson (“Elias”) claimed:

  • No celebrity exploitation (despite leaked marketing plans).
  • No underage use (despite user-confirmed loopholes).
  • No knowledge of team members (after Der Spiegel exposed a Russian-linked database).

Why This Matters

  • Deepfake porn is exploding: Victims—especially teen girls—have few legal protections.
  • Laws lag behind tech: The Take It Down Act helps remove AI nudes but may face free speech challenges.
  • Profit over ethics: Clothoff’s expansion shows how cheap, accessible AI tools enable mass harassment.

What’s Next?

  • More lawsuits expected as victims fight back.
  • Pressure on platforms (Telegram, Reddit) to block Clothoff ads.
  • Global regulation needed to hold anonymous AI porn operators accountable.

William Hart
