BotBeat

POLICY & REGULATION · AI ModelForge · 2026-05-01

Women Sue Operators of AI-Generated Deepfake Porn Scheme for Non-Consensual Image Misuse

Key Takeaways

  • Three men allegedly orchestrated a large-scale scheme using CreatorCore software to generate non-consensual sexual imagery from real women's photos scraped from Instagram, producing hundreds of thousands of fake images and reportedly earning more than $50,000 in a single month
  • The defendants also sold instructional courses teaching hundreds of other men how to create their own AI-generated fake influencers; the underlying platform maintained 8,000+ active subscribers who produced over 500,000 images and videos
  • The lawsuit raises urgent questions about consent, privacy protection, platform responsibility, and the legal frameworks needed to protect individuals from non-consensual deepfake pornography in the AI era
Source: Hacker News, via Wired: https://www.wired.com/story/ai-porn-lawsuit-arizona/

Summary

Three Phoenix men—Jackson Webb, Lucas Webb, and Beau Schultz—face a lawsuit filed in Arizona in January by three women alleging a coordinated scheme to create non-consensual sexually explicit AI-generated content. The defendants allegedly scraped photos from women's Instagram accounts and used AI software called CreatorCore to generate fake images and videos depicting the women in revealing or sexually explicit scenarios, then sold this content on the subscription platform Fanvue.

Further allegations claim the men profited substantially by selling instructional courses on Whop for $24.95 per month under the brand AI ModelForge, teaching other men how to replicate the scheme. The lawsuit alleges these courses provided detailed "Blueprints" on how to identify, target, and generate AI-generated content from unsuspecting women's social media photos, with specific guidance on selecting victims unlikely to pursue legal action. According to the complaint, the platforms generated significant revenue—reportedly exceeding $50,000 in a single month—and CreatorCore alone had over 8,000 subscribers producing more than 500,000 images and videos.

One plaintiff, identified as MG to protect her privacy, discovered the non-consensual content when a follower alerted her to Instagram Reels featuring her face and distinctive tattoos superimposed onto another body in revealing clothing. MG described the experience as a "reality check that I don't have any control over my own image." The lawsuit characterizes the scheme as deliberately predatory, alleging it specifically instructed perpetrators to target women least likely to defend themselves and provided step-by-step guidance on scraping images from social media.

  • This case exemplifies a growing phenomenon of AI influencer businesses that exploit generative AI to create fake social media personalities, often depending entirely on stolen likenesses and identities of real women without their knowledge or consent
Tags: Generative AI · Regulation & Policy · Ethics & Bias · Privacy & Data · Misinformation & Deepfakes

© 2026 BotBeat