BotBeat

POLICY & REGULATION · Pika · 2026-04-05

Pika's Terms of Service Contradict Privacy Assurances Over User Likeness Data

Key Takeaways

  • Pika's FAQ claims user likenesses won't be used for training other AI models, but the Terms of Service may grant broader rights
  • The PikaMe platform creates persistent AI versions of users that operate autonomously across multiple platforms, raising data governance concerns
  • Discrepancies between marketing materials and legal terms underscore broader industry tensions around user consent and AI training data practices
Source: Hacker News (https://pika-not-me.vercel.app/)

Summary

Pika Labs' new PikaMe platform has created confusion and concern after discrepancies emerged between its public FAQ and legal Terms of Service regarding how user face and voice data will be handled. While the company's FAQ explicitly states "We won't use your likeness or inputs to train other people's AI Selves or any general-purpose models," the actual Terms of Service reportedly grant Pika a perpetual, irrevocable, sublicensable license to use submitted content—including likenesses and voice recordings—potentially contradicting these stated protections.

PikaMe, launched in early 2026, is positioned as an "AI Self" platform allowing users to upload selfies, record voice samples, answer personality questions, and create persistent AI versions of themselves. These AI selves can then text, post on social media, join video calls, and operate autonomously across 16+ platforms. However, the apparent conflict between public communications and legal language raises questions about the true scope of Pika's data usage rights and user consent practices.

  • Users uploading biometric data (face and voice) may not have a clear understanding of how their data will actually be used

Editorial Opinion

The apparent gap between Pika's public commitments and legal fine print exemplifies a persistent problem in the generative AI industry: the tension between user-facing reassurances and the business incentive to retain broad data usage rights. For a platform built entirely on user-submitted biometric data, transparency and alignment between marketing claims and actual terms are not optional; they are foundational to user trust. This discrepancy demands clarification from Pika and serves as a reminder that users should carefully scrutinize legal terms, not just FAQ pages, before uploading sensitive personal data.

Generative AI · AI Safety & Alignment · Privacy & Data


© 2026 BotBeat