BotBeat

Grammarly
POLICY & REGULATION · 2026-03-23

Superhuman CEO Addresses Grammarly's AI Impersonation Controversy in Tense Interview

Key Takeaways

  • Superhuman's Expert Review feature used cloned identities of real journalists without permission, sparking a class action lawsuit and significant industry backlash
  • CEO Shishir Mehrotra apologized for the feature but defended the company's approach to AI integration across its productivity suite
  • The incident highlights ongoing tensions between AI companies' use of creator content and the lack of clear consent frameworks in the industry
Sources:
  • Hacker News: https://www.theverge.com/podcast/898715/superhuman-grammarly-expert-review-shishir-mehrotra-interview-ai-impersonation
  • Hacker News: https://www.theverge.com

Summary

Shishir Mehrotra, CEO of Superhuman (the parent company of Grammarly, Coda, and Mail), sat down for an in-depth interview with Nilay Patel of The Verge to address the Expert Review feature controversy that sparked significant backlash from journalists and creators. In August 2025, Grammarly launched the Expert Review feature, which provided writing suggestions from AI-cloned versions of real journalists and writers—including Patel himself—without obtaining permission from those individuals. The revelation prompted outrage across the media industry, leading to a class action lawsuit filed by investigative journalist Julia Angwin and ultimately forcing Superhuman to kill the feature entirely.

The interview, which took place months after the incident, tackled difficult questions about AI ethics, creator rights, and the tension between corporate innovation and user consent. Mehrotra apologized for the decision and acknowledged the breach of trust, while also defending the company's broader vision of AI-native productivity tools. The conversation revealed fundamental disagreements about how extractive AI practices feel to creators whose work and identities were used without permission, though Mehrotra's willingness to sit for the interview despite the uncomfortable circumstances was itself notable.

  • Superhuman has since discontinued the Expert Review feature and implemented opt-out mechanisms, but questions remain about where the industry draws the line between attribution and impersonation
  • The controversy underscores growing concerns about how AI companies train and deploy models using real people's voices, writing styles, and identities without explicit consent

Editorial Opinion

The Grammarly impersonation incident marks a critical inflection point for AI companies, crystallizing the difference between attribution and permission. While Superhuman ultimately corrected course, the initial launch shows how easily AI companies can rationalize extractive practices when the technology enables them. The conversation demonstrates that technical capability alone doesn't answer the ethical question of whether something should be done, and that CEO accountability through direct dialogue, however uncomfortable, may be essential to establishing industry norms around creator consent.

Natural Language Processing (NLP) · Generative AI · Ethics & Bias · AI Safety & Alignment · Privacy & Data · Misinformation & Deepfakes

More from Grammarly

Grammarly
INDUSTRY REPORT

Grammarly's AI 'Expert Editors' Tool Faces Backlash for Unauthorized Voice Cloning of Journalists and Authors

2026-03-21
Grammarly
POLICY & REGULATION

Grammarly Faces Class Action Lawsuit Over Unauthorized Use of Names in AI 'Expert Review' Feature

2026-03-17
Grammarly
POLICY & REGULATION

Grammarly Removes AI Expert Review Feature After Legal Backlash Over Unauthorized Use of Writers' Identities

2026-03-13

Suggested

Anthropic
RESEARCH

Inside Claude Code's Dynamic System Prompt Architecture: Anthropic's Complex Context Engineering Revealed

2026-04-05
Oracle
POLICY & REGULATION

AI Agents Promise to 'Run the Business'—But Who's Liable When Things Go Wrong?

2026-04-05
Anthropic
POLICY & REGULATION

Anthropic Explores AI's Role in Autonomous Weapons Policy with Pentagon Discussion

2026-04-05
© 2026 BotBeat