BotBeat

POLICY & REGULATION · Anthropic · 2026-04-02

Anthropic's DMCA Takedown Accidentally Removes Thousands of Legitimate GitHub Forks

Key Takeaways

  • Anthropic's DMCA takedown inadvertently affected approximately 8,100 legitimate forks of its official public repository, though the company has since worked with GitHub to reverse most of the removals
  • Copies of the leaked Claude Code source remain widespread across multiple platforms, including GitHub and Codeberg, limiting the effectiveness of DMCA enforcement efforts
  • Developers have already created AI-generated reimplementations of the leaked code in different programming languages, which may be legally distinct from the original despite functional similarity
Source: Hacker News — https://arstechnica.com/ai/2026/04/anthropic-says-its-leak-focused-dmca-effort-unintentionally-hit-legit-github-forks/

Summary

Anthropic filed a DMCA takedown notice with GitHub targeting repositories containing its recently leaked Claude Code client source code. However, the takedown inadvertently resulted in the removal of approximately 8,100 legitimate forks of Anthropic's official public Claude Code repository, affecting developers who had not shared any leaked code. GitHub's automated network-wide processing applied the takedown more broadly than intended, sweeping up repositories that merely forked the official public codebase.

After receiving complaints from affected developers, Anthropic worked with GitHub to reverse the overzealous takedowns by Wednesday. The company requested that GitHub restrict the takedowns to only the 96 fork URLs specifically named in the original notice and reinstate all other disabled repositories. Anthropic attributed the error to "a communication mistake" rather than intentional overreach.

Despite the corrected approach, Anthropic faces significant challenges in containing the leaked source code. Multiple copies of the code remain readily available on GitHub and other platforms like Codeberg, which operates outside US DMCA jurisdiction. Additionally, developers have already created "clean room" reimplementations of the code using AI tools, converting the original TypeScript into other languages like Python and Rust, potentially circumventing copyright protections.

The situation is further complicated by Anthropic developers' own use of Claude Code to write parts of Claude Code itself, which may weaken copyright claims: the US Copyright Office generally does not extend protection to entirely AI-generated work.

Editorial Opinion

Anthropic's overreach with the DMCA takedown demonstrates the collateral damage that aggressive IP enforcement can cause, but the incident also highlights the futility of trying to contain source code once it has been widely distributed. The emergence of AI-generated clean-room reimplementations suggests that even a perfectly executed takedown campaign would struggle to prevent functional recreation of the code, raising the question of whether DMCA-based approaches are the right tool for protecting AI software in an era of generative coding tools.

Cybersecurity · Ethics & Bias · AI Safety & Alignment · Policy & Regulation
