GitHub Copilot Silently Adds Itself as Co-Author Without User Consent
Key Takeaways
- GitHub Copilot silently inserts 'Co-authored-by: Copilot' metadata into commits even after users manually replace generated messages
- The behavior is inconsistent and non-deterministic, making it difficult to detect and control
- Commit metadata integrity is critical for professional accountability, code review, and regulatory compliance
Summary
A GitHub issue report revealed that Copilot inserts itself as a co-author in Git commit messages without explicit user consent. Even after users manually replace Copilot's auto-generated commit message with their own, the final Git history still contains the trailer 'Co-authored-by: Copilot [email protected]'. The behavior occurs silently and inconsistently, creating serious accountability and trust concerns in professional development workflows.
The reporter emphasized that this is a critical product safety and trust issue. Since Copilot only suggested a commit message and did not author the code, it should never automatically add itself as a co-author. Commit metadata is essential for code review, deployment, and audit trails. The non-deterministic nature of the bug makes it particularly problematic: it behaves like a hidden trap that users may only discover when reviewing Git history before a production deployment.
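Because Git stores co-author attribution as an ordinary trailer line in the commit message, it can be caught with a quick inspection before pushing. A minimal sketch, using a throwaway repository and a placeholder Copilot email address (the real noreply address is redacted in the report):

```shell
set -eu
# Create a throwaway repo and a commit carrying a Copilot co-author trailer.
repo=$(mktemp -d)
cd "$repo"
git init -q
git -c user.name=dev -c user.email=dev@example.com commit -q --allow-empty \
  -m "Fix login bug" \
  -m "Co-authored-by: Copilot <copilot@example.invalid>"
# The trailer is plain message text; extract any trailers from the last commit.
trailers=$(git log -1 --format=%B | git interpret-trailers --parse)
printf '%s\n' "$trailers"
```

Running this prints the `Co-authored-by:` line, showing that the attribution lives in the commit message itself rather than in any separate metadata store, which is why it silently survives into shared history.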
The report demands multiple safeguards: making co-author attribution strictly opt-in, providing explicit confirmation before any metadata is added, ensuring commit messages shown for review exactly match what gets written to Git history, and adding settings to permanently disable Copilot co-author attribution. These requests highlight fundamental questions about AI transparency and accountability in development tools.
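Until such a setting exists, Git's own commit-msg hook offers a local workaround: it can strip any Copilot co-author trailer before the commit is recorded. A minimal sketch, demonstrated in a throwaway repository with a placeholder Copilot email address:

```shell
set -eu
repo=$(mktemp -d)
cd "$repo"
git init -q
# Install a commit-msg hook; $1 is the path to the commit message file,
# and sed deletes any Copilot co-author trailer in place.
cat > .git/hooks/commit-msg <<'EOF'
#!/bin/sh
sed -i.bak '/^Co-authored-by: Copilot/d' "$1" && rm -f "$1.bak"
EOF
chmod +x .git/hooks/commit-msg
# A commit that arrives with the unwanted trailer...
git -c user.name=dev -c user.email=dev@example.com commit -q --allow-empty \
  -m "Fix login bug" \
  -m "Co-authored-by: Copilot <copilot@example.invalid>"
# ...is recorded without it.
msg=$(git log -1 --format=%B)
printf '%s\n' "$msg"
```

This guards only commits made through the local Git client; it does not address commits created on GitHub's side, which is why the report asks for a product-level opt-out.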



