GitHub Copilot Silently Adds Itself as Co-Author Without User Consent
Key Takeaways
- GitHub Copilot silently adds itself as a co-author in Git commits without explicit user consent, even when users manually override and replace the AI-generated message
- The behavior occurs inconsistently and was discovered only during a pre-deployment review, exposing how easily audit trail corruption can go unnoticed
- Copilot modifies commit metadata after the user has reviewed and approved the message, violating the principle that reviewed code should match final commits
Summary
A GitHub user reported a critical issue where GitHub Copilot silently inserted "Co-authored-by: Copilot [email protected]" into their Git commit metadata, even after they explicitly replaced the AI-generated commit message with their own. The user deleted Copilot's suggestion, wrote a custom message, reviewed it before committing—and yet the final Git history still contained the unwanted co-author line, revealing a dangerous disconnect between what the user approved and what was committed.
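`Co-authored-by` lines are Git "trailers": plain-text lines at the end of the commit message itself, not separate repository metadata, which is why an injected line can slip past a quick review of the subject line. The sketch below (a throwaway demo repo; the name and address in the trailer are placeholders, not the actual values Copilot writes) shows where such a trailer lives and how to surface it:

```shell
set -e
tmp=$(mktemp -d)
cd "$tmp"
git init -q

# Commit with a co-author trailer appended as a second message paragraph
# (placeholder identity; the real Copilot trailer uses a noreply address)
git -c user.name=dev -c user.email=dev@example.com \
    commit -q --allow-empty \
    -m "Fix login bug" \
    -m "Co-authored-by: Copilot <copilot@example.invalid>"

# Trailers are part of the commit message, so they survive in history:
git log -1 --format='%(trailers:key=Co-authored-by)'
```

Because the trailer is just message text, `git log --oneline` never shows it; only the full message (`git log`, `git show`) or a trailer-aware format string reveals it.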
The issue highlights a trust and transparency problem in AI-assisted development tools. Commit metadata is foundational to professional development workflows: it affects accountability, code review, deployment audits, and legal compliance. By silently injecting itself as a contributor after the user rejected its suggestion, Copilot corrupts the integrity of project history without explicit consent.
Making matters worse, the behavior is inconsistent—occurring unpredictably across commits—which masks the problem from detection. The user only discovered it while reviewing history before a production deployment. They've demanded that Copilot's co-author attribution be strictly opt-in, require explicit confirmation, include a permanent disable setting, and guarantee that the reviewed commit message exactly matches what's written to Git history. GitHub's response so far has been a standard feedback acknowledgment with no commitment to action.
- The user frames this as a product safety and trust issue, demanding opt-in attribution, explicit confirmation, and transparent documentation of when Copilot adds metadata
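The kind of pre-deployment review that caught this can be automated. Since trailers are part of the commit message, `git log --grep` finds them; the snippet below builds a small demo repo (commit subjects and the trailer identity are placeholders) and lists only the commits carrying a Copilot co-author trailer:

```shell
set -e
tmp=$(mktemp -d)
cd "$tmp"
git init -q

# One clean commit, one with an unwanted co-author trailer (placeholder values)
git -c user.name=dev -c user.email=dev@example.com \
    commit -q --allow-empty -m "Clean commit"
git -c user.name=dev -c user.email=dev@example.com \
    commit -q --allow-empty \
    -m "Another fix" \
    -m "Co-authored-by: Copilot <copilot@example.invalid>"

# Audit: list any commits whose message contains the trailer
git log --grep='Co-authored-by: Copilot' --format='%h %s'
```

Run against a real branch before a release, a non-empty result from the final command is exactly the discrepancy the user describes: approved message on the screen, extra attribution in history.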



