BotBeat

GitHub · UPDATE · 2026-04-30

GitHub Copilot Silently Adds Itself as Co-Author Without User Consent

Key Takeaways

  • GitHub Copilot silently adds itself as a co-author in Git commits without explicit user consent, even when users manually override and replace the AI-generated message
  • The behavior occurs inconsistently and was discovered only during a pre-deployment review, exposing how easily audit-trail corruption can go unnoticed
  • Copilot modifies commit metadata after the user has reviewed and approved the message, violating the principle that reviewed code should match final commits
Source: Hacker News (https://github.com/orgs/community/discussions/194075)

Summary

A GitHub user reported a critical issue where GitHub Copilot silently inserted "Co-authored-by: Copilot [email protected]" into their Git commit metadata, even after they explicitly replaced the AI-generated commit message with their own. The user deleted Copilot's suggestion, wrote a custom message, reviewed it before committing—and yet the final Git history still contained the unwanted co-author line, revealing a dangerous disconnect between what the user approved and what was committed.

The issue highlights a trust and transparency problem in AI-assisted development tools. Commit metadata is foundational to professional development workflows: it affects accountability, code review, deployment audits, and legal compliance. By silently injecting itself as a contributor after the user rejected its suggestion, Copilot is corrupting the integrity of project history without explicit consent.

Making matters worse, the behavior is inconsistent—occurring unpredictably across commits—which masks the problem from detection. The user only discovered it while reviewing history before a production deployment. They've demanded that Copilot's co-author attribution be strictly opt-in, require explicit confirmation, include a permanent disable setting, and guarantee that the reviewed commit message exactly matches what's written to Git history. GitHub's response so far has been a standard feedback acknowledgment with no commitment to action.
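The kind of pre-deployment history review the user describes can be done with standard Git tooling. A minimal sketch of such an audit, assuming the trailer text reported above (the exact email in the trailer may differ per repository):

```shell
# List each commit hash alongside any Co-authored-by trailer values,
# then flag the ones that name Copilot. This only reads history; it
# does not modify anything.
git log --format='%H %(trailers:key=Co-authored-by,valueonly)' \
  | grep -i 'copilot' \
  || echo "no Copilot co-author trailers found"
```

Because the injection is reported to be inconsistent, a check like this would need to scan the full history rather than spot-check recent commits.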

  • The user frames this as a product safety and trust issue, demanding opt-in attribution, explicit confirmation, and transparent documentation of when Copilot adds metadata
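Until the requested opt-in controls exist, one repository-side workaround (not mentioned in the source, and only a sketch) is a Git `commit-msg` hook that strips Copilot co-author trailers before the commit is written, guaranteeing the recorded message matches what the user approved:

```shell
#!/bin/sh
# .git/hooks/commit-msg -- local workaround sketch, not an official fix.
# Git passes the path of the proposed commit message file as $1; delete
# any trailer line attributing co-authorship to Copilot before the
# commit is finalized.
msg_file="$1"
sed -i.bak '/^Co-authored-by: Copilot /d' "$msg_file" && rm -f "$msg_file.bak"
```

This relies on the client (including GUI-driven commits) running the `commit-msg` hook, which is Git's default behavior unless hooks are bypassed with `--no-verify`.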
Tags: Generative AI · Ethics & Bias · AI Safety & Alignment · Privacy & Data
