- Microsoft drew backlash after ‘Co-authored-by: Copilot’ began appearing widely in commits made from VS Code
- The company has reversed the change as of version 1.119
- Developers remain unhappy that the ‘bug’ made it to production
Microsoft has reverted a controversial change in VS Code that automatically attributed GitHub commits, in part, to Copilot even when the AI tool was not used.
Developers had previously taken to forums, including Reddit, to complain that ‘Co-authored-by: Copilot’ was being added to their commits even though they had not used the assistant and, in some cases, had disabled Copilot’s chat features entirely.
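For context, the tag in question is a standard Git commit-message trailer: Git and GitHub treat a `Co-authored-by:` line at the end of a commit message as shared attribution. The sketch below shows how such a trailer ends up in a commit; the repository, commit message, and email address are illustrative, not the exact values VS Code or GitHub use.

```shell
# Create a throwaway repo to demonstrate the trailer (all values illustrative).
tmp=$(mktemp -d) && cd "$tmp" && git init -q
git config user.name "Dev" && git config user.email "dev@example.com"
echo "demo" > file.txt && git add file.txt
# The Co-authored-by trailer goes in a separate paragraph of the message body:
git commit -q -m "Fix parser edge case" \
  -m "Co-authored-by: Copilot <copilot@example.com>"
git log -1 --format=%B  # prints the message, trailer included
```

Because the trailer is just text appended to the message, an editor integration can add it without the developer noticing until the commit is pushed, which is why the silent insertion drew so much attention.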
It is unclear whether this behavior was intentional; in any case, Microsoft has acknowledged the bug and rectified the issue in a new update.
This explains the ‘Co-authored-by: Copilot’ issue on GitHub
A March 2026 change within VS Code reportedly added the Copilot authorship tag regardless of whether Copilot was used, although one VS Code reviewer has since apologized: “There was no ill intent on VS Code’s part [as an] evil corporation, but rather a desire to support functionality that some customers expect from VS Code [with regard to] code generated by AI.”
Following a more recent change in version 1.119, AI attribution will only be added if users explicitly choose it.
“Obviously, it should not be enabled when enableAIFeatures is enabled and should not report changes that have not been made by the AI,” wrote Dmitriy Vasyura. “I will work to fix them and in the meantime I will disable the default in update 1.119.”
The company has also reduced intrusive Copilot integrations following broader backlash from developers, as coders are less likely to trust a tool that automatically changes metadata without their explicit consent.
Even though the Microsoft worker confirmed that the change to Copilot’s author tag has been reverted, users still expressed distrust in the company for allowing the feature to reach production in the first place. Many criticized the company for describing the change as a mistake, arguing that it was intentional from the beginning.