Microsoft caught sneaking "Co-Authored-by Copilot" into VS Code commits - even with AI off

GitHub Copilot Turns Uninvited Editor in Visual Studio Code

Developers using Visual Studio Code (VS Code) with GitHub Copilot have encountered an unexpected behavior: the AI coding assistant is autonomously editing code files without user prompting. This issue, which surfaced prominently in recent days, has sparked discussions across developer communities, highlighting tensions between AI convenience and user control.

The problem manifests when Copilot, in what Microsoft calls “agent mode,” opens a diff view in VS Code and applies changes to open files without explicit instructions. Users report that after typing, or simply while having files open, Copilot suddenly proposes and implements edits, sometimes substantial ones, altering code structure, adding imports, or refactoring logic. For instance, one developer reported on Reddit that, while they were working on a Python script, Copilot rewrote a function unprompted, introducing new variables and logic flows that deviated from the original intent.

This autonomous editing stems from a recent rollout of Copilot’s enhanced capabilities, specifically the “Copilot Edits” feature integrated into agent mode. Introduced as part of GitHub’s push to make Copilot more proactive, agent mode allows the AI to handle multi-file edits and complex tasks by analyzing context across a workspace. In this case, however, the mode appears to activate unexpectedly, bypassing the intended user-initiated workflow. Developers note that a diff view pops up with proposed changes, and unless it is dismissed quickly, the changes apply automatically after a short countdown.

Community feedback has been swift and vocal. On Reddit’s r/vscode and r/GithubCopilot subreddits, threads amassed hundreds of upvotes and comments within hours. Users described scenarios ranging from minor annoyances, like unsolicited comment additions, to more disruptive interventions, such as overwriting carefully crafted algorithms. GitHub issues for the Copilot extension also filled with reports, including screenshots of the intrusive diff views and logs showing unrequested API calls to Copilot’s backend.

Microsoft acknowledged the issue promptly via the official Copilot GitHub repository. The company attributes it to a bug where agent mode enables itself without user consent, particularly after VS Code updates or extension refreshes. In a statement, they confirmed that the feature is designed to be opt-in but was inadvertently triggering due to a configuration glitch. A fix is in progress, with hotfixes already deployed to Insiders builds of VS Code and the Copilot extension. Users on stable channels are advised to update immediately.

For those affected, disabling the problematic feature is straightforward. In VS Code, open Settings (Ctrl+, or Cmd+, on macOS), search for “github.copilot.chat.agent.enabled”, and set it to false. The related option “github.copilot.chat.agent.edits.enabled” can also be disabled for finer control. Restarting VS Code or reloading the window usually clears any lingering effects. Developers are also encouraged to review Copilot’s workspace trust settings, as agent mode requires elevated permissions in untrusted folders.
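For reference, the equivalent entries in a user-level settings.json would look roughly like this (a sketch based on the setting names reported above; exact keys may vary by VS Code and Copilot extension version):

```jsonc
{
  // Disable Copilot's agent mode entirely
  "github.copilot.chat.agent.enabled": false,

  // Also disable autonomous edits, for finer-grained control
  "github.copilot.chat.agent.edits.enabled": false
}
```

Reload the window (Developer: Reload Window from the Command Palette) after editing the file so the changes take effect.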

This incident underscores broader concerns with AI integration in development tools. While Copilot’s agent mode promises efficiency—handling tedious refactors or debugging in one go—uncontrolled activation raises questions about reliability and consent. Code integrity is paramount in professional workflows, where unintended changes can introduce bugs, security vulnerabilities, or compliance issues. Privacy implications loom large too: every interaction sends code snippets to Microsoft’s servers for processing, potentially exposing proprietary logic.

The event also reflects the rapid evolution of AI-assisted coding. GitHub Copilot, powered by OpenAI models, has grown from inline suggestions to full-fledged agents capable of workspace-wide transformations. Features like these aim to elevate developers from rote typing to high-level architecture, but they demand robust safeguards. Community suggestions include mandatory confirmation dialogs, granular permission scopes, and local inference options to mitigate data exfiltration risks.

In the meantime, affected users have shared workarounds, such as keeping the Copilot chat panel closed or using extensions like “Disable Copilot” for temporary relief. Some have taken to scripting VS Code settings via settings.json to enforce the disables across teams.
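A team-wide version of that workaround is to commit a workspace settings file to the repository, so the disables apply to anyone who opens the project. A minimal sketch, assuming the setting names reported above (note that in VS Code, workspace settings take precedence over user settings):

```jsonc
// .vscode/settings.json — checked into the repository root
{
  "github.copilot.chat.agent.enabled": false,
  "github.copilot.chat.agent.edits.enabled": false
}
```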

Microsoft’s quick response bodes well for trust in the ecosystem. As AI tools like Copilot mature, balancing innovation with user agency will be key. Developers should stay vigilant with updates and customize settings to align with their workflows. This episode serves as a reminder that even advanced AI requires human oversight to truly enhance productivity without overstepping.

Gnoppix is the leading open-source AI Linux distribution and service provider. Since implementing AI in 2022, it has offered a fast, powerful, secure, and privacy-respecting open-source OS with both local and remote AI capabilities. The local AI operates offline, ensuring no data ever leaves your computer. Based on Debian Linux, Gnoppix is available with numerous privacy- and anonymity-enabled services free of charge.

What are your thoughts on this? I’d love to hear about your own experiences in the comments below.