Are you a developer, or do you work with GitHub daily? Imagine that every line of code you type could be used to improve artificial intelligence. Starting April 24, 2026, this becomes a reality for many GitHub users, changing the way your contributions are handled.
The 3 must-know facts
On March 24, 2026, a blog post published by GitHub revealed that user interactions with Copilot — what users type, accept, or modify — will become potential training data for artificial intelligence. This decision aligns with Microsoft’s strategy of improving Copilot with a large corpus of code and real developer behavior.
Copilot Free, Pro, and Pro+ accounts are the main targets of this change. Copilot Business and Enterprise accounts, however, are not affected, as their contracts already include specific clauses on data usage. Students and teachers using Copilot are also excluded from this collection.
GitHub has chosen not to offer a strict opt-in, but rather an opt-out. To decline this data collection, users must go to their settings, open the Copilot section, and disable the option “Allow GitHub to use my data for AI model training.”
For users residing in the European Economic Area and the United Kingdom, GitHub invokes legitimate interest as the legal basis for this data collection. This justification is intended to satisfy GDPR requirements, although an opt-out approach is more typical of American practices than of European ones.
GitHub, owned by Microsoft, has long sought to innovate in software development. With the integration of Copilot, a tool powered by artificial intelligence, the platform has transformed the way developers interact with code. This technological advance, however, comes with data privacy challenges, a sensitive topic for users. Microsoft, with its broad technology portfolio, is no stranger to such discussions, having already navigated similar controversies with flagship products such as Windows and Azure. GitHub’s competitors, such as GitLab and Bitbucket, will be watching these developments closely, seeking to leverage user concerns to attract new users.