GitHub introduces Copilot for Business, with admin controls • The Register

GitHub has launched a business version of its assistive programming service Copilot that provides administrators with a way to prevent suggestions using public source code.

For $19 per user per month, companies can deploy Copilot for Business with the assurance that they can prevent the underlying machine learning model from offering autocompletions based on code that can be found online.

“You can easily set policy controls to enforce user settings for public code matching on behalf of your organization,” explains Shuyin Zhao, senior director of product management, in a blog post.
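
The control itself lives in the organization’s Copilot settings. Purely as an illustration of how an admin might confirm the organization-wide setting programmatically, here is a minimal Python sketch; the /orgs/{org}/copilot/billing endpoint and the public_code_suggestions field are assumptions based on GitHub’s REST API rather than details given in this article, and the organization name and token are placeholders.

```python
# Illustrative sketch only: read back an organization's Copilot settings.
# The endpoint path and the "public_code_suggestions" field are assumptions,
# not details confirmed by this article.
import os

import requests

ORG = "example-org"  # placeholder organization name
TOKEN = os.environ["GITHUB_TOKEN"]  # token with Copilot billing/admin access

resp = requests.get(
    f"https://api.github.com/orgs/{ORG}/copilot/billing",
    headers={
        "Authorization": f"Bearer {TOKEN}",
        "Accept": "application/vnd.github+json",
    },
    timeout=10,
)
resp.raise_for_status()
settings = resp.json()

# Assumed values: "allow", "block", or "unconfigured".
print("Suggestions matching public code:", settings.get("public_code_suggestions"))
```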

According to Microsoft-owned GitHub, about one percent of suggestions potentially contain code snippets in excess of 150 characters that match training set code – which was culled from public online source code under a variety of software licenses.
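
GitHub hasn’t described how that matching is implemented. Purely to illustrate the 150-character threshold mentioned above, the following Python sketch flags a suggestion when it shares a contiguous run of at least 150 characters with any snippet in a small “public corpus”; it is a naive stand-in, not GitHub’s filter.

```python
# Naive illustration of the ~150-character matching threshold described above.
# This is not GitHub's implementation; it simply checks whether a suggestion
# reproduces a long contiguous run of characters from known public code.
from difflib import SequenceMatcher


def longest_common_run(a: str, b: str) -> int:
    """Length of the longest contiguous block of characters shared by a and b."""
    m = SequenceMatcher(None, a, b).find_longest_match(0, len(a), 0, len(b))
    return m.size


def matches_public_code(suggestion: str, public_corpus: list[str], threshold: int = 150) -> bool:
    """Flag a suggestion that shares >= threshold contiguous characters with public code."""
    return any(longest_common_run(suggestion, snippet) >= threshold
               for snippet in public_corpus)


# Example: a suggestion copied verbatim from a long public snippet is flagged.
public_snippet = "def quicksort(xs):\n    if len(xs) <= 1:\n        return xs\n" * 4
print(matches_public_code(public_snippet, [public_snippet]))  # True
print(matches_public_code("x = 1\n", [public_snippet]))       # False
```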

This feature, a public code filter, is already available to individual users, who pay $10 per month for Copilot’s AI help. But for corporate accounts, control over this filter belongs to the IT administrator.

Copilot for Business is available to customers with GitHub Enterprise Cloud licenses, but it’s not the same offering. It provides centralized management of Copilot user licenses – desirable for managing usage and payment across a large team of developers.

Copilot for Business comes with a commitment that GitHub “won’t retain code snippets, store or share your code regardless if the data is from public repositories, private repositories, non-GitHub repositories, or local files.”

So in theory, business customers can rest assured that their super-secret, money-minting algorithm won’t get sent to GitHub for product improvement.

Copilot for Business, however, does transmit “engagement data”: events related to editing actions (e.g. completions accepted or dismissed), errors, and data like latency and feature use, including potentially personal data such as pseudonymous identifiers.
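
The exact schema for that telemetry isn’t spelled out. As a rough, hypothetical illustration of the kind of record such engagement data might comprise – the field names below are invented for this sketch and are not GitHub’s – consider:

```python
# Hypothetical illustration of "engagement data" of the kind described above:
# editing events (completions accepted or dismissed), errors, latency, feature
# use, and a pseudonymous identifier. Field names are invented for this sketch
# and do not reflect GitHub's actual telemetry schema.
import json
import time
import uuid
from dataclasses import asdict, dataclass
from typing import Optional


@dataclass
class EngagementEvent:
    event_type: str        # e.g. "completion_accepted" or "completion_dismissed"
    feature: str           # which Copilot feature was in use
    latency_ms: float      # time from request to suggestion being shown
    error: Optional[str]   # error message, if any
    pseudonymous_id: str   # stable but non-identifying user token
    timestamp: float       # event time (Unix epoch seconds)


event = EngagementEvent(
    event_type="completion_accepted",
    feature="inline_suggestion",
    latency_ms=182.4,
    error=None,
    pseudonymous_id=str(uuid.uuid4()),
    timestamp=time.time(),
)
print(json.dumps(asdict(event), indent=2))
```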

It isn’t clear whether Copilot for Business’s promise to disregard the code suggestions it generates will deny GitHub data that could be used to improve future output. But it may diminish concerns that the code coughed up by Copilot could invite copyright infringement or software licensing claims.

Lawsuit

Microsoft, GitHub, and OpenAI – maker of the Codex model upon which Copilot is based – have already been sued on that basis.

In November, lawyer and developer Matthew Butterick announced a lawsuit challenging Copilot, described as “an AI product that relies on unprecedented open-source software piracy.” The lawsuit alleges that by training Copilot on public GitHub repositories, the defendants have violated the legal rights of numerous developers based on the terms of various open source software licenses.

But GitHub, aware that its enterprise customers might be put off by uncertain legal risk, makes a standing offer in its GitHub Copilot Product Specific Terms to defend corporate clients against infringement claims based on Copilot output.

“GitHub will defend you against any claim by an unaffiliated third-party that your use of GitHub Copilot misappropriated a trade secret or directly infringes a patent, copyright, trademark, or other intellectual property right of a third party, up to the greater of $500,000.00 USD or the total amount paid to GitHub for the use of GitHub Copilot during the 12 months preceding the claim,” the enterprise customer agreement says.

There are some caveats. GitHub won’t ante up if: the allegedly infringing code differs from what Copilot suggested; “you fail to follow reasonable software development review practices designed to prevent the intentional or inadvertent use of Code in a way that may violate the intellectual property or other rights of a third party”; or you fail to enable GitHub’s code filtering features.

Individual Copilot users and Copilot for Business customers not under enterprise accounts will have to face any legal action on their own – if it comes to that. Whatever the case, GitHub makes clear that Copilot users are responsible for vetting any suggested code for security and lawfulness.

Asked to comment on whether Copilot for Business addresses the concerns raised in the lawsuit, a GitHub spokesperson told The Register in an email, “We’ve been committed to innovating responsibly with Copilot from the start, and will continue to evolve the product to best serve developers across the globe.”

Matthew Butterick, the plaintiff in the case against Microsoft, GitHub, and OpenAI, told The Register in an email that Microsoft has not yet responded to the lawsuit and that he does not consider organizational policy controls to be relevant to his claim. ®