Hacker News
An experiment to test GitHub Copilot's legality (seirdy.one)
38 points by modinfo on July 2, 2022 | 8 comments


An experiment I like better:

- Gain research access to the Windows source code. (This isn't hard)

- Train an ML model on the Windows source code

- Release said model, so that anyone can recreate Windows source files

It's not clear what happens next, but hilarity ensues.


I'm sure there's less of it, but training on Copilot's own source code would have a more delicious irony.


(Disclaimer: not a lawyer, so I may be wrong here.)

There's a third outcome not mentioned here, though: the author loses both cases, but Microsoft's Copilot is still legal. That's because there's a major difference between releasing a product built from AI-generated code and Microsoft releasing the AI itself. Copilot might generate code derived from GPL-licensed sources, but it's up to the developer to ensure it hasn't before releasing their product.

The precedent Copilot sets is that you can train an AI to generate copyleft code, but that doesn't mean you get to release what it generates.


Microsoft doesn't come away a winner in that scenario either. They're left with a product that can infect their customers' code with an unwanted license, not to mention questions about the ethics of reproducing code with such fidelity that it triggers the license in the first place.


> They're left with a product that can infect their customers' code with an unwanted license

Which is why the company I work for has banned Copilot (a decision I agree with).


Thanks MS for single-handedly breaking open source. You've done a huge disservice to everyone.


I think most of us cognizant of Microsoft's history expected they'd try the ol' Embrace Extend Extinguish approach on software freedom, but I can't say I anticipated exactly how wacky their attempts would end up being.


They just found a way to cure "cancer".



