BitTorrent is certainly not a good example to follow, but I do think that Copilot is more in the wrong.

They should definitely include disclaimers and make seeding opt-in (though I don't know how safe you are legally when you download a Lion King copy labeled Debian.iso). That said, they don't have the information necessary to tell whether what you're doing is legal or not.

Copilot _has_ that information. The model spits out code that it read. They could disallow publishing or commercially using code generated by it while they sort this out, but they made the decision not to.

AI is hard, but the model is clearly handing out literal copies of GPL code. GitHub knows this, and they still don't tell you about it when you click install.




It doesn't matter whether the information is there or not, since an algorithm cannot commit a copyright violation. There is at least one human involved, and the human is the one who is responsible.

A car has all the information that it's going faster than the speed limit, or that it just ran a red light. But in the end it's the driver who is responsible. It's not the tool (car, Copilot) that commits the illegal act, it's the user using that tool.


In the case of Copilot, you don't even have a speedometer.


So your point is that removing the speedometer from your car and then claiming "I didn't know I was driving too fast!" will make it somehow not your responsibility?

It is still your responsibility to know and obey the traffic laws, the same as it is your responsibility to obey the copyright laws.



