Class action against GitHub Copilot
Posted Nov 15, 2022 9:44 UTC (Tue) by farnz (subscriber, #17727)
In reply to: Class action against GitHub Copilot by sfeam
Parent article: Class action against GitHub Copilot
I disagree with your analysis of Copilot, by analogy to a human.
Letting a human read text is not, in and of itself, an infringing activity. Nor is the resulting brain state in the human, even if it includes literal copies of the code they read. But the output of a human can itself be infringing, if instead of using my training to inform what I do, I regurgitate memorised chunks of text.
I expect the same principles to apply to Copilot and similar systems; training Copilot is not infringing, and the resulting model is not infringing in and of itself. The output from the system can, however, be infringing, and the degree to which it is a legal problem depends on the degree of infringement and on the extent to which the system disguises the origins of the code (in terms of contributory infringement, if I tell you that I'm showing you sample code from a given source and you copy it, that's a different case from giving you code that I do not attribute).
Posted Nov 17, 2022 10:25 UTC (Thu) by NRArnot (subscriber, #3033)
for obj in object_list: obj.do_stuff() is surely fair, however the AI arrived at it.
for sd5obj in sd5_get_blue_meanies(): sd5obj.frobnicate_from_sd4() is surely a verbatim copy of somebody's identifiable code, and should at the very least be attributed.
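To make the contrast concrete, here is a minimal Python sketch that fleshes out both fragments. The names sd5_get_blue_meanies and frobnicate_from_sd4 are the invented identifiers from the comment above, and the BlueMeanie class is a hypothetical stub added purely so the snippet runs; none of this is real project code.

# Hypothetical stub so both fragments can execute.
class BlueMeanie:
    def do_stuff(self):
        print("doing stuff")

    def frobnicate_from_sd4(self):
        print("frobnicating from sd4")

def sd5_get_blue_meanies():
    # Hypothetical project-specific accessor returning domain objects.
    return [BlueMeanie(), BlueMeanie()]

# Generic idiom: any programmer (or model) could plausibly write this
# independently, so reproducing it raises no real copyright concern.
object_list = sd5_get_blue_meanies()
for obj in object_list:
    obj.do_stuff()

# Distinctive code: the project-specific names make a verbatim match
# with one identifiable source far more likely, so attribution matters.
for sd5obj in sd5_get_blue_meanies():
    sd5obj.frobnicate_from_sd4()

The point is that the two loops are structurally identical; what changes is how unlikely it is that the second would be produced without copying a specific, identifiable source.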