Quotes of the week
I've seen the kinds of things older LLMs produced and they looked like garish copy/pastes from Stack Overflow. What I've seen with current stuff looks much more like what happens when the patterns from reading a lot of code (and what's been written about code) get generalized. And this seems to be the point of software development (especially Open Source): learn from what everyone is doing to collectively improve the state of the art. One cannot cut/paste between mismatched licensed projects, but what one learns from the projects is portable.

Anyway, I Am Not A (Copyright nor Patent) Lawyer. I think the tools are finally getting interesting, and I'm a pragmatist. :)

— Kees Cook
Developers have been using tools to help their work for decades, starting with Emacs macros, perl and sed scripts, and then on to Coccinelle, and more recently Large Language Models. So in many ways, this is nothing new.

— Ted Ts'o
My point here is that AI can now add questions that maintainers can't answer. Is it really legal? Can the maintainer trust it? Yes, these too can fall under the "technical reasons" but having a clear policy that states that a maintainer may not want to even bother with AI generated code can perhaps give the maintainer something to point to if push comes to shove.

— Steve Rostedt