Is it free software?

Posted Feb 20, 2026 17:58 UTC (Fri) by rgmoore (✭ supporter ✭, #75)
In reply to: Is it free software? by NYKevin
Parent article: The Book of Remind

I think any kind of prohibition on AI training goes against freedom 1: freedom to study how the program works. That seems like exactly what someone is doing by feeding source code into their AI training system. In this case, it's an AI that's studying it to understand how it works rather than a human, but I don't think that makes any real difference. It's still about freedom to study the program. Denying some ways of studying the program violates freedom 1 just as much as denying use of the program to some kinds of users violates freedom 0.



Is it free software?

Posted Feb 20, 2026 20:21 UTC (Fri) by NYKevin (subscriber, #129325) [Link] (25 responses)

I agree with this take, but I would clarify that freedom 1 applies to the human who is training the AI, not to the AI itself. Training implicates freedom 1 because it is ultimately a method for *humans* to distill information about some corpus, not because current-generation AIs are directly entitled to assert the four freedoms on their own behalf.

(Of course, AGI will greatly complicate this picture if and when it arises, but I don't want to get into that here.)

Is it free software?

Posted Feb 20, 2026 22:44 UTC (Fri) by rgmoore (✭ supporter ✭, #75) [Link] (24 responses)

I don't think it really matters whether we count it as the AI that's doing the training or a human who's doing it; either way, feeding source code into an AI model to train it is covered by Freedom 1. I feel like the discourse around AI and copyright could be simplified a lot if you ask the question "would this be copyright infringement if a human did it". It provides a conceptual framework for thinking about the problem that automatically includes everything we already know about copyright.

Is it copyright infringement if an AI produces a large fragment of a copyrighted work verbatim or nearly verbatim? It would be if a human did that, so yes, it is. Is it copyright infringement if an AI company gets bootleg copies of copyrighted works to train its AI? Sure thing, since getting bootleg copies is copyright infringement no matter what you intend to use them for. Is it copyright infringement if an AI is trained on a legitimately obtained book? No, since a human who reads a legitimately obtained book isn't committing copyright infringement.

There's still a bit of a fine point about who is actually responsible. I think you're correct that until an AI is legally recognized as a person, it's the people around the AI who are guilty of infringement, not the AI itself. It's also important to be able to assign blame when the trainer and prompter are different people. Is it the trainer's fault for producing a machine capable of infringement or the prompter's fault for eliciting infringing output? Even there, I think there are existing legal doctrines (e.g. contributory infringement) that can cover it without having to make up a whole new copyright system for AI.

Is it free software?

Posted Feb 21, 2026 12:59 UTC (Sat) by draco (subscriber, #1792) [Link] (23 responses)

> I feel like the discourse around AI and copyright could be simplified a lot if you ask the question "would this be copyright infringement if a human did it".

I'd agree completely if these AIs were actually intelligent enough to obey copyright law on their own. (I think there's a legitimate philosophical question whether it's ok to delete a true artificial intelligence once you've created it, since that's the equivalent of killing it.) Since LLMs aren't actually intelligent and aren't trained in a way that makes it possible to inherently obey copyright law, I feel it gets a lot murkier.

Is it free software?

Posted Feb 21, 2026 22:50 UTC (Sat) by Wol (subscriber, #4433) [Link] (3 responses)

> I'd agree completely if these AIs were actually intelligent enough to obey copyright law on their own.

Are humans intelligent enough to obey copyright law on their own? Take Bach and Beethoven for instance - how much of their work that is commercially available is protected by copyright? Most of it?

> Since LLMs aren't actually intelligent and aren't trained in a way that makes it possible to inherently obey copyright law, I feel it gets a lot murkier.

Well, Natural Intelligence is trained by bumping off versions that get it wrong ... a major part of the problem with Artificial Intelligence is that there's no way of correcting it when it gets it wrong?

Cheers,
Wol

Is it free software?

Posted Feb 23, 2026 17:00 UTC (Mon) by rgmoore (✭ supporter ✭, #75) [Link] (2 responses)

> Take Bach and Beethoven for instance - how much of their work that is commercially available is protected by copyright?

Music is unusual because it has separate copyrights for the score and the performance. None of Bach's or Beethoven's original scores are still under copyright, but most of the performances (basically all of the ones using remotely modern recording technology) are still under copyright. So if you want to use Beethoven or Bach for your movie, you could avoid any copyright problems by using the original score (or a very old arrangement) and having it newly performed, either by yourself or as a work for hire.

Is it free software?

Posted Feb 23, 2026 22:14 UTC (Mon) by Wol (subscriber, #4433) [Link] (1 responses)

Don't forget that many modern scores are also under copyright.

So yes, if you find an old-enough score to copy you're safe, but if you go out and buy a score it's quite likely protected. It may not be the notes that are protected, but all the stuff around it, the typesetting, the font, the layout, etc etc.

If a human would have trouble getting all that right, I dread to think what a mess an AI would make of it ...

Cheers,
Wol

Is it free software?

Posted Mar 18, 2026 22:53 UTC (Wed) by sammythesnake (guest, #17693) [Link]

In those cases, it's the score as drawn, rather than the music it is a written form of. I need permission to make a photocopy of the score, but performing the music doesn't convey any element of the score that wasn't in the (public domain) music, so there's no copyright issue making a new recording of it.

Is it free software?

Posted Feb 21, 2026 23:24 UTC (Sat) by turistu (guest, #164830) [Link] (6 responses)

> I think there's a legitimate philosophical question whether it's ok to delete a true artificial intelligence once you've created it, since that's the equivalent of killing it.

There's nothing "philosophical" about that kind of nonsense.

Only people have rights, and they keep having all their rights no matter how stupid, feckless, machine-like repetitive, unimaginative and worthless they appear. Bots, corporations or other software or hardware contraptions have no rights.

Assuming that someone was ever able to build an intelligent machine that was not reproducible, you could have an argument for protecting it from deletion/"killing" in the same way that historical buildings, paintings, old books, or pieces of tapestry are protected. Not in the same way people are.

Is it free software?

Posted Feb 23, 2026 13:01 UTC (Mon) by kleptog (subscriber, #1183) [Link] (5 responses)

> Only people have rights, ...

It's philosophical because that statement is a belief that you hold. There is a school of thought called anthropocentrism that holds this view. But there are alternatives; many people believe that animals also have rights (e.g. protection from cruelty by humans).

You may not agree with it, but denying the existence of other viewpoints is pretty extreme.

For the time being, whether an AGI would have rights is philosophical, because no AGI actually exists yet.

Is it free software?

Posted Feb 23, 2026 20:32 UTC (Mon) by draco (subscriber, #1792) [Link] (4 responses)

I wasn't even thinking that far in my comment (regarding animals having some rights).

If intelligent aliens came to visit us in spaceships, would they not be [non-human] people too? They wouldn't share our genes, but they would be intelligent.

Is that enough? I'm pretty sure a lot of people would automatically assume so.

So...if we make machines as intelligent as that (or us), what about them?

Like you said, this is all hypothetical right now, since intelligent machines don't exist yet, no matter what the LLM boosters say. But eventually we'll figure out how to make them, and if we don't have good answers to these questions when that happens, we're frankly idiots, considering the considerable media discussing the possibility (and the consequences of ignoring the questions).

From what I've seen the last few years...I don't have high hopes. :-/

Is it free software?

Posted Feb 23, 2026 22:17 UTC (Mon) by Wol (subscriber, #4433) [Link]

> From what I've seen the last few years...I don't have high hopes. :-/

From what I've seen in the last few years, I have very "low" hopes of our ability to create anything remotely intelligent :-)

I don't think we'll have any trouble with the philosophical problems of deleting AIs at the moment :-)

Cheers,
Wol

Is it free software?

Posted Feb 23, 2026 23:46 UTC (Mon) by rgmoore (✭ supporter ✭, #75) [Link]

We already recognize the idea of legal persons who aren't natural human beings, like corporations, so it wouldn't be a stretch to grant personhood to other things. Of course, this makes the huge category mistake of behaving as if there were only one legal system with a single set of well-established rights. Each country has its own legal system, so it's completely possible that different places would recognize AGI (or extraterrestrial beings) differently.

Is it free software?

Posted Feb 24, 2026 14:52 UTC (Tue) by paulj (subscriber, #341) [Link] (1 responses)

> They wouldn't share our genes,

.... or would they?

;)

Asimov's robot series tackles robot sentience and equality in at least one story, "The Bicentennial Man". The story won a SF writing award. It was also made into a film. Some love it, some don't (it's a bit sickly sweet at the end), but it's good IMO.

Is it free software?

Posted Feb 24, 2026 15:12 UTC (Tue) by Wol (subscriber, #4433) [Link]

Or E.E. (Doc) Smith and the Lensman series. The Arisians, whether by design or accident, seeded the galaxy with "human-ness", so when the galactic collision that filled both galaxies with planets occurred, they mostly got seeded with Arisian-like life.

(Yes I know much of the science in the Lensman series is now known to be rubbish, but as Doc Smith said, he did his best not to break any of the scientific laws that were known/understood at the time. Much of his imagination has not withstood the march of Science :-)

Cheers,
Wol

Is it free software?

Posted Feb 22, 2026 14:22 UTC (Sun) by ptime (subscriber, #168171) [Link] (1 responses)

It’s not murder to kill mildew in my shower with bleach, even if I’m the one who let the mildew grow to begin with. Deleting a model file is no different than scrubbing away some nasty mildew, and we should celebrate those who do.

Is it free software?

Posted Feb 23, 2026 20:14 UTC (Mon) by draco (subscriber, #1792) [Link]

Last I heard, no one ever claimed mildew was intelligent ("truly" or otherwise).

I never claimed LLMs were intelligent, or anything near it, so I don't see how your comment is relevant.

Is it free software?

Posted Feb 23, 2026 11:31 UTC (Mon) by farnz (subscriber, #17727) [Link] (9 responses)

There's also a philosophical question about whether the current round of AIs should be responsible for obeying copyright law on their own, or whether they should be treated as tools like text editors and Napster.

If they're responsible for obeying copyright law, then it becomes a requirement on AI vendors to make them intelligent enough to obey copyright law on their own. That is obviously something that AI vendors will push back on.

If they're a tool, like a text editor or Napster, then it comes down to how it's used; if it's "clearly" intended to permit copyright infringement (as the courts in the USA found that Napster was), then the vendor is liable. Otherwise, the user is liable for the resulting infringement - and the first time there's an AI user who loses a large sum of money to a court judgement of infringement, the AI vendors face scary press.

The third option is the one the AI vendors want, because it's good for them: AI can't obey copyright law by itself, so isn't responsible for doing so, but also is sufficiently transformative of the input that the output of an AI tool is a new work, not a derivative of any past work and thus the AI user cannot lose a large sum of money to a court judgement.

Note, too, that if you're not able to pursue copyright infringement claims, you effectively don't have copyright protection - there's no practical difference between "farnz copied your work in full, and you didn't pursue" and "farnz copied your work in full, and you lost when you pursued".

Is it free software?

Posted Feb 23, 2026 11:41 UTC (Mon) by paulj (subscriber, #341) [Link] (6 responses)

This may be one of the biggest impacts ABNI has on society: making the carefully constructed walled gardens of the copyright cartels obsolete, by using the slice-and-dice ABNI machines to get around copyright. (Yes, copyright also gave some protection to "the little guys", though their ability to access justice was limited without deep pockets, and overall the large and powerful built systems around copyright that protected themselves and punished others.) Whether ABNI in principle /really/ gets around copyright or not may not be relevant; what matters is that there are very large, well-resourced, well-connected entities who /want/ ABNI to get around copyright, and so it probably will.

Is it free software?

Posted Feb 23, 2026 15:01 UTC (Mon) by mathstuf (subscriber, #69389) [Link] (3 responses)

Could you expand "ABNI" please? I'm not finding anything it stands for that matches the conversation here.

Is it free software?

Posted Feb 23, 2026 16:05 UTC (Mon) by paulj (subscriber, #341) [Link] (2 responses)

ABNI -> "Artificial But Not Intelligent".

ABI -> "Anything But Intelligent" was another contender, but it has a predefined meaning, at least amongst the kinds of technical people who read LWN.

Is it free software?

Posted Feb 23, 2026 16:28 UTC (Mon) by Wol (subscriber, #4433) [Link]

My favourite - Artificial Idiocy :-)

Cheers,
Wol

Is it free software?

Posted Feb 23, 2026 17:14 UTC (Mon) by rgmoore (✭ supporter ✭, #75) [Link]

I've also seen PI ("Pseudo Intelligence") used in science fiction. It seems like a good term because it gets across the idea that it's an attempt to create intelligence, but doesn't credit it as actually intelligent. Unfortunately, those initials have other established meanings and might trip people up.

Possible impacts of AI vendors

Posted Feb 23, 2026 15:38 UTC (Mon) by farnz (subscriber, #17727) [Link] (1 responses)

There is a much more likely outcome on that path - AI is deemed a tool capable of being used for infringement, but not automatically infringing (so it's case-by-case "did this output infringe"), and the big AI vendors get round this by paying the cartels a "permission fee" for "minor" infringement, while the big companies continue to sue over blatant infringement, and "the little guys" get steamrollered because they aren't big enough to threaten the AI vendors' income streams.

As long as the AI vendors can buy their way out of trouble for reasonable money, they will do so. And there's a route to that where the AI vendors pay to cover "accidental" infringement (e.g. a very Star Wars looking "robot flying a spacecraft"), but not infringement where it's "obvious" from the prompt that the user wanted the output to infringe (e.g. "an R2 series astromech droid flying an X-wing starfighter"), where the AI vendor instead forwards prompt, user details, and output to the big copyright holders so that they can pursue the users for infringement.

And that's a situation where the "little guys" get steamrollered. If you can prove that something is infringing on your copyright, then the fact it was made with Google Gemini using Nano Banana doesn't protect the person who used it from legal consequences, but you first have to find out that this is happening, and then take action. Meanwhile, the owners of the Star Wars copyrights (currently Disney) get a fast track to discovering that people are creating infringing material that's derived from their copyrighted material, and all the information they need to sue people on a platter, plus an income stream to cover cases where the AI vendors didn't notice that they were infringing, where the AI vendor is indemnified.

Possible impacts of AI vendors

Posted Feb 23, 2026 16:16 UTC (Mon) by paulj (subscriber, #341) [Link]

Yes, that seems likely.

The "big guys" in the AI and content fields will continue to find ways to have their interests protected. I do think the AI players will have more influence, and there will be some tilting of how copyright works in favour of the interests of the pushers of the ABNI slice-and-dice machines. There may will be a private arrangement of a truce between the major players in those 2 arenas - by blanket licenses (which will of course indenmify the ABNI slice-and-dicers fully, but not necessarily the users - agreed) - agreed.

The "little guys" will be screwed by both though.

The major content cartels will use ABNI to minimise the need for digital artists, re-using all the little bits of stuff that were fed into the slice-and-dicers and recombined - some of which came from smaller content creators. The major ABNI pushers will make money selling their slice-and-dice machines, and their recombination of humanity's prior work, to all.

It will be interesting to see how the next generation navigate this world.

Is it free software?

Posted Feb 23, 2026 16:42 UTC (Mon) by rgmoore (✭ supporter ✭, #75) [Link] (1 responses)

> The third option is the one the AI vendors want, because it's good for them: AI can't obey copyright law by itself, so isn't responsible for doing so, but also is sufficiently transformative of the input that the output of an AI tool is a new work, not a derivative of any past work and thus the AI user cannot lose a large sum of money to a court judgement.

I don't think that's going to work out. There are really two separate questions: is the AI itself a derivative work, and is the output of the AI a derivative work? This is where considering the parallel situation of a human being doing the same thing is instructive. A human being is allowed to learn from existing copyrighted works without problems, but there are standard legal tests to determine whether their work is infringing. The same thing should be true of AI. The AI itself is almost certainly sufficiently transformative that it isn't a derivative of its training corpus, but individual outputs can infringe copyright according to any of the standard tests. One minor wrinkle is that some of the standard tests are easier to prove if the alleged copier had access to the work they're accused of infringing, and that's probably going to be easier to prove with an AI than with a human. I wouldn't be surprised if AI companies just stipulate that their AI had access, to avoid having to divulge all their training material.

Is it free software?

Posted Feb 23, 2026 16:49 UTC (Mon) by farnz (subscriber, #17727) [Link]

I agree that it's the least likely to work out - that's why it's a non-answer to the question I posed of "whether the current round of AIs should be responsible for obeying copyright law on their own, or whether they should be treated as tools like text editors and Napster".

It's instead an option that they'd like, but that basically says "the question doesn't need answering, because it's neither a tool nor responsible for obeying the law". In practice, I think the models and weights will be deemed non-infringing (by legislative fiat if needed), but the outputs will be potentially infringing, and the law will have to find an equilibrium.


Copyright © 2026, Eklektix, Inc.
Comments and public postings are copyrighted by their creators.
Linux is a registered trademark of Linus Torvalds