Open source and the Cyber Resilience Act
The European Union's Cyber Resilience Act (CRA) has caused a stir in the software-development world. Thanks to advocacy by the Eclipse Foundation, Open Source Initiative, Linux Foundation, Mozilla, and others, open-source software projects generally have minimal requirements under the CRA — but nothing to do with law is ever quite so simple. Marta Rybczyńska spoke at Linaro Connect 2025 about the impact of the CRA on the open-source ecosystem, with an emphasis on the importance of understanding a project's role under the CRA. She later participated in a panel discussion with Joakim Bech, Kate Stewart, and Mike Bursell about how the CRA would impact embedded open-source development.
Rybczyńska is not a lawyer. She's a security professional and a developer, but
"we cannot leave law to the lawyers
". A company in need of legal advice
should go to its lawyer; for the rest of us, we have to rely on
summaries from interested non-lawyers, or our own research.
The CRA has already become law, but does not come completely into force until
2027, Rybczyńska said. Some provisions start earlier than others; as of
September 2026, vendors will need to report exploited vulnerabilities.
"Basically everything
" is affected: any software
or hardware that is or can be connected to the Internet and is sold in Europe.
There are specific exceptions for web sites, for
products with existing regulations, and for hobby projects (including many
open-source projects). Open-source stewards, organizations that guide an
open-source project but don't qualify as manufacturers, also have reduced
requirements.
![Marta Rybczyńska](https://static.lwn.net/images/2025/marta-rybczynska-linaro-small.png)
So, if hobby projects are an exception to the law, why does anyone without access
to a corporate legal team need to care? Rybczyńska laid out two possible
futures: either CRA compliance becomes another regulation for lawyers to work
around with paperwork, self-assessments, and calculated risks of being caught,
or software developers take the opportunity that the CRA offers to persuade companies to
employ the best practices "that engineers have always wanted."
If someone is simply a developer of open-source software, which they don't monetize, they have no obligations under the CRA. But they can help vendors who do have those obligations choose real change over paperwork-only "compliance" by having a clear reporting channel for security vulnerabilities and a way to announce to users when those vulnerabilities are discovered. This helps consumers, but another provision of the law directly helps the open-source project itself. Manufacturers that monetize their products are legally responsible for all included software in their products, even if it's open source. If a manufacturer uses 1,000 open-source projects, it is responsible for fixing bugs in those 1,000 projects, Rybczyńska said.
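What such a reporting channel looks like is up to the project. As one possible illustration (the addresses and URLs below are placeholders, and nothing in the CRA mandates this particular format), a project with a web presence could publish an RFC 9116 security.txt file pointing at its preferred contact and disclosure policy:

```
# Example .well-known/security.txt (RFC 9116); all values are placeholders.
Contact: mailto:security@example.org
Expires: 2026-12-31T23:59:59.000Z
Policy: https://example.org/security-policy
Preferred-Languages: en
```

A SECURITY.md file in the repository or a dedicated security mailing list serves the same purpose; the point is that reporters, and the vendors shipping the code, can find the channel without guessing.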
Historically, companies have often demanded security fixes from open-source
projects. The CRA inverts that relationship: companies are required to fix
security problems in the open-source software they use, and report security problems to the
upstream project. This obligation lasts for
the entirety of the CRA's support period, five years after a consumer buys
the end product. The companies are, unfortunately, not required to actually
share their bug fixes (except as compelled to do so by a project's license)
— but if an open-source project makes it easy to do so,
they can likely be convinced to contribute back, if only so that they don't have
to maintain a fix out-of-tree.
[As pointed out in a comment, the CRA does
actually require companies to share bug fixes with the upstream project.]
That isn't the only obligation companies have under the CRA, Rybczyńska continued. Companies will also be required to report security incidents to the government, and perform a risk analysis of their software-development process, although the CRA doesn't mandate a framework to perform that risk analysis. It does require companies to use encryption for user data, encrypted communication, and mechanisms to ensure code integrity, such as signed images, in their products.
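The CRA does not prescribe how code integrity must be achieved. As a rough sketch of the signed-image idea (an illustration only, not a description of any particular product's scheme), a vendor could ship a detached signature alongside each image and have the updater verify it before installation; the example below uses the Python cryptography library, with a throwaway key and placeholder image bytes so that it runs on its own:

```python
# Sketch of signed-image verification with Ed25519, using the third-party
# "cryptography" package (pip install cryptography). Key handling and image
# contents are placeholders for illustration only.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# In a real pipeline the private key would live in an HSM or signing service;
# a throwaway key pair keeps this example self-contained.
signing_key = Ed25519PrivateKey.generate()
verify_key = signing_key.public_key()

firmware_image = b"placeholder firmware image contents"

# Build side: produce a detached signature that ships next to the image.
signature = signing_key.sign(firmware_image)

# Device side: accept the image only if the signature verifies against the
# public key baked into the bootloader or updater.
try:
    verify_key.verify(signature, firmware_image)
    print("signature OK, image accepted")
except InvalidSignature:
    print("signature mismatch, image rejected")
```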
Rybczyńska finished her talk by inviting people again to consider the two possible worlds. Open-source developers can ignore the CRA, in which case companies will likely stick to working around the CRA with paperwork, or fixing bugs without sharing. Or open-source developers can embrace the CRA, make it easy for corporate users of their software to contact them with information about vulnerabilities, cooperate with risk analyses, and receive an army of paid engineers to fix security-related bugs for them.
Discussion
Bech, an employee at Linaro,
led a later panel discussion about the CRA with Rybczyńska, Stewart, and
Bursell. Stewart works at the Linux Foundation on dependable embedded systems;
she gave
a related talk earlier in the week. Bursell
serves as the executive director of the
Confidential Computing
Consortium.
Bech opened with a simple question for
Rybczyńska: "Marta, if I'm a small business, what should I do?
" Her
answer was: "Figure out who you are, under the CRA
". Manufacturers,
open-source stewards, and contributors all have different obligations, she
explained. Bursell added that there are specific provisions for small businesses
as well, so company size can also play a role.
"If you fancy going to sleep one night,
reading the CRA is a great way to do
that,
" he said. Rybczyńska and Stewart disagreed, saying that the law has
many interesting parts. Stewart was particularly interested in the
classification of operating systems as "important" components.
Bursell briefly explained about the different levels of products defined in the CRA (in paragraphs 43 through 46, primarily). By default, products can be self-certified for compliance; their manufacturers only need to provide supporting materials on request. "Important" products, a category that includes everything from baby monitors to operating systems, are held to a higher standard, and may need to have paperwork filed in advance. "Critical" products are the highest category, with additional compliance obligations. He advised people to err on the side of caution, or ask the EU for clarification if unsure about the status of a specific product.
A concern that applies regardless of product classification, however, is the
mandate that companies which sell a product retain documentation about its CRA
compliance for 10 years. Rybczyńska urged everyone to generate that
documentation in advance and save it in a safe place; trying to come up with a
software bill of materials (SBOM) at the time of a request is likely to be problematic.
Stewart agreed, saying: "Yeah, don't do that."
Bech asked what kind of documentation was covered by the requirement. Rybczyńska
gave a long list: processes for software development, evidence that they were
followed, a product's SBOM, and a complete history of security updates for the
product. She emphasized that companies should really have a complete history of
security updates for their products already. "We all know many cases where
something went wrong in a product after a sequence of updates; if you don't have
them, you can't debug."
Stewart advised that companies should be generating a new SBOM along with each of those security updates, as well, which can help with reproducing problems. A lot of the challenges of CRA compliance will come during mergers and acquisitions, she said, when trying to reconcile processes for these things across companies. Stewart was also worried about the relationship between datasets and trained machine-learning models, which the CRA doesn't cover. Rybczyńska agreed, noting that machine-learning models are increasingly used in security-critical applications such as firewalls.
Bech asked the panel members what they thought about the requirement that
companies provide security fixes for their dependencies — "won't that result
in a kind of fragmented 'fixed' ecosystem?" Rybczyńska agreed that it could
happen, but called it an opportunity for vendors to review their whole supply
chain and minimize their dependencies, focusing on dependencies with good
security policies. If a company relies on abandoned projects, she said, that's
going to cause a nightmare eventually, so it's better to find that out up front.
In her opinion, the next thing SBOM tooling needs is a way to track projects'
security policies as well as their licensing requirements.
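As a rough illustration of both points, per-release SBOMs and recording a dependency's security policy next to its license, the sketch below emits a minimal CycloneDX-style document for a hypothetical product. The component names, versions, and URLs are invented, and a real SBOM would normally be produced by tooling rather than written by hand:

```python
# Minimal sketch of a per-release, CycloneDX-style SBOM, written by hand for
# illustration; all names, versions, and URLs are invented placeholders.
import json
from datetime import datetime, timezone

sbom = {
    "bomFormat": "CycloneDX",
    "specVersion": "1.5",
    "version": 1,
    "metadata": {
        # Regenerated for every release or security update, so the document
        # reflects what actually shipped.
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "component": {
            "type": "application",
            "name": "example-product",
            "version": "2.4.1",
        },
    },
    "components": [
        {
            "type": "library",
            "name": "libexample",
            "version": "1.0.3",
            # License tracking is what SBOMs are mostly used for today...
            "licenses": [{"license": {"id": "MIT"}}],
            # ...and an external reference is one place a pointer to the
            # upstream project's security policy or advisories could live.
            "externalReferences": [
                {"type": "advisories",
                 "url": "https://example.org/libexample/security"},
            ],
        },
    ],
}

with open("example-product-2.4.1.cdx.json", "w") as f:
    json.dump(sbom, f, indent=2)
```

Whether that kind of pointer belongs in the SBOM itself or in separate databases is exactly the sort of tooling question the panel went on to discuss.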
"I'd go further,
" Bursell asserted. If a vendor's product relies on an
open-source project, the company should be involved in the project's
development, or at least pay for its support, he said. He expressed the hope that the CRA
would push more companies in that direction. Bursell also wondered how much
information about the software running in a product's build environment, rather
than direct dependencies, the CRA requires.
Stewart answered that the CRA leaves that undefined, just requiring "an SBOM".
What exactly that means is not clear, with US, German, and Japanese agencies all
publishing different
definitions and requirements. Bech asked what parts of the definition were
missing from the CRA.
The industry currently focuses too much on documents, Stewart answered, which
provide a snapshot in time. The definition of an SBOM would ideally handle
keeping that information in a database. Rybczyńska added that the tooling simply
isn't there yet — there are multiple SBOM standards, multiple SBOM-generating
tools, and "you are expected to make sense of all of that".
One member of the audience asked whether the panelists thought that the CRA
would harm open-source adoption. Stewart said that the Linux Foundation had a
survey done which showed that 46% of manufacturers passively rely on upstream
projects for security fixes. "The way the CRA looks at it is, the people
making money should have skin in the game." Ultimately, she doesn't think
the CRA will hurt open source. Bursell also suggested that if an open-source
developer is worried about this, it's "a great chance to figure out who your users are".
Rybczyńska pointed out that under the CRA, open-source projects are not required to attach a "CE mark" to their project (the mark that claims compliance with the CRA's security requirements) — but nothing is stopping them from doing that paperwork voluntarily. If a project is concerned with increasing adoption, it could do the work to obtain a CE mark as a marketing tactic.
The panel session ended with a few more questions about the exact scope of the
CRA. The panelists clarified that it applies to "any product with a digital element", including firmware. Rybczyńska advised that if someone were really
concerned about whether they could get away with not complying for a specific
product, they should "speak to your lawyers, not three semi-experts".
It seems clear that the CRA is going to have an impact on the open-source ecosystem, if only because of its impact on the entire software sector. While open-source contributors don't have direct obligations under the CRA, they will still need to be aware of its implications in order to effectively guide their projects.
[Thanks to Linaro for funding my travel to Linaro Connect.]
Addendum: Videos of Rybczyńska's keynote and the panel discussion are available on YouTube, and slides for the talks are available on Linaro's website.
| Index entries for this article | |
| --- | --- |
| Conference | Linaro Connect/2025 |
Ugh, another avenue for emotional blackmail...
Posted Jun 5, 2025 19:13 UTC (Thu)
by pizza (subscriber, #46)
[Link] (9 responses)
It seems to me that unless your project has a legal organization standing behind it, you're better off not touching any of this without a 30.48-meter pole.
(I also wonder how exactly this is supposed to work with modern software ecosystems that routinely require directly pulling in dozens, if not hundreds, of relatively small dependencies....)
Posted Jun 5, 2025 19:58 UTC (Thu)
by Cyberax (✭ supporter ✭, #52523)
[Link]
But this is also a good channel to make money for some projects. You can provide the CE-marked version to downstream users for a fee.
Posted Jun 5, 2025 21:11 UTC (Thu)
by kleptog (subscriber, #1183)
[Link]
That, or there being a big repository of such documentation for lots of projects. But being near the source sounds preferable.
Posted Jun 5, 2025 22:49 UTC (Thu)
by Wol (subscriber, #4433)
[Link] (6 responses)
The article tells you, the law tells you ...
IF YOU'RE NOT *SELLING* IT, YOU'RE NOT LIABLE !!!
If you don't want to touch it with a ten foot barge pole, just don't sign a contract! No contract, no liability. End Of.
If you're routinely pulling tens or hundreds of dependencies, but haven't audited them, that's called negligence. Doesn't matter whether you're a tin-pot lone developer, or a mega-billion company.
What the CRA will do is turn those negligent projects into "here be dragons" zones. Yes it'll have the unfortunate side effect of effectively killing off many of them. But at the end of the day, do we really need 50 different open-source implementations of the same two-line utility that effectively just changes the variable names in the source?
And if it results in a bunch of Open Source Co-Operatives providing support and CE markings for those utilities, then that will put food on the tables of Open Source guys. So yes, I would, most certainly, be only too happy to touch this with a punt pole :-)
Cheers,
Wol
Posted Jun 6, 2025 7:22 UTC (Fri)
by taladar (subscriber, #68407)
[Link] (5 responses)
Counterpoint, if you are overly focused on the number of dependencies you are in denial about the fact that large dependencies have areas that bit-rot just as much and are just as unmaintained, just without a convenient way to talk about them individually and without an easy way to see when it was last updated. And, for that matter, without a convenient unit to audit one at a time and judge how much the current version diverges from the last one you (or someone you trust) audited.
And if you think you can just avoid using a dependency altogether and implement it yourself, then you are in denial about the fact that the same is true of your own code. Code isn't inherently better because it was written by your own past self; it bit-rots in your own code base just as it does in a dependency. In fact, what essentially amounts to an internal dependency with a single user is very likely less reviewed and audited, and has more undiscovered bugs, than an external dependency used by many people.
Posted Jun 6, 2025 9:20 UTC (Fri)
by dottedmag (subscriber, #18590)
[Link]
The end result was _shorter_ code: turned out implementing just the required subset of the functionality in-tree was easier than adapting to variegated conventions of dozens of upstreams.
At the end only the hard stuff remained as dependencies: cryptography, complicated algorithms and data structures, parsers.
Posted Jun 6, 2025 11:54 UTC (Fri)
by pizza (subscriber, #46)
[Link]
You have left out the fixed per-dependency/audit overhead.
One set of paperwork is a lot less work to manage, maintain, and update than several dozen (or an order of magnitude more).
> Code isn't inherently better because it was written by your own past self and bit-rots in your own code base compared to a dependency.
One key difference is that once audited, it's done, and it won't change out from underneath you and require a new audit. And it can also be part of the same audit (and paperwork) as your main body of code.
Posted Jun 6, 2025 15:00 UTC (Fri)
by nim-nim (subscriber, #34454)
[Link] (2 responses)
> And if you think you can just avoid using a dependency altogether and implement it yourself then you are in denial about the fact that the same is true about your own code.
That’s not the point. Delegation reduces accountability. Reduced accountability reduces quality. That’s true in all domains, not just IT; that’s why tenders for big projects forbid subcontracting below a certain level (usually, after painful failures).
And you can fail with one level of subcontracting just like you can fail with a handful of dependencies, and you can succeed with 10 levels of subcontracting just like you can succeed with hundreds of dependencies. However on average, all other things being equal, and humans being human and prone to pass the shit-can, you’re almost certain to be in deep trouble if you abuse the delegation amount.
Big dependencies versus many dependencies
Posted Jun 6, 2025 16:54 UTC (Fri)
by farnz (subscriber, #17727)
[Link] (1 responses)
The problem is that what looks from the outside like "one big dependency" is often internally "100 small dependencies sharing a repo and a single point of contact", with worse accountability issues than if it actually were 100 small dependencies, since you have the "everyone thinks somebody else is responsible for that module, but nobody will take responsibility".
As a result, "all other things" are not equal; the larger a dependency is, the more likely it is to be hiding subcontracting from you, with the resulting pain around too many layers of subcontracting.
There's an irreducible complexity here; each component part of the system needs maintaining. If you have 100 small dependencies, each of which contains one component part, then you have 100 relationships to maintain. If you have 10 bigger dependencies, then you see 10 relationships to maintain, but those 10 may well be simply managing 10 relationships themselves (and not maintaining anything), leaving you dependent on 110 relationships going well (the 10 you maintain, plus the 100 hidden sub-contractors).
If you're aware that this is what's happening, that can work better than directly managing 100 relationships; if you believe that you're reliant on 10 relationships, but there's actually 100 hidden relationships, that's where things can go horrifically wrong (since the 10 you think you're reliant on have reasons to lie about the relationships they're maintaining with their sub-contractors, in order to keep you happy).
Posted Jun 10, 2025 7:47 UTC (Tue)
by taladar (subscriber, #68407)
[Link]
Propaganda?
Posted Jun 5, 2025 20:02 UTC (Thu)
by clugstj (subscriber, #4020)
[Link] (3 responses)
Posted Jun 5, 2025 20:15 UTC (Thu)
by daroc (editor, #160859)
[Link]
Posted Jun 5, 2025 21:35 UTC (Thu)
by pbonzini (subscriber, #60935)
[Link]
Posted Jun 9, 2025 7:11 UTC (Mon)
by Jyx (subscriber, #97038)
[Link]
CRA paperwork for a fee impact on "hobbyist" status
Posted Jun 6, 2025 9:16 UTC (Fri)
by dottedmag (subscriber, #18590)
[Link] (19 responses)
Or doing it once for a fee would taint the project as "commercial"?
Posted Jun 6, 2025 11:48 UTC (Fri)
by pizza (subscriber, #46)
[Link] (18 responses)
Yeah -- the moment you start charging money, does everything change?
Posted Jun 6, 2025 12:05 UTC (Fri)
by Wol (subscriber, #4433)
[Link] (17 responses)
Depends on the terms.
The question is "Who provides the CE mark". If you charge a small maintenance fee which says you will fix things on a "best efforts" basis, then that's not good enough for CE. If your customer is happy with that, if they have resource they can divert to helping you, then they'll be happy to provide the mark.
If you have a proper company, they'll expect to pay a proper maintenance fee that says you take full responsibility for providing the mark.
Basically, how much are they paying you? What does the contract say? Are you providing "best efforts" or a guaranteed SLA?
If you're not prepared to include providing the mark as part of your service, it's down to your prospective customer whether they think it's worth paying you. Simples!
Cheers,
Wol
Posted Jun 6, 2025 12:27 UTC (Fri)
by dottedmag (subscriber, #18590)
[Link] (16 responses)
And can one provide CE mark only for paying customers?
Posted Jun 6, 2025 13:51 UTC (Fri)
by Wol (subscriber, #4433)
[Link] (15 responses)
You're missing the ENTIRE POINT of the CE mark system. ALL it does is provide a legal guarantee that you will stand behind your product.
If I make and sell something, I can't do it without a CE mark. There's nothing stopping me saying "I stand behind my product, I will self-issue my own CE mark". The point is, I'm providing a legal guarantee. And if I don't honour it, it's a breach of contract which can result in the regulator (NOT my customer) taking me to court.
So if my product includes your software, *I* need either (a) a support contract with you that says you WILL fix any bugs - which comes with your CE at which point you accepted *legal liability* for fixing any bugs, or (b) I take you at your word that you will fix bugs, but it's MY CE, and I'm on the hook if things go wrong. Which puts you in a strong position, you can just refuse to sign any contract unless I offer you £££ (to *your* satisfaction).
It's all about the BoM, and who is legally liable for any problems in any component. And from the FLOSS point of view, it's all about *stopping* companies copying random software off the internet and using the fact they didn't write it themselves, to disclaim responsibility when it breaks.
So it can all be summed up with the simple phrase "Issuing a CE means you accept legal liability for the product you *sell*". (And your supply chain accepts legal liability for the components they sold to you.)
Remember that Playstation 2 Linux debacle? A CE mark would probably have made that illegal - and Sony would have been in very hot water.
Cheers,
Wol
Posted Jun 6, 2025 14:22 UTC (Fri)
by dottedmag (subscriber, #18590)
[Link]
Posted Jun 6, 2025 14:26 UTC (Fri)
by dottedmag (subscriber, #18590)
[Link] (13 responses)
This ought to be "Issuing a CE means you accept legal liability for the product you sell, but only to the people you sell it to, not to the general public", right?
Posted Jun 6, 2025 15:08 UTC (Fri)
by farnz (subscriber, #17727)
[Link] (12 responses)
This gets complicated fast, but no, issuing a CE mark means that you accept legal liability for the quality of the product or component you've so marked, no matter who's using it, or who it's sold to.
With that said, there's nuance here. A CE mark is not a guarantee that it's up to standard for all possible uses someone might put it to, in all possible cases; rather, it's a promise that, within the scope of use that you could reasonably be expected to predict, it's up to standard. If you're selling a completed end-user product, then you're liable for all end-user type uses; if it's a component part, then you're liable for its quality when it's used for the purpose it's intended for, and integrated with reasonable care and attention to detail.
So, for example, my car's fuel filter is CE marked, and the manufacturer is liable for problems with the fuel filter being sub-standard, as long as it's been installed properly in a diesel-fuelled engine. It's a component part, so they're not liable for problems that occur if I misuse it to (say) filter cooking oil, instead of diesel, since that's using it for a purpose it wasn't intended for, nor are they responsible if I don't tighten it to spec, since that's a failure to integrate it with reasonable care and attention to detail. They are liable if I install it to specification, and because of a design or manufacturing flaw, it breaks apart and damages the engine it's attached to; however, even though it's integrated into my car, they're not liable for faults in other parts of the car unless I can show that they were caused by a fault in the fuel filter.
The CRA extends this line of thinking to digital goods; if your product is a completed product, then you're liable (to the world) for basic cybersecurity in your product. If it's a component, you're liable for flaws in the component when integrated correctly, but not (e.g.) for flaws caused by errors integrating it into the final product.
Posted Jun 6, 2025 15:20 UTC (Fri)
by dottedmag (subscriber, #18590)
[Link] (11 responses)
That's terrible.
Posted Jun 6, 2025 15:42 UTC (Fri)
by nim-nim (subscriber, #34454)
[Link] (1 responses)
And you can copy the non-CE code (provided the code licence allows it) and make it a CE product by taking up the maintenance obligations yourself (why you would want to do that instead of paying the original author is up to you).
So no you should not be liable to someone that has not signed a support contract with you. If he has signed this contract, and the contract says CE, you can not redefine your maintenance obligations lower than what the CE system requires.
Posted Jun 6, 2025 16:26 UTC (Fri)
by Wol (subscriber, #4433)
[Link]
To the point that the law explicitly forbids you from saying "not my problem" if you didn't yourself pay for it.
So yes, you're spot on. A CE cannot exist without a contract explicitly saying who is liable for what.
Cheers,
Wol
Posted Jun 6, 2025 16:28 UTC (Fri)
by farnz (subscriber, #17727)
[Link] (7 responses)
Like I said, there's nuance here.
You're not on the hook for "all the future secure updates"; there's a timeout after you've been paid for a version, after which you're not liable, and you're not liable for updates to versions that you never "placed on the market" (a term of art, here).
Additionally, everyone demanding updates from you can only demand updates that apply to the component as integrated into your paying customer's product. You are entirely entitled to refuse to supply a security update that's not relevant to your paying customer's product, and you're entirely entitled to refuse to supply it in a form other than that needed to integrate with your paying customer's product.
However, if your paying customer doesn't care about an issue in your component, but their customers do care, you are liable for the issue, even though your paying customer isn't demanding an update - and this is transitive, so if your paying customer's product is a component, and I integrate that component into my product, my customers can demand a fix to your component from you directly, not just from me, or from your paying customer.
It's not terrible - it's setting up the same situation as exists for physical goods today; you are liable to everyone for issues with your component as integrated by your paying customer (and only if your customer's integration is done to an acceptable standard), but not for issues with your component when it's separated from your paying customer's product and used with something else.
And note that your fix to a security issue does not have to be acceptable to users who aren't paying; for example, if your paying customer's product only ever communicates over Unix sockets with your component, a fix to a security bug might be as simple as "completely remove IP support in this version". If that breaks my use case, well, that's my problem, because you fixed the security flaw that matters to the paying customer's product.
Posted Jun 6, 2025 23:13 UTC (Fri)
by dottedmag (subscriber, #18590)
[Link] (5 responses)
I am a maintainer of a small C library that parses ID3v1 tags (real case).
A local manufacturer of low-batch toys comes to me and asks me for CE mark paperwork so that they can include this library in their new toy. They expect to produce 100 devices, and the devices will play a fixed set of .mp3 files. We agree that the probability of a security issue in this device is low, and they pay me, say, €500 for CE mark paperwork for library version 1.0.
All is fine.
But at the same time a Google engineer incorporates the same version 1.0 of my library into Android, marks it as "covered by CE mark by manufacturer", and ships it to all the ODMs, who produce a billion headsets.
A vulnerability is discovered in this library. Google Android security team, security teams of 500 ODM manufacturers and 10 million security-conscious owners of headsets all come filling my inbox and demanding a security fix.
It might be a trivial security fix, but even handling that amount of incoming email will bankrupt me, and I guess I also have to answer it all?
Still not terrible? Any open source maintainer taking any amount of money is on the hook to support the whole world, and can basically be held at gunpoint by any large manufacturer who can threaten to incorporate the code into their product.
The alternatives are: 1) never take a cent of money; 2) tell the first potential customer to foot the bill for supporting the whole world.
Am I incorrect somewhere?
Posted Jun 7, 2025 10:29 UTC (Sat)
by johill (subscriber, #25196)
[Link] (1 responses)
Not really well-versed with this, but for all I've seen it is, as usual, the color of bits that matters, even if they're the same bits.
You'd never issue a CE mark to the general public to use for free for arbitrary purposes, that'd be silly. It's not even clear that you'd be _allowed_ to, as an "open-source software steward":
Given that the light-touch and tailor-made regulatory regime does not subject those acting as open-source software stewards to the same obligations as those acting as manufacturers under this Regulation, they should not be permitted to affix the CE marking to the products with digital elements whose development they support.
Although I guess the argument is that once you provided the mark, then you're no longer just an "open-source software steward" but actually on the hook.
If you just post code:
The sole act of hosting products with digital elements on open repositories, including through package managers or on collaboration platforms, does not in itself constitute the making available on the market of a product with digital elements.
And if you aren't making it available on the market it doesn't need/have a CE Mark.
When integrating components sourced from third parties in products with digital elements during the design and development phase, manufacturers should, in order to ensure that the products are designed, developed and produced in accordance with the essential cybersecurity requirements set out in this Regulation, exercise due diligence with regard to those components, including free and open-source software components that have not been made available on the market.
Now, that's not good for your local widget manufacturer, since they don't want to be on the hook for your software. So you have a contract with them and make available to them separately, as part of the contractual relationship, the same software bits with a different color. Now you're making it available on the market and need the CE mark, which implies maintenance, but that was the whole point of the contract. So the CE mark you issue in this case is for the specific integration into the local widget manufacturer's toy and part of your contractual relationship with them. It extends - to some extent - to their customers though, although they'd probably have a hard time figuring out your component is in there and, if they do, finding a way to repair it. Not your problem, though if they do get to all that then you might have to provide some security fixes that are actually applicable to this particular product.
As for Google doing whatever they do:
They only got the bits via your open source repository, which was never "made available on the market" (see above.) Now they're on the hook. Maybe they will send you email anyway, but there's /dev/null.
Now you could ask is all of that plausible?
The question I guess will come down to whether you can be both an "open-source software steward" and a "manufacturer", even when the text says
(13) ‘manufacturer’ means a natural or legal person who develops or manufactures products with digital elements or has products with digital elements designed, developed or manufactured, and markets them under its name or trademark, whether for payment, monetisation or free of charge;
(14) ‘open-source software steward’ means a legal person, other than a manufacturer, that has the purpose or objective of systematically providing support on a sustained basis for the development of specific products with digital elements, qualifying as free and open-source software and intended for commercial activities, and that ensures the viability of those products;
Worst case, you'd need a different legal person (company) to be the manufacturer, how it manufactures the thing is not all that interesting, so maybe it just pulls it from your personal (natural person) repository as the sole "manufacturing" step.
Anyway, that's just what I think, but given the level of discussion etc. I find it highly implausible that such a setup is or was intended to be prohibited.
Posted Jun 7, 2025 11:23 UTC (Sat)
by dottedmag (subscriber, #18590)
[Link]
Posted Jun 7, 2025 10:57 UTC (Sat)
by Wol (subscriber, #4433)
[Link]
Regardless of the meaning of "on the market", you SOLD one hundred copies of your widget, with 100 marks (one per toy), to the toy manufacturer.
Those are the only marks you have to worry about. As farnz said, despite the bits being the same, Google's widgets do not have your mark, and are therefore "not your problem".
Cheers,
Wol
Posted Jun 7, 2025 14:18 UTC (Sat)
by marcH (subscriber, #57642)
[Link]
Whatever the law says, that seems extreme and unrealistic.
- Google is likely to just go and fix the vulnerability itself to preserve the value of its brand.
- ODMs are more likely to first pressure the "bigger" fish with whom they already have a business relationship and contacts there, and who has more manpower and is more likely to get things done one way or the other.
- Good luck finding 10 million "security-conscious" users and good luck finding end users technical enough to understand the vulnerability is who is to blame. You could receive some email, granted. But not from 10 million people.
PS: do ODMs have a security team? ;-)
Posted Jun 9, 2025 9:28 UTC (Mon)
by farnz (subscriber, #17727)
[Link]
You did not make your library available on the market with a CE mark; you sold a CE-marked version to your low-batch toy manufacturer, for integration into that toy only (and your lawyer worked with you on the contract of sale to ensure that this version has restrictions that protect you).
At that point, I can demand a security fix for the toy. I can't demand a security fix for other uses of the library; the CRA doesn't extend that far. If the toy is not exploitable, no liability for a fix. If the toy is exploitable, and you fix the toy use case, but not the more general case, no liability for a fix.
The fact that it's also been put in a huge number of phones is irrelevant to legal liability - they got the open source version, and liability ends with the entity that placed your library on the market (possibly Google, possibly the ODMs, possibly even the retailers), and they've got to find a way to negotiate with you that works for you as well as for them. And that applies even if their version is bit-for-bit identical to the CE marked version; it's the provenance that matters for liability, not the code itself.
Posted Jun 9, 2025 9:58 UTC (Mon)
by paulj (subscriber, #341)
[Link]
This is all part of a thing where the council doesn't like there being these quite visible homeless aid operations on central streets in Dublin, and so they're using every law and bylaw they can to try to stop it - street vending laws, food safety, etc. But there can very easily be unintended consequences to broad sweeping "consumer protection" laws that put significant burdens on every little person trying to do stuff. And it all contributes to a socio-economic environment that ever more favours large corporates - big enough to be able to amortise the cost of managing red-tape and bureaucracy over a larger number of operational activities - over individuals and small businesses.
Posted Jun 6, 2025 16:37 UTC (Fri)
by Wol (subscriber, #4433)
[Link]
No! It's just like copying a physical good. If you copy someone else's physical good, and you copy their trademark/other marks too, then that's fraud. There's nothing stopping you copying their goods (well, there may be), but pretending they made that copy when they didn't is a serious offence. There's nothing stopping someone copying your software, but likewise copying your marks as well is a serious offence. The marks are only legal when they're attached to the original article, AND NOT UNAUTHORISED COPIES.
(Just because a copy is unauthorised, doesn't necessarily mean it's illegal. Just that the owner of the original didn't give you permission to make exact copies. And it doesn't fall foul of the GPL because you haven't lost any of your copyright rights, or your FSF freedoms. And the GPL obliges you to remove marks if so required.)
Cheers,
Wol
Sharing bug fixes...
Posted Jun 6, 2025 13:22 UTC (Fri)
by hailfinger (subscriber, #76962)
[Link] (6 responses)
That is incorrect. Quoting the CRA chapter II article 13 number 6:
"Manufacturers shall, upon identifying a vulnerability in a component, including in an open source-component, which is integrated in the product with digital elements report the vulnerability to the person or entity manufacturing or maintaining the component, and address and remediate the vulnerability in accordance with the vulnerability handling requirements set out in Part II of Annex I. Where manufacturers have developed a software or hardware modification to address the vulnerability in that component, they shall share the relevant code or documentation with the person or entity manufacturing or maintaining the component, where appropriate in a machine-readable format."
Posted Jun 6, 2025 14:45 UTC (Fri)
by daroc (editor, #160859)
[Link] (1 responses)
Posted Jun 6, 2025 15:12 UTC (Fri)
by nim-nim (subscriber, #34454)
[Link]
However, not passing the fix upstream is akin to a legal admission that *you* are maintaining (and liable for) your own fork. And you are liable for any problematic rebasing (you can not both deny an upstream is maintaining something, and disclaim your responsibility when merging the changes done by this upstream).
Posted Jun 6, 2025 15:51 UTC (Fri)
by marcH (subscriber, #57642)
[Link] (3 responses)
Posted Jun 6, 2025 15:55 UTC (Fri)
by marcH (subscriber, #57642)
[Link] (2 responses)
Posted Jun 13, 2025 17:06 UTC (Fri)
by naesten (subscriber, #71199)
[Link] (1 responses)
Posted Jun 13, 2025 18:06 UTC (Fri)
by daroc (editor, #160859)
[Link]
Whom to ask?
Posted Jun 6, 2025 15:22 UTC (Fri)
by dottedmag (subscriber, #18590)
[Link] (13 responses)
Posted Jun 6, 2025 16:24 UTC (Fri)
by hailfinger (subscriber, #76962)
[Link] (12 responses)
The European Commission was at FOSDEM 2023, 2024 and 2025, participated in panel discussions, held excellent talks and answered questions from the audience. Those talks had FOSDEM attendees as target audience, and the speakers excelled at presenting the topics at hand in a way that could be easily understood by a technical audience.
If you're interested in the interaction between CRA, PLD and F/OSS, I highly recommend listening to the recordings of the various FOSDEM CRA talks in 2023 (some of those statements may be outdated), 2024 and 2025. A really good starting point is https://archive.fosdem.org/2024/schedule/event/fosdem-202... . Please make sure to listen to the video and not just read the slides, the interesting content is what is being said.
IMHO the CRA is a really well-written law which goes to great lengths to shield hobbyist F/OSS developers from responsibilities and instead places those obligations on the companies earning money with code they didn't write. I think that's entirely fair. Oh, and the law also is pretty easy to read and understand, so each time someone tells you "CRA is bad for open source / consumers / whatever", you can challenge people to back up that claim with a quote from the law. So far, in personal discussions I have watched all of those fearmongering claims collapse.
@daroc maybe LWN.net can cover the FOSDEM CRA talks. They might be interesting and relevant for the LWN.net audience even if those talks are a few months old.
Posted Jun 6, 2025 17:33 UTC (Fri)
by daroc (editor, #160859)
[Link]
Posted Jun 6, 2025 20:47 UTC (Fri)
by pbonzini (subscriber, #60935)
[Link] (1 responses)
Posted Jun 6, 2025 20:54 UTC (Fri)
by johill (subscriber, #25196)
[Link]
"Mr Benjamin Bögel is Head of Sector for Product Security and Certification Policy at the European Commission."
Posted Jun 9, 2025 7:07 UTC (Mon)
by iabervon (subscriber, #722)
[Link] (8 responses)
So another reason to ask for a quote from the law is that they may actually have a quote from what didn't end up becoming the law, and be glad to find out that their concerns were actually addressed since they last looked into it.
Posted Jun 9, 2025 8:07 UTC (Mon)
by kleptog (subscriber, #1183)
[Link] (3 responses)
You should remember that it's largely written by non-lawyers. Only a third of MEPs have legal training. For most, English is a second language. They're not going to be making complicated phrases with difficult meanings. There are a few terms of art like "putting on the market", but by and large it means what it says in plain English.
There's actually a running debate about whether it's a problem having laws written by non-lawyers. Some countries like NL require it to be written by people with specific training and it leads to compact and concise though tricky-to-read laws. I think the EU approach isn't too bad, especially since the specific text isn't important, as long as the intent is clear.
[1] https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX%3A32024R2847
Posted Jun 9, 2025 8:15 UTC (Mon)
by Cyberax (✭ supporter ✭, #52523)
[Link]
If you want another great example, check these documents that outline the repairability scoring criteria: https://susproc.jrc.ec.europa.eu/product-bureau/product-g...
Posted Jun 9, 2025 21:59 UTC (Mon)
by iabervon (subscriber, #722)
[Link]
[1] https://eur-lex.europa.eu/resource.html?uri=cellar:864f472b-34e9-11ed-9c68-01aa75ed71a1.0001.02/DOC_1&format=PDF
Posted Jun 10, 2025 8:40 UTC (Tue)
by Wol (subscriber, #4433)
[Link]
Bear in mind they also have a very good translation unit. Dunno about today, but you can thank my Granddad for its tradition of excellence. He was the head of the Directorate in the EC days, and the stories are legion about his ability as a linguist and his insistence on "getting it right". He retired months after Britain joined and it went from 9 to 12.
Cheers,
Wol
Posted Jun 11, 2025 15:35 UTC (Wed)
by raven667 (subscriber, #5198)
[Link] (1 responses)
> It's a little hard to tell whether it was always supposed to not apply to hobbyist developers and they just made this more explicit, or they hadn't considered it at all
These laws appear to be written in good faith, unlike a lot of US law written by some industry group for private advantage sneaking a fast one past the other legislators, and the people who write them _aren't_ _any_ _smarter_ than anyone else; they may have specific experience and training in the law but they are not omniscient gods. Kind of like the tendency to anthropomorphize LLMs, even well-meaning savvy people tend toward an underlying assumption that legislators and people at the top of government know vastly more than they do and are making some sort of 9-dimensional chess moves that we can barely understand or interpret without the help of a priesthood of media pundits, when the fact is they probably just didn't think of it. In this case the draft law was put together with what they knew and had experience in, and they relied on good-faith feedback to, loudly, tell them all the detail they missed, all the effects they forgot when drafting. Does anyone really think that a human person could mentally keep track of all the effects and second-order consequences of even a 5-10 page law? It takes teams of people and the public reviewing it from their subject-matter-expert perspective to find the bad feedback loops and holes, to shave off the worst of the potential negative consequences when changing society's rules.
Posted Jun 12, 2025 7:10 UTC (Thu)
by Wol (subscriber, #4433)
[Link]
The cleverest person is usually the person who can recognise they are out of their depth, and asks for help. Which is why women tend to make good GPs (as someone who interacts with the medical fraternity far too much ...)
Cheers,
Wol
Posted Jun 11, 2025 17:06 UTC (Wed)
by farnz (subscriber, #17727)
[Link] (1 responses)
> It's a little hard to tell whether it was always supposed to not apply to hobbyist developers
Even the very first draft I can find consistently refers to "placing on the market", which is a term of art that would have excluded hobbyist developers, since a hobbyist, by definition, does not place anything on the market (at most, they offer gifts to interested parties).
I'd therefore expect that it was never intended to apply to hobbyists, and the clarification we've seen is because "placing on the market" is a term of art that most of us aren't familiar with.
This isn't helped by the CRA trying to carefully balance allowing companies to contribute - or even run - open source projects without opening themselves up to liability, while not wanting companies to be able to escape liability for security flaws in products they sell by open sourcing some, or all, of the code (or, indeed, using ancient versions of open source stuff that's full of known flaws).
Posted Jun 12, 2025 15:33 UTC (Thu)
by kleptog (subscriber, #1183)
[Link]
> (10) In order not to hamper innovation or research, free and open-source software developed or supplied outside the course of a commercial activity should not be covered by this Regulation. [...]
This is a fairly clear and straightforward statement. The intent was clear, though it was the only reference to open-source. Now, plenty of people pointed out that "commercial activity" could do with some clarification since we want open-source developers to eat too and not everything involving money is commercial. The draft attracted hundreds of amendments and unlike in some legal systems where there's a speaker who chooses which amendments get voted on, in the EP they vote on all of them. The result is many more recitals and clarifications which I think are real improvements. Possibly even overkill, but this is lawmaking by non-lawyers for you.
The "placed on the (single) market" is a term of art which is related directly to the authority the EU has to make regulations in the first place. I hope this whole process has given people a little better understanding as to what it means.
Enforced signed firmware?
Posted Jun 14, 2025 16:49 UTC (Sat)
by milek7 (subscriber, #141321)
[Link] (3 responses)
That sounds bad for firmware freedom. Is this going to be another excuse for companies to enforce signed firmware (like FCC certification requirements)?
Posted Jun 15, 2025 12:24 UTC (Sun)
by pizza (subscriber, #46)
[Link] (2 responses)
Abso-effing-lutely.
I wonder how this will play out with respect to right-to-repair rules. From what I can tell it effectively neuters any possibility of the latter with regards to firmware.
Posted Jun 15, 2025 21:20 UTC (Sun)
by kleptog (subscriber, #1183)
[Link] (1 responses)
It's basically like that for PCs. You can boot alternate firmware, it just requires physical interaction. As long as malware can't transparently insert itself into the boot process you're fine.
Posted Jun 15, 2025 22:11 UTC (Sun)
by pizza (subscriber, #46)
[Link]
The systems I am currently working on (in the automotive space) go well beyond that -- "secure boot" is the *only* way to boot, reinforced by numerous laws and regulations that (1) make it a literal felony to break those locks and (2) "strongly disincentivize" manufacturers from providing any mechanism that allows mere "owners" to modify (or even *view*) anything of substance.
Welcome to the future.