
EU CRA (Cyber Resilience Act)

Posted Jan 2, 2026 21:21 UTC (Fri) by hailfinger (subscriber, #76962)
Parent article: Kroah-Hartman: Linux kernel security work

The CRA reference in Greg's document is not applicable to community FOSS projects unless they happen to be part of a larger organization ("steward" in CRA-speak) or steered by a company.

I know that various foundations have employed FUD tactics to get small FOSS projects to join them due to the perceived CRA threats. Unless you're a FOSS project the size of Debian, Mozilla or the Linux kernel, joining a larger organization is completely stupid from a CRA perspective.
Why, you ask? Simple. If you're a small FOSS project and don't make a profit from it, you're exempt from the obligations of the CRA (but you still get all the benefits from the CRA). Join a larger org (foundation, whatever) and suddenly you're subject to the CRA.
Now if you enjoy being regulated, feel free to join some of the foundations. Otherwise, steer clear of them.



EU CRA (Cyber Resilience Act)

Posted Jan 2, 2026 21:34 UTC (Fri) by Wol (subscriber, #4433) [Link] (10 responses)

Stewards aren't affected by the CRA either! READ THE CRA.

Companies have contracts with suppliers. In order for a component to be legal as part of a product, IT MUST HAVE A CRA MARK. And in order to have a CRA mark, there MUST be contracts in place to say who is legally liable.

So as a company, you either pay the foundation, or the project, or the maintainer, for a support contract that includes a CRA mark, or you provide your own CRA mark.

Simply put, without a contract in place the CRA can't touch you. It will, however, clobber any company that uses your product thinking they can offload the responsibility to you at no cost to themselves. That's by design ...

Cheers,
Wol

EU CRA (Cyber Resilience Act)

Posted Jan 2, 2026 22:05 UTC (Fri) by hailfinger (subscriber, #76962) [Link] (9 responses)

> Stewards aren't affected by the CRA either! READ THE CRA.
I did read the CRA. I even commented on it before it became law, and some of my suggestions ended up in the CRA. So yes, I think I can claim that I did read it.

And yes, despite your claims, stewards are affected by the CRA.
Please look at the official text of the CRA, Chapter II, Article 24: "Obligations of open-source software stewards".
https://eur-lex.europa.eu/legal-content/EN/TXT/HTML/?uri=...

> [...] IT MUST HAVE A CRA MARK
There is no such thing as a CRA mark. Sorry.

EU CRA (Cyber Resilience Act)

Posted Jan 3, 2026 9:31 UTC (Sat) by kleptog (subscriber, #1183) [Link] (4 responses)

I really hope we can avoid another massive CRA discussion here. This horse has been thoroughly beaten to death. Especially since the whole discussion tends to get mired in the fundamentally different way Europe and the US regulate markets. Europe tends to write out a description of how they want the market to work and then rely on the market participants to keep each other honest. The US tends to be more prescriptive and sets the government up as judge, jury and executioner, and the market participants spend their time trying to figure out how to avoid scrutiny.

Case in point: open source stewards. Who decides who is one? No-one. Don't give yourself that label and you're all set. That text is largely aspirational, it describes how the EP expects open source to work within the CRA framework. Its value is largely in making clear to the CxOs that all those people doing unpaid open-source work are not responsible for their bonuses. For extra clarity, Article 64.10b says stewards can't be fined, even if you could figure out who they are.

European law is largely written by people with no legal training, which means you get a lot of text of dubious legal value. A significant chunk of the Parliament and Council are very careful to make sure the EU (and governments in general) don't actually gain any powers that aren't strictly required. Concentrated executive power (we know from experience) is very risky. Read any contracts before you sign them, that's the most important lesson.

But I can say from personal experience that the CRA has also made it much easier for me to get time for security and software maintenance. Suddenly SBOMs are on the agenda, after years of being summarily dismissed.
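[For readers who haven't met them: an SBOM is just a machine-readable inventory of the components shipped in a piece of software. A minimal sketch in the CycloneDX JSON style follows; the component names and versions are made up for illustration, not taken from any real product.]

```python
import json

# A minimal software bill of materials in the CycloneDX JSON style.
# The component entries below are hypothetical examples.
sbom = {
    "bomFormat": "CycloneDX",
    "specVersion": "1.5",
    "version": 1,
    "components": [
        {"type": "library", "name": "openssl", "version": "3.0.13"},
        {"type": "library", "name": "zlib", "version": "1.3.1"},
    ],
}

# Serialize for exchange with downstream consumers or auditors.
print(json.dumps(sbom, indent=2))
```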

EU CRA (Cyber Resilience Act)

Posted Jan 5, 2026 15:12 UTC (Mon) by poruid (subscriber, #15924) [Link] (3 responses)

Excuse me, but stating that EU law is written by legal nitwits is baseless.

Besides that, the CRA, as far as FOSS is concerned, has been modelled to ensure that downstream free riders MUST upstream the fixes that the CRA obliges them to provide. That is good, very good.

EU CRA (Cyber Resilience Act)

Posted Jan 5, 2026 16:09 UTC (Mon) by Wol (subscriber, #4433) [Link] (2 responses)

And while it's a hobby horse of mine, assuming that a lawyer understands the law - even in his own specialty! - is a dangerous thing to do. As a doctor mate of mine once said - "50% of people are below average intelligence, so what does that say about doctors?". The same could be said about lawyers - and once you assume that all your hot-shots are in high powered jobs, what does that say about your average high street doctor or lawyer!!!

Mix a high intelligence, and a good grasp of the detail, and you don't need to be a professional to outperform your typical high-street practitioner.

Mind you, you also need my attitude to electrics - as an okay amateur sparky, my FIRST response to any job is "if I don't understand it, get a professional involved". You can then assess the professional's competence ...

Cheers,
Wol

EU CRA (Cyber Resilience Act)

Posted Jan 5, 2026 16:39 UTC (Mon) by mathstuf (subscriber, #69389) [Link] (1 responses)

> As a doctor mate of mine once said - "50% of people are below average intelligence, so what does that say about doctors?".

My gripe with this is that they're actually talking about the median. It also assumes doctors are uniformly sampled across the "intelligence spectrum" (as measured by, presumably, IQ) and all the faults with that.

I don't think anyone would say "50% of people at IAS in the 1930's and 1940's were below average intelligence" without also having a footnote of "but all are sampled from the top 1% of the world population". And the internal rankings would certainly differ from those on the outside (cf. Einstein ranking Gödel above himself, IIRC).

EU CRA (Cyber Resilience Act)

Posted Jan 5, 2026 17:37 UTC (Mon) by Wol (subscriber, #4433) [Link]

I don't know that intelligence *should* be a factor in sampling people to become doctors (other than filtering out the subnormal). Certainly in my case intelligence seems to have been a DEselection criterion. I got grades A, B, B; your typical offer iirc was about C, D, D, and your successful candidate typically achieved B, C, C. Certainly I think only ONE of my successful fellow candidates from school got a higher grade.

Absent reliable information one way or the other, I'm happy to assume reasonably random selection (although of course I've missed the fact that school leavers rig the criteria somewhat). That said, so do English Public Schools in the opposite direction, where a lot of people who get good grades come over as "rich and stupid".

However you play it though, my experience of legal work leaves me inclined to put the two letters "in" in front of your average lawyer's competence.

Cheers,
Wol

EU CRA (Cyber Resilience Act)

Posted Jan 3, 2026 10:04 UTC (Sat) by lynxlynxlynx (guest, #90121) [Link]

Yes and no. Consider the definition of a steward:

> (14) ‘open-source software steward’ means a legal person, other than a manufacturer, that has the purpose or objective of systematically providing support on a sustained basis for the development of specific products with digital elements, qualifying as free and open-source software and intended for commercial activities, and that ensures the viability of those products;

It's a long list of conditions that all have to be true for the article to come into play. For the vast majority of small projects using stewards to just host their secrets, a donation account and similar, the criteria won't be met.

EU CRA (Cyber Resilience Act)

Posted Jan 3, 2026 10:17 UTC (Sat) by Wol (subscriber, #4433) [Link] (2 responses)

> (14) ‘open-source software steward’ means a legal person, other than a manufacturer, that has the purpose or objective of systematically providing support on a sustained basis for the development of specific products with digital elements, qualifying as free and open-source software and intended for commercial activities, and that ensures the viability of those products;

Which pretty much describes a business that supports open source as its business ... like Red Hat, SUSE or Ubuntu.

And your reference basically says they have to have a security policy, which they have to provide to the regulator on request.

To my mind, it's not at all clear what the liabilities of a steward are, beyond providing the said security policy.

> (24) 3. The obligations laid down in Article 14(1) shall apply to open-source software stewards to the extent that they are involved in the development of the products with digital elements. The obligations laid down in Article 14(3) and (8) shall apply to open-source software stewards to the extent that severe incidents having an impact on the security of products with digital elements affect network and information systems provided by the open-source software stewards for the development of such products.

So what happens if the Open Source Steward provides support services but not development resources for Open Source Projects? Take Apache, for example: it doesn't look as if they're affected by this ... from what little I know, Apache projects all have their own teams, and Apache just provides an umbrella - which this clause appears NOT to catch.

I'll need to read it again, but I don't see what difference a project joining a foundation would make to the project. It might make a difference to the foundation ...

> There is no such thing as a CRA mark. Sorry.

My bad - but there must be something: some sort of B2B agreement in place, which I've taken to calling a CRA mark. The CRA will not look kindly on a business - selling software as part of its product - that does not have a formal agreement in place for its maintenance.

I'll read the CRA again ...

Cheers,
Wol

EU CRA (Cyber Resilience Act)

Posted Jan 3, 2026 12:16 UTC (Sat) by hailfinger (subscriber, #76962) [Link] (1 responses)

The CE mark is probably what you were looking for. With the CRA a CE mark is also required for "products with digital elements", i.e. (in very simplified terms) anything which can run code or can interact with code or is code.

EU CRA (Cyber Resilience Act)

Posted Jan 3, 2026 13:10 UTC (Sat) by Wol (subscriber, #4433) [Link]

That's where I took it from. In effect, what I call the CRA mark is a CE mark applied to software, no?

Cheers,
Wol

EU CRA (Cyber Resilience Act)

Posted Jan 3, 2026 22:46 UTC (Sat) by andrewaylett (guest, #181673) [Link] (1 responses)

Is it not more that the maintainers' employers may have responsibilities under the CRA, and the intent is that reporting to the security team does not by itself trigger any obligations or deadlines for the employers?

EU CRA (Cyber Resilience Act)

Posted Jan 4, 2026 10:27 UTC (Sun) by Wol (subscriber, #4433) [Link]

How?

Either the security problem exists within an employer's product "placed on the market", in which case an employee reporting a problem DOES trigger obligations - the employee knows, therefore - as far as the law is concerned - the manufacturer/importer/whoever also knows; or

The employer is not connected with the product at risk, and unless they are the steward for the software in question (not the case if the employee is either (a) working in his own time, or (b) working using time donated by the employer for projects of the employee's choice) the employer has no obligations at all. If they are the steward, and their employees work on the project, then they have to follow their declared guidelines.

Since this particular discussion started, it seems to me that the "steward" wording is aimed clearly at people like Google, Red Hat, Oracle, et al - the commercial distros and support-sellers.

Let's take Google: they provide a lot of services, under service contracts, to other companies. My company uses a lot of them ... One of those services, for example, is Gsheets. As far as our use of it is concerned, it's covered by the CRA because we have a support contract.

But the whole point of the steward wording is to say that if I as a private individual (as opposed to a company employee) make use of Gsheets for my own personal use, any bugs I find are reported to Google in their role of steward. AND NO LEGAL LIABILITY RESULTS. Google, however, are now aware of the bug, and are expected to follow their declared procedures. Failure to do that *probably* *will* trigger liability, because if you combine a known bug, declared procedures, *and* *support* *contracts*, the regulator will say "you should have fixed it under the support contracts". Coupled with the requirement under the CRA to report any bugs and fixes upstream, that means I as a private individual get to see the bug fixed, otherwise Google forfeit their legal protection as steward.

Cheers,
Wol

EU CRA (Cyber Resilience Act)

Posted Jan 6, 2026 15:02 UTC (Tue) by coriordan (guest, #7544) [Link]

> I know that various foundations have employed FUD tactics to get
> small FOSS projects to join them due to the perceived CRA threats.

No they haven't.

EU CRA (Cyber Resilience Act)

Posted Jan 6, 2026 15:57 UTC (Tue) by pizza (subscriber, #46) [Link] (8 responses)

> I know that various foundations have employed FUD tactics to get small FOSS projects to join them due to the perceived CRA threats.

Please keep in mind that nearly all of the "HOLY CRAP THIS IS BAD" reactions [1] were to the initial CRA drafts, which were every bit as bad as those early reactions claimed. The folks drafting the CRA listened to the feedback and did a very good job [2] addressing the concerns of F/OSS authors and maintainers, resulting in the final form of the CRA being vastly superior to what was first proposed.

Folks that only saw the final as-ratified version of the CRA will of course wonder what the big deal was, but as the saying goes, this is how sausage gets made. The final form looks pretty and clean but what goes into it is anything but.

[1] Quite a few of which were from EU-based entities
[2] I say that as a particularly loud critic of the initial draft

EU CRA (Cyber Resilience Act)

Posted Jan 6, 2026 18:30 UTC (Tue) by pizza (subscriber, #46) [Link] (7 responses)

> The folks drafting the CRA listened to the feedback and did a very good job [2] addressing the concerns of F/OSS authors and maintainers,

I should add that the legislative _intent_ of the CRA was not really known until later drafts came out that positively incorporated this feedback. Up until that point, we had no way of knowing that the aforementioned problems were not an *intentional* feature (or an unfortunately necessary side effect) of the legislation.

EU CRA (Cyber Resilience Act)

Posted Jan 7, 2026 17:25 UTC (Wed) by kleptog (subscriber, #1183) [Link]

> I should add that the legislative _intent_ of the CRA was not really known until later drafts came out that positively incorporated this feedback.

The legislative intent of legislation is determined by the Parliament/Council on the basis of the amendments that are approved. There is no gatekeeping, the Commission cannot control which way the legislation evolves.

If anything, the process demonstrated (to me anyway) that the Open source/Free software movement has a wide base of support and the process works as designed. Even the Council amendments were largely pro-open source.

We (collectively) have more influence than we realise IMHO.

EU CRA (Cyber Resilience Act)

Posted Jan 14, 2026 0:43 UTC (Wed) by SLi (subscriber, #53131) [Link] (5 responses)

> I should add that the legislative _intent_ of the CRA was not really known until later drafts came out that positively incorporated this feedback.

I've come to believe that legislative intent is a myth, a legal fiction that tries to come up with the least damaging theory for why the law ended up the way it did.

Basically, any time you have three or more people voting for something, this is in play. We have the intents of the three people, and the output of the process, which at best reflects a compromise found acceptable enough to let two of the three people say "I prefer that to no law; it does look a bit schizophrenic and I don't know what it means, but it seems low-risk".

By no means does this apply only to law.

EU CRA (Cyber Resilience Act)

Posted Jan 15, 2026 9:16 UTC (Thu) by taladar (subscriber, #68407) [Link] (4 responses)

If we gave up that fiction we would have to acknowledge that many laws are approved or rejected for reasons that are either not sane (e.g. biases) or self-serving (benefiting your particular voting district or campaign donor, or increasing your personal career chances as a politician). And at that point we would probably have to question whether it is sensible at all to have 1000-page laws voted on by people who first saw them the day before.

EU CRA (Cyber Resilience Act)

Posted Jan 15, 2026 10:04 UTC (Thu) by kleptog (subscriber, #1183) [Link] (3 responses)

> And at that point we probably would have to question whether it is sensible at all to just vote on 1000 page laws at all by people who have first seen those laws the day before.

But why do lawmakers accept that bills are that large? AIUI it's mostly a US phenomenon. In Australia for example it is required by the constitution that budget/tax/tariff bills must only deal with that single thing and cannot be combined with any other bills. ISTM they learned from the American experience. And elsewhere it's at least a cultural expectation that you don't combine bills covering unrelated topics.

All it would take is for a group of lawmakers to refuse to vote for such combination bills and the problem would go away. That this doesn't happen just shows there are deeper problems.

EU CRA (Cyber Resilience Act)

Posted Jan 15, 2026 12:08 UTC (Thu) by Wol (subscriber, #4433) [Link] (2 responses)

And this is why gutting the House of Lords is such a disaster. You had plenty of people who would go in to the House to debate bills, and they would take the effort to read them, and because they had ended up in the Lords by accident they would probably have had a lot of relevant experience.

The Commons hated the Lords precisely because so much legislation got voted down with "this isn't going to work", while the Commoners, with an eye on getting re-elected, just wanted it passed whether it made sense or not.

Cheers,
Wol

EU CRA (Cyber Resilience Act)

Posted Jan 15, 2026 15:32 UTC (Thu) by taladar (subscriber, #68407) [Link] (1 responses)

Maybe we need the equivalent of a linter for legislation, some purely rule-based rejection mechanism that is non-political precisely because it applies only purely factual objections, and those objections have to be validated for every bill?
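[As a toy illustration of what such a rule-based check could look like - the rules and the bill structure below are invented for the sketch, not drawn from any real legislative system:]

```python
# Toy "legislation linter": purely mechanical checks on a bill's metadata.
# Both the rule thresholds and the bill fields are hypothetical.

def lint_bill(bill):
    """Return a list of factual objections; an empty list means the bill passes."""
    objections = []
    # Single-subject rule, in the spirit of the Australian constitutional
    # requirement mentioned upthread.
    if len(bill["subjects"]) > 1:
        objections.append("combines unrelated subjects: " + ", ".join(bill["subjects"]))
    if bill["pages"] > 200:
        objections.append(f"too long to review ({bill['pages']} pages)")
    if bill["review_days"] < 14:
        objections.append(f"insufficient review period ({bill['review_days']} days)")
    return objections

# A hypothetical omnibus bill: every rule fires.
omnibus = {"subjects": ["budget", "telecoms", "agriculture"],
           "pages": 1000, "review_days": 1}
print(lint_bill(omnibus))
```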

EU CRA (Cyber Resilience Act)

Posted Jan 15, 2026 15:45 UTC (Thu) by daroc (editor, #160859) [Link]

There are a handful of interesting projects along those lines, actually. The most complete and useful is probably Catala: https://github.com/CatalaLang/catala


Copyright © 2026, Eklektix, Inc.
Comments and public postings are copyrighted by their creators.
Linux is a registered trademark of Linus Torvalds