
The European Cyber Resilience Act

Posted Sep 21, 2023 2:25 UTC (Thu) by wtarreau (subscriber, #51152)
In reply to: The European Cyber Resilience Act by kleptog
Parent article: The European Cyber Resilience Act

> Notifications only apply to security issues found in deployed products, so not every bug needs notification.

The problem is that it's not the developer's job to try to figure out whether a bug may have security ramifications or not. The developer makes a bug and fixes a bug.

> If you want to reduce the impact of this Act, then it would be a good idea to stop assigning CVEs for issues that aren't important.

That's one of the likely outcomes: everyone strongly refuses to assign any CVE, on the grounds that "no, it's not a security problem, at worst it's a misuse", if only to try to escape the CRA.



The European Cyber Resilience Act

Posted Sep 21, 2023 11:35 UTC (Thu) by kleptog (subscriber, #1183)

> The problem is that it's not the developer's job to try to figure out whether a bug may have security ramifications or not. The developer makes a bug and fixes a bug.

Correct, which is why notifications are only required for vulnerabilities that are known to be actively exploited. Potential vulnerabilities are of no interest here because, as you point out, you just need to push the bug fix out to people in a reasonable time frame and spend no more time on it. The goal here is to prevent suppliers from sitting on information about actively exploited vulnerabilities. You can voluntarily notify about other things your customers may be interested in. I don't see how this is a big change compared to what people are doing now.

The European Cyber Resilience Act

Posted Sep 22, 2023 5:10 UTC (Fri) by ebee_matteo (subscriber, #165284)

> The problem is that it's not the developer's job to try to figure out whether a bug may have security ramifications or not.

I strongly disagree and I think this is at the root of the problem.

The CRA was written *exactly* because we see a severe uptick in exploitable vulnerabilities, driven by developers not caring enough about security as part of project development.

In turn, these are widespread enough that they are very juicy targets for a state-actor threat or a rogue terrorist.

The spirit of the CRA is: you *need* to care about this. If you do not directly have the resources as a hobbyist, at least the companies commercializing those products need to allocate proper resources and FTEs or face the consequences.

The free ride has ended.

You do not like this? Then keep unreviewed software out of commercial products that can have repercussions on the lives of millions of people.

Unfortunately, it is currently very visible to people in the cybersecurity area how open source is becoming an easy target for a plethora of attacks, especially socially engineered and supply-chain related ones.

Big companies certainly have resources to foot the bill for code reviews and the paperwork needed.

Now, the language in the CRA needs fixing, I agree. But the spirit is correct.

The European Cyber Resilience Act

Posted Sep 22, 2023 21:00 UTC (Fri) by Wol (subscriber, #4433)

>> The problem is that it's not the developer's job to try to figure out whether a bug may have security ramifications or not.

> I strongly disagree and I think this is at the root of the problem.

I'd put it rather differently, but yes, it's because developers don't do their job properly.

> The CRA is written *exactly* because we see a severe uptick in exploitable vulnerabilities because developers do not care enough about security as part of project development.

> In turn these are widespread enough that they are very juicy for a state-actor threat or rogue terrorist.

> The spirit of the CRA is: you *need* to care about this. If you do not directly have the resources as an hobbyist, at least companies commercializing those products need to allocate proper resources and FTEs or face the consequences.

If you cared about doing a good job, a lot of this grief would just go away. What's the quote? "If they built buildings like we build software, the first woodpecker to come along would destroy civilisation"? Most software is held together with duct tape, string, and sealing wax.

I'm very much with Linus here - "a bug is a bug is a bug. It should be fixed". Security considerations are secondary. But as I see it, there are two problems ... and I'm with him on his other quote too - "the best programmers are lazy programmers, they get it right first time because they can't be bothered to do it twice".

Firstly, most software today is not designed. It's thrown together, it works, and that's that. Then it gets extended and extended and the parts don't work together. Etc etc.

The second thing is, a heck of a lot of our tools suffer from the same problem. C is a massive offender, with all its undefined behaviour. Landmines everywhere. I've just been fighting conditional formatting with VBA. Badly documented, things blowing up when you think they should work, things that only make sense AFTER you've debugged the problem, ...
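
To make the "landmines" point concrete, here's a minimal, hypothetical C sketch (made-up code, not from any real project) of the sort of thing I mean:

#include <stdio.h>
#include <string.h>

void copy_field(char *dst, const char *src, int len, int off)
{
    char buf[64];

    /* Intended as an overflow check, but signed overflow is undefined
     * behaviour, so the compiler may assume it never happens and
     * silently delete this branch. */
    if (len + off < len) {
        fprintf(stderr, "bogus length\n");
        return;
    }
    if (len + off > (int)sizeof(buf)) {
        fprintf(stderr, "too long\n");
        return;
    }
    memcpy(buf + off, src, (size_t)len); /* can overflow buf once the check is gone */
    dst[0] = buf[0];
}

The code reads as if it protects the copy, but the "protection" sits on top of undefined behaviour, which is exactly the kind of landmine I mean.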

Again, what's that saying? "Ten minutes extra in the design phase knocks an hour off the debugging". I'm probably loved and hated simultaneously at work, because I spend so much time fixing technical debt, even in the middle of a firefight.

That's why I hate Word. That's why I hate Relational/SQL. I've worked with programs that have a clear, simple design. Things "just work". Everything I do, I try to step back and have a clear design behind it. Even if I do a half-baked implementation, so long as the design is simple, clear, AND WELL COMMENTED IN THE CODE, things are far less likely to break. If somebody tries to do something I haven't implemented, they should crash into an error message that says "not implemented, please file a bug report". They shouldn't crash into an undefined, unanticipated state that causes all sorts of grief.

How much effort is it to check a variable that says "these are the states I know about and can handle. Anything else, raise an error"? Okay, if the previous guy didn't do it, you're probably in for a world of hurt. But if all your code does it, you're not going to be responsible for some obscure security problem because you didn't do your job (if that other guy's code drops you in it, well, sorry ...)
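
A minimal sketch of what I mean (hypothetical state names, in C):

#include <stdio.h>
#include <stdlib.h>

enum conn_state { CONN_IDLE, CONN_CONNECTING, CONN_READY, CONN_CLOSING };

/* Handle only the states we know about; anything else is a bug in the
 * caller and gets a loud "not implemented" error instead of silently
 * dropping into some undefined, unanticipated state. */
static void handle_state(enum conn_state s)
{
    switch (s) {
    case CONN_IDLE:
    case CONN_CONNECTING:
    case CONN_READY:
    case CONN_CLOSING:
        /* ... the work we actually know how to do ... */
        break;
    default:
        fprintf(stderr, "state %d not implemented, please file a bug report\n", (int)s);
        abort();
    }
}

It's a handful of lines per function, and it turns "obscure security problem" into "noisy bug report".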

Cheers,
Wol

The European Cyber Resilience Act

Posted Sep 25, 2023 2:33 UTC (Mon) by wtarreau (subscriber, #51152)

> I'm very much with Linus here - "a bug is a bug is a bug. It should be fixed". Security considerations are secondary. But as I see it, there are two problems ... and I'm with him on his other quote too - "the best programmers are lazy programmers, they get it right first time because they can't be bothered to do it twice".
> Firstly, most software today is not designed. It's thrown together, it works, and that's that. Then it gets extended and extended and the parts don't work together. Etc etc.

That's exactly my point. Figuring out how a bug might be used to participate in an exploitation chain requires a different mindset and set of skills. Many developers will not see how a missing line feed at the end of an error message might constitute a vulnerability; to them it's just a cosmetic error, but some upper-layer wrapper might rely on that specific delimiter and get confused enough to be fooled. That's what I mean by "it's not the developer's job". The developer cannot know *all* possible use cases and integrations of their creation. Of course, writing a blatant buffer overflow does have immediately visible security implications, but a lot of security issues nowadays stem from a combination of small escalations, sometimes from different components.
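
A contrived illustration (made-up code, not from any real component) of what I mean: an upper-layer wrapper that splits the lower layer's output on line feeds and trusts that one line equals one message.

#include <stdio.h>
#include <string.h>

/* The wrapper assumes every message ends with '\n'.  If the lower layer
 * "cosmetically" forgets the trailing line feed on an error message, the
 * next message gets glued onto the attacker-influenced error text. */
static void handle_messages(const char *stream)
{
    char buf[256];
    const char *p = stream, *nl;

    while ((nl = strchr(p, '\n')) != NULL) {
        size_t len = (size_t)(nl - p);
        if (len >= sizeof(buf))
            len = sizeof(buf) - 1;
        memcpy(buf, p, len);
        buf[len] = '\0';
        printf("one message: [%s]\n", buf);
        p = nl + 1;
    }
}

int main(void)
{
    /* The error message lacks its trailing '\n', so the trusted
     * "OK user=admin" record is swallowed into the error text. */
    handle_messages("ERROR bad input from guest"
                    "OK user=admin\n");
    return 0;
}

With the line feed present, the two records are parsed separately; without it, the boundary between untrusted and trusted data disappears. Neither piece of code is "wrong" on its own, which is the whole problem.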

The European Cyber Resilience Act

Posted Sep 25, 2023 7:32 UTC (Mon) by Wol (subscriber, #4433)

> That's exactly my point. Figuring out how a bug might be used to participate in an exploitation chain requires a different mindset and set of skills. Many developers will not see how a missing line feed at the end of an error message might constitute a vulnerability; to them it's just a cosmetic error, but some upper-layer wrapper might rely on that specific delimiter and get confused enough to be fooled.

But this is what really pisses me off (not only with developers, but ...)

If the developer is not knowledgeable enough (I carefully didn't say "skilled") to know that there is SUPPOSED to be a line feed, this is a failure on oh so many levels. A badly designed tool (the compiler?), an inappropriate language (okay that's often management), a developer satisfied with "oh it appears to work", an analyst who didn't analyse the job properly ... the list goes on.

At the end of the day we should be aiming to do the best job we can, and that does NOT mean patching broken tactics on top of broken tactics. Banging on about Pick again, but it's noticeable with Pick that good software tends to be implemented by teams, with domain experts specifying the problem and helping in the programming, while the computer experts help in the specification and make sure the programming is done properly.

At the end of the day, far too many problems are caused by "ivory tower syndrome" - epitomised by the typical split of computing departments into analysts and programmers, where the analysts understand neither the subject nor programming, and the programmers implement the spec they're given. A recipe for disaster. I hope most of us don't work in those sorts of environments, but it's those embedded attitudes that are responsible for a lot of the problem. I guess that can be summed up as "bureaucracy" :-)

Cheers,
Wol

The European Cyber Resilience Act

Posted Sep 25, 2023 10:17 UTC (Mon) by farnz (subscriber, #17727)

Right, and nothing about the CRA stops you from saying that all commits (not just all bugfixes, but all commits, whether they're known to be bugfixes or not) are security-relevant; if you do this, then your downstreams have to either keep up with development or accept the risk of being liable for bugs in your software if you've fixed one and they haven't taken that patch. The whole thing only becomes an issue if you're (a) providing software in a commercial context, (b) not wanting to force your customers to upgrade all the time, and (c) not wanting to be liable for security-relevant bugs in the software; at this point, the CRA forces you to choose which of those three you drop.

