Open-source software vs. the proposed Cyber Resilience Act (NLnet Labs)
We feel the current proposal misses a major opportunity. At a high level, the 'essential cybersecurity requirements' are not unreasonable, but the compliance overhead can range from tough to impossible for small or cash-strapped developers. The CRA could bring support to open-source developers maintaining the critical foundations of our digital society. But instead of introducing incentives for integrators or financial support via the CRA, the current proposal will overload small developers with compliance work.
Posted Nov 14, 2022 16:35 UTC (Mon)
by jmclnx (guest, #72456)
[Link] (2 responses)
To me, if it does pass as is, I can see development moving out of the EU, leaving the EU stuck with proprietary environments. Maybe the UK will be well positioned for this?
Posted Nov 14, 2022 17:36 UTC (Mon)
by dvrabel (subscriber, #9500)
[Link] (1 responses)
I have not read the regulations in detail but from a quick skim (particularly of Annex V and VI), the requirements are what I would expect a company producing such products to be doing anyway -- the requirements are pretty basic (design docs, threat assessment, test evidence etc.). If you're a company cobbling together a product from random open source components and claiming security in marketing material without any of the required engineering to ensure actual security, then, well, you're a company that deserves to fail.
Posted Nov 14, 2022 18:28 UTC (Mon)
by Tov (subscriber, #61080)
[Link]
Maybe the software industry will also produce better products, if challenged a bit to do so.
Posted Nov 14, 2022 17:31 UTC (Mon)
by kleptog (subscriber, #1183)
[Link] (13 responses)
It does propose an exemption for open-source software, but doesn't answer the question of who will do the work instead. We can make regulations that say something should happen, but it's all pointless unless someone actually does the work. We don't want to swamp start-ups in compliance work, but giving them a free pass to produce shitty products isn't really a good alternative either.
One approach would be to turn such "critical software" into a public good. The regular auditing of such products would be something undertaken or funded by a central authority. Or perhaps companies that do audits of open source software they use could actually publish the results. But that doesn't really help, because audits for complex software don't tend to find all the bugs. In general you're better off ensuring a good update mechanism than proving your software is bug-free.
Overall, I'm glad people are thinking about these issues, but I don't think it's going to be solved by regulation at this point. But you know, hammers and nails and all that.
Direct link to proposal: https://digital-strategy.ec.europa.eu/en/library/cyber-resilience-act
Posted Nov 15, 2022 7:41 UTC (Tue)
by eduperez (guest, #11232)
[Link] (12 responses)
According to the proposed legislation, the companies doing "commercial activities" with that software. In other words, if you buy a device or a license for a product using open-source software, then the company selling that device or license should have done the "conformity assessment".
Posted Nov 15, 2022 9:13 UTC (Tue)
by farnz (subscriber, #17727)
[Link] (6 responses)
And for now, the EU is leaving the details of how those companies do the conformity assessment wide open. Maybe we'll see each company assess the Linux kernel, OpenSSL, wpa_supplicant, hostapd, GStreamer etc separately. Maybe people will start buying commercial Linux distributions with conformity handled by the distro (SUSE, Red Hat, for example). Maybe we'll get consortiums doing the assessment for their members - join the "OpenWRT club" and you get access to the documentation you need to validate that a given release of OpenWRT is secure enough for this directive.
It feels to me like the EU has been hoping that the market would solve this; it has recognised that this is like pollution, where the cost to society of polluting is more than the polluter gains, but the direct market cost to the polluter is lower than their gain, and it is now hoping that a small prod will get industry to behave itself.
Posted Nov 15, 2022 13:37 UTC (Tue)
by Vipketsh (guest, #134480)
[Link] (5 responses)
That's always what the EU wants irrespective of the problem -- nothing new here.
This proposal is the same as all EU proposals: start with a real problem that reasonably needs a solution, then create the most bureaucratic system possible, maybe make people pay something that is essentially a tax with no benefits, and in the end not really solve anything. E.g. the recent removal of VAT-free imports under some value, the computer registration thingy, GDPR, or the infamous banana/cucumber curvature laws.
Posted Nov 15, 2022 14:24 UTC (Tue)
by farnz (subscriber, #17727)
[Link] (2 responses)
Given that you cite the banana/cucumber curvature laws, I suspect you've been getting your information from an anti-EU source.
The EU rules on fruit and vegetables say that there are at least two legally defined classes for all such foodstuffs: Class I and Class II.
The reason the laws became mocked in the UK is that a certain Boris Johnson (who had earlier been fired from The Times for fabricating a quote) noted that supermarkets insist on class I produce when buying cucumbers, bananas, carrots and many other produce items (while often being happy to buy class II tomatoes, among other produce), and that class II produce is perfectly fine to eat. He then claimed, falsely, that this was an EU-imposed rule - and yet the EU permits supermarkets to sell class II produce.
Posted Nov 15, 2022 15:45 UTC (Tue)
by HenrikH (subscriber, #31152)
[Link] (1 responses)
Posted Nov 15, 2022 17:04 UTC (Tue)
by farnz (subscriber, #17727)
[Link]
The UK certainly supported them, because (at the time they were enacted) different retailers had different contractual rules for what was acceptable, and if you tried to grow to (e.g.) Jumbo's standards, you'd end up with produce that didn't meet Carrefour or Tesco's standards. Supermarkets quickly adopted the EU's standards in their contracts with farmers since all produce has to be labelled correctly to be sold (so you knew any farmer would be able to sell you produce labelled as Class I or Class II), and the effort of maintaining your own standards and inspections was better spent elsewhere.
Posted Nov 15, 2022 14:47 UTC (Tue)
by eduperez (guest, #11232)
[Link] (1 responses)
Jokes aside, as someone who has worked on the impact of GDPR on a bank, I can assure you that GDPR is not "essentially a tax".
Posted Nov 15, 2022 15:37 UTC (Tue)
by Wol (subscriber, #4433)
[Link]
This then tends to clobber small businesses especially.
As someone responsible for a tiny charity, I simply insist on a paper trail. That way I should be able to provide proof, if required, at minimal cost to myself. The problem comes if you're big enough to attract the attention of the authorities, but not big enough to pay someone to do it.
Cheers,
Wol
Posted Nov 16, 2022 1:35 UTC (Wed)
by pabs (subscriber, #43278)
[Link] (4 responses)
Posted Nov 16, 2022 9:18 UTC (Wed)
by eduperez (guest, #11232)
[Link]
Posted Nov 16, 2022 12:11 UTC (Wed)
by farnz (subscriber, #17727)
[Link] (2 responses)
The notice of conformity applies to the device as sold, and obliges the supplier to meet the obligations set out in the notice of conformity. It does not extend beyond that.
If you're selling a device without software, you'd be crazy to have your notice of conformity cover any arbitrary software that's installed; instead, your notice would cover just the hardware, and you're only on the hook if the user can show that the hardware does not comply with the notice of conformity.
Posted Nov 17, 2022 4:25 UTC (Thu)
by pabs (subscriber, #43278)
[Link] (1 responses)
Posted Nov 17, 2022 11:36 UTC (Thu)
by farnz (subscriber, #17727)
[Link]
Yes, just as it incentivizes selling engines, wheels, fuel tanks, chassis, steering etc separately over selling cars, and giving consumers a choice of how they put their car together.
In practice, the consumer pressure to sell a complete car, with more obligations on the seller than car components have in aggregate (since you're also responsible now for the way the parts are put together, and for interaction between parts), results in companies not going down that route. I suspect that the same will be true of computers, phones etc - selling a complete device, and taking responsibility for the whole thing, is easier than selling hardware without an OS (but with lesser guarantees) plus a separate OS that makes guarantees on the assumption of the hardware functioning in certain ways, and is thus not responsible if the guarantees the hardware offers don't meet the OS's requirements to be secure.
Posted Nov 14, 2022 21:35 UTC (Mon)
by mat2 (guest, #100235)
[Link] (3 responses)
Posted Nov 14, 2022 22:07 UTC (Mon)
by mtaht (subscriber, #11087)
[Link] (1 responses)
Posted Nov 17, 2022 12:17 UTC (Thu)
by davecb (subscriber, #1574)
[Link]
That might actually be a good thing, and encourage vendors to contribute to funding an OpenWRT compliance certification, rather than trying to roll their own.
--dave (irrepressible optimist) c-b
Posted Nov 15, 2022 7:37 UTC (Tue)
by eduperez (guest, #11232)
[Link]
* Perform a "conformity assessment" on their own firmware.
* Perform a "conformity assessment" on OpenWrt.
The only difference is that many components of OpenWrt could (potentially) have been audited before.
Posted Nov 15, 2022 1:24 UTC (Tue)
by tialaramex (subscriber, #21167)
[Link] (7 responses)
The medium-term impact of EO 14028 was stuff like P2687R0, the first step of a proposal for C++26 describing this as an "Emergency", because of course the agencies said that if you want secure software you should stop writing C and C++ (this draft proposal was written by Bjarne Stroustrup, whose name might ring a bell).
I see P2687 and similar efforts as mostly an attempt to say "We're doing something" in the hope that politicians will quickly forget about this, the proposed work can then be abandoned, or at least limited to documentation which is then abandoned, and life goes on. It is certainly the case that there's lots of C and C++ Free Software. On the other hand though, there's also a LOT of Free Software written in languages which these reports say you should consider instead, such as Java and Python and most relevantly in this context (and to LWN) Rust.
So this has that crisis property where aspects of danger and opportunity combine. To the extent that Free Software prefers C to something safer like Java, there are risks that world governments will decide they value safety more heavily than before and choose non-free alternatives which don't have use-after-free bugs. On the other hand, to the extent Free Software communities embrace safety features more readily than big slow proprietary behemoths, there's a chance worthy Free Software solutions dominate inferior but widespread and commercially successful alternatives that can't make themselves safe enough, quickly enough.
It is unclear to me, and perhaps somebody with the right perspective can explain, why both the US and EU decided they want their computers to be secure specifically in the last 2-3 years but not say, in the 1990s or 2000s.
Posted Nov 15, 2022 5:11 UTC (Tue)
by NYKevin (subscriber, #129325)
[Link] (3 responses)
Less stuff was computerized in the 90s. Even in the 2000s, it was still a bit early. China only hacked Google in 2009, and Stuxnet was 2010. Before those events, cyberwarfare basically didn't exist as a discipline. That just leaves the 2010s to explain, but I think we can write that off as politicians ignoring the problem and hoping it would go away on its own (or more realistically, hoping private actors would solve it without the need for government intervention).
Posted Nov 15, 2022 16:21 UTC (Tue)
by deater (subscriber, #11746)
[Link] (2 responses)
You might want to read "The Cuckoo's Egg" by Cliff Stoll. Or read up on the 1982 Siberian pipeline explosion. Cyber-warfare appeared at more or less the same time computer networks appeared, if not earlier.
Posted Nov 15, 2022 17:25 UTC (Tue)
by NYKevin (subscriber, #129325)
[Link]
Posted Nov 15, 2022 20:46 UTC (Tue)
by Cyberax (✭ supporter ✭, #52523)
[Link]
It never happened. I worked in the Russian energy sector in the 2000s and I was curious about this story, so I tried to find more details. Nobody knew about it, though many people had heard the story. Later, people claimed that it hadn't happened, and it certainly was not the biggest non-nuclear explosion ( http://ogas.kiev.ua/perspective/vzryv-kotorogo-ne-bylo-581 ).
Back then, software was not an out-of-the-box thing that you could just install on hardware and let it control everything. Pipelines also have multiple safeties, and a leak wouldn't have resulted in a huge explosion, but rather in a huge fire.
Posted Nov 15, 2022 20:53 UTC (Tue)
by mathstuf (subscriber, #69389)
[Link]
I think there's more awareness that this is a dire problem that the C++ committee needs to tackle. I also hope it is not a transient thing, but it's hard to see the future.
Posted Nov 18, 2022 17:07 UTC (Fri)
by nim-nim (subscriber, #34454)
[Link] (1 responses)
That’s a direct result of the tensions in Eastern Europe since 2014. At first both the US and the EU hoped things would settle down, then they ordered audit after audit (are the Russians doing to us what they are doing to the Baltic states and Ukraine?), then eventually the PTB got fed up with ordering the same one-time emergency audit several times a year, and decided to strong-arm the private sector into being secure by default.
Posted Nov 18, 2022 17:12 UTC (Fri)
by nim-nim (subscriber, #34454)
[Link]
Posted Nov 15, 2022 4:37 UTC (Tue)
by coriordan (guest, #7544)
[Link] (1 responses)
Info session about the CRA: https://digital-strategy.ec.europa.eu/en/events/cyber-resilience-act-new-eu-cybersecurity-rules-ensure-safer-hardware-and-software
Posted Nov 18, 2022 10:20 UTC (Fri)
by Herve5 (guest, #115399)
[Link]
Posted Nov 15, 2022 8:18 UTC (Tue)
by jpfrancois (subscriber, #65948)
[Link] (20 responses)
We don't have the resources to assess the security properties of this "base image bundle". Is the buildroot project able to provide some kind of security assessment?
I guess the same question goes for openEmbedded-based images, or distro-based embedded systems.
Beyond looking at the software project and saying "looks like they do stable releases, should be OK for us", there is not much more I can do, given my limited skills :)
Posted Nov 15, 2022 10:35 UTC (Tue)
by Lennie (subscriber, #49641)
[Link] (1 responses)
Posted Nov 15, 2022 10:36 UTC (Tue)
by Lennie (subscriber, #49641)
[Link]
Posted Nov 15, 2022 12:29 UTC (Tue)
by farnz (subscriber, #17727)
[Link] (15 responses)
Part of the point of such a law is that you're now incentivised to find someone to give you a statement of conformity for the buildroot base image bundle you use, and to pay them to do the security assessment, rather than having your employer just go "I'm sure it'll all be fine, really, we're using published sources".
Whether you do this via a consortium or other group, whether you get it for free from the buildroot project, whether you pay a consultant to do it for you is left up to you.
Posted Nov 15, 2022 13:43 UTC (Tue)
by Vipketsh (guest, #134480)
[Link] (13 responses)
Posted Nov 15, 2022 14:41 UTC (Tue)
by farnz (subscriber, #17727)
[Link] (12 responses)
Issuing a notice of conformity for a product or component carries liability with it if the notice is inaccurate. If the security assessment is done badly, and as a result the product or component does not actually meet the security standards you've claimed it meets, then you are liable to anyone in the EU who's affected by the deficiency.
And there are already ways to handle this within the EU acquis if a product is built from components - the person issuing the notice of conformity for the product is liable until they demonstrate that the non-conformity is because a component is non-compliant. At that point, the entity that issued the notice of conformity for the component is responsible.
This isn't exactly an unusual process in industry - the EU adopted the whole "notice of conformity" thing from large company supply chains, because entities like the IEEE, IET and others already had this process set up for the standards they issued, complete with proforma conformance documents you can use to indicate how compliant you actually are with the standard.
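That chain of liability can be sketched as a toy model (all names and the decision procedure are invented for illustration; the regulation defines no such API - the point is only that liability stays with the product issuer until a defect is traced to a non-conformant component):

```python
# Toy model of the conformity-liability chain described above: the issuer of
# the product's notice of conformity is liable by default, and liability
# shifts to a component supplier only if the investigation shows that the
# component was non-conformant AND that its non-conformance caused the
# product's defect. All names here are invented.

def liable_party(product_issuer, component_findings):
    """component_findings: list of (component_issuer, component_nonconformant,
    caused_product_defect) tuples gathered while investigating a defect."""
    for issuer, nonconformant, caused_defect in component_findings:
        if nonconformant and caused_defect:
            return issuer          # liability passes down the supply chain
    return product_issuer          # otherwise it stays with the product issuer

# Example: a router vendor shows the flaw traces to a non-conformant CPU.
findings = [("CPUCo", True, True), ("WiFiCo", False, False)]
```

A non-conformant component that did not cause the product's defect (the Spectre-in-a-locked-down-appliance case discussed later in the thread) leaves liability untouched.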
Posted Nov 15, 2022 16:15 UTC (Tue)
by NYKevin (subscriber, #129325)
[Link] (11 responses)
Posted Nov 15, 2022 16:50 UTC (Tue)
by farnz (subscriber, #17727)
[Link] (8 responses)
Ultimately, Intel for Intel's affected chips, Qualcomm for Qualcomm's affected chips etc.
And because they'd be sold as components, and not as final products, to get to liability, you'd have to show that if the component had conformed with the specification, then the product would not have had the non-conformance it actually had. So, for example, while I know that a former employer's product uses Spectre-afflicted chips, there should be no software running on it not supplied by my former employer, and thus there's no non-conformance of the final product (in turn, meaning that the non-conformance of the component is not a liability issue, since there's nothing to be liable for).
Posted Nov 15, 2022 17:27 UTC (Tue)
by NYKevin (subscriber, #129325)
[Link] (7 responses)
Posted Nov 15, 2022 17:58 UTC (Tue)
by deater (subscriber, #11746)
[Link] (3 responses)
I can assure you people knew and were aware of the side-channel attacks. Especially people in government 3-letter agencies but also security researchers in general. The chip companies pushed ahead anyway because they thought the attacks were too difficult to exploit, but it turns out they were wrong.
Posted Nov 15, 2022 18:44 UTC (Tue)
by NYKevin (subscriber, #129325)
[Link] (1 responses)
Posted Nov 17, 2022 8:28 UTC (Thu)
by anton (subscriber, #25547)
[Link]
deater is doing the usual trick of writing about a different thing than you do: he writes about side-channel attacks in general (which I was taught about as a student in the 1980s, and for which the way we deal with them is to write software that handles secret keys in a special way), while you are talking about speculative side-channel attacks (which were discovered in 2017, and for which the mitigation mentioned above is insufficient).
My guess is that this lack of differentiation is also why the hardware manufacturers are not fixing Spectre: software has been mitigating or living with side-channel attacks forever, so we (hardware manufacturers) don't need to fix Spectre. And yes, it's possible to fix Spectre at a tiny cost in performance and a modest cost in silicon; and yes, it's now over five years since Intel and AMD learned about Spectre, so they could have fixed it in their new cores in the meantime.
Concerning conformance statements, Intel etc. probably would not have signed a statement that claims freedom from side channel attacks, and if they explicitly mentioned that side channels exist, they would probably be legally in the clear wrt. Spectre. Now if the manufacturer of a device with an Intel CPU employed the classic mitigation techniques, and claimed that it does not reveal secret keys, but may reveal other data through side channels, the first claim would be false in the light of Spectre. I guess the statement would contain some language about "state of the art", which would indemnify them, though.
After Spectre was revealed to the public, such conformance claims would become more interesting, though.
Posted Nov 17, 2022 12:23 UTC (Thu)
by davecb (subscriber, #1574)
[Link]
Posted Nov 15, 2022 21:26 UTC (Tue)
by farnz (subscriber, #17727)
[Link] (2 responses)
But that's not the sort of thing that goes into the specification; the specification for a CPU will say things like "a process running in user mode cannot access data that does not have a page table entry permitting user mode access to that data". Spectre's trick was bypassing that separation; if my device is now insecure, because I can bypass the separation to exfiltrate a secret, then Intel's on the hook. If, on the other hand, my device does not run software that could exfiltrate secrets (e.g. because it's a locked down appliance), Intel's fine - there's no non-conformance of the final device, and thus the fact that a component is non-conformant is not a problem.
Posted Nov 17, 2022 8:43 UTC (Thu)
by anton (subscriber, #25547)
[Link] (1 responses)
What you describe sounds like Meltdown, not Spectre.
A number of people have claimed that because there is MMU-based memory protection, that's the only protection that counts. I.e., that all the work people have been doing on avoiding buffer overflows in various ways does not count, because it's not enforced by the MMU, and therefore the CPU is free to reveal to an attacker all the memory that a process has access to.
But I have not seen any such specification in the architecture manuals I have looked at. Instead, they describe the architectural effects of branch and load instructions. Microarchitectural things like branch prediction and caches are only described in optimization manuals, and are not supposed to change any architectural guarantees. One might argue that architectural guarantees are just about behaviour, not security, but that would mean that all these architectures give no security guarantees. Even the MMU does not give any guarantee that the Intel CPU does not switch to system management mode (SMM) between any two instructions, send the contents of all registers, all RAM, and all drives to the NSA, and then continue the execution where it left off.
Posted Nov 17, 2022 11:44 UTC (Thu)
by farnz (subscriber, #17727)
[Link]
This leads to a really important point; a fault in a component is not necessarily enough to invalidate the notice of conformity for an entire device.
For example, if I sell a home router that includes a Spectre-vulnerable CPU, but the security guarantees of the router (as included in its notice of conformity) are met because the router does not run externally provided code, and the noise in the network timing overwhelms side channel noise from the supplied code, then there's no liability, even though the component might not meet its notice of conformity as sent to the router manufacturer - the final device still meets its notice of conformity despite the CPU fault.
And it's entirely possible that commercially, device manufacturers will find themselves without risk-free combinations of components; a CPU may, in its notice of conformity, not guarantee any means to completely separate different processes running on the CPU, while an OS may make security guarantees that only hold if the CPU guarantees a means to separate different processes. It's then up to the device manufacturer to decide whether they are willing to take the risk on the CPU functioning as promised, whether they're going to find insurance against the risk, or whether they simply give up on selling such a device inside the EU.
Posted Nov 15, 2022 18:36 UTC (Tue)
by mfuzzey (subscriber, #57966)
[Link]
So spectre is a huge deal for cloud providers whose whole business model is renting compute capacity on the same hardware to multiple mutually untrusted customers.
I hope this law allows manufacturers to say "yes, we know the hardware/software has vulnerabilities A, B, C, but in our case that doesn't matter because of X, Y, Z".
Posted Nov 15, 2022 20:29 UTC (Tue)
by Cyberax (✭ supporter ✭, #52523)
[Link]
Liable for _fixing_ the consequences once the defects were discovered? Manufacturers of covered devices, who probably would get some money from Intel.
But on an embedded device that only runs software provided by the manufacturer, it's not really an issue.
Posted Nov 18, 2022 17:36 UTC (Fri)
by nim-nim (subscriber, #34454)
[Link]
Posted Nov 15, 2022 14:41 UTC (Tue)
by eduperez (guest, #11232)
[Link] (1 responses)
In other words, "we ship a product that could be full of security holes, but we do not know or care"... and that's exactly what this law is trying to avoid.
Posted Nov 15, 2022 15:49 UTC (Tue)
by jpfrancois (subscriber, #65948)
[Link]
I honestly don't know how you would qualify a "security hole" for a "custom distro". If a library with a CVE is used, but is never exposed to malicious input, how is that a problem for the shipped hardware? And I am pretty sure this kind of custom distro (buildroot / openembedded) is heavily used.
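One way to make that "CVE present but never exposed" argument systematic is to triage a bill of materials by whether each component ever sees untrusted input. A minimal sketch, with invented component names and CVE IDs:

```python
# Rough triage sketch for components on an embedded image: a known CVE in a
# component only counts against the shipped device if that component is
# actually exposed to untrusted input. All names and CVE IDs are invented.

bom = {
    "net-daemon":   {"cves": ["CVE-0000-1111"], "untrusted_input": True},
    "local-parser": {"cves": ["CVE-0000-2222"], "untrusted_input": False},
    "libutil":      {"cves": [],                "untrusted_input": True},
}

def needs_action(bom):
    """Components whose known CVEs are reachable on this device."""
    return sorted(
        name for name, info in bom.items()
        if info["cves"] and info["untrusted_input"]
    )
```

Here only "net-daemon" would need remediation; "local-parser" carries a CVE but never parses attacker-controlled data, which is exactly the kind of justification a conformity write-up could record.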
Rules for hardware conformance don't change every now and then. Yet hardware certification is a time-consuming process with bureaucratic traps.
This can raise the bar and improve the global software security status. But I know our barely profitable (but growing) business would be much less profitable if we had to have someone to do this kind of security analysis in house.
Posted Nov 15, 2022 16:12 UTC (Tue)
by mgb (guest, #3226)
[Link] (6 responses)
Nobody will actually ship any bug-free software - and I say this as someone who has shipped a few small-scale commercial products in which no bugs were ever reported.
Small businesses will be overwhelmed by the paperwork or swatted like flies whenever they look set to compete on some behemoth's turf.
Big businesses will have somebody fill out some forms and when the inevitable bugs surface they can afford the lawyers to avoid any serious consequences.
Posted Nov 15, 2022 18:41 UTC (Tue)
by mfuzzey (subscriber, #57966)
[Link]
But I do agree it may have that effect.
Posted Nov 16, 2022 15:26 UTC (Wed)
by nilsmeyer (guest, #122604)
[Link] (1 responses)
Posted Nov 17, 2022 7:59 UTC (Thu)
by eduperez (guest, #11232)
[Link]
Posted Nov 17, 2022 13:18 UTC (Thu)
by zoobab (guest, #9945)
[Link]
Same story for software patents.
The patent industry is about to launch the Unified Patent Court, where the judges are pro-software patents.
Posted Nov 19, 2022 16:24 UTC (Sat)
by kleptog (subscriber, #1183)
[Link] (1 responses)
* All code that is merged has been reviewed by another developer
Frankly, in this day and age I would consider the above to be the *absolute minimum* for a business selling a software product. This isn't paperwork, it's a basic checklist for what makes a good software development environment. With Linux distributions, and things like "npm audit" and "pip-audit", there is no excuse for not knowing if you're shipping anything with known issues. Someone like the Linux Foundation could turn this into a template conformity notice which you could cut and paste and adjust to suit.
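The kind of check those tools automate can be sketched in a few lines (the advisory table and package names here are invented; real tools like pip-audit query curated vulnerability databases instead):

```python
# Minimal sketch of what "npm audit" / "pip-audit" style tools do: compare
# the exact package versions you ship against a table of known-vulnerable
# versions, and fail the build on any hit. The advisory table is made up.

KNOWN_VULNERABLE = {
    ("examplelib", "1.2.0"): "CVE-0000-3333 (hypothetical)",
}

def audit(dependencies):
    """Return (package, version, advisory) for every known-vulnerable dep."""
    return [
        (name, ver, KNOWN_VULNERABLE[(name, ver)])
        for name, ver in dependencies
        if (name, ver) in KNOWN_VULNERABLE
    ]

shipped = [("examplelib", "1.2.0"), ("otherlib", "2.0.1")]
hits = audit(shipped)
exit_code = 1 if hits else 0   # a CI job would fail the build here
```

Wiring this into CI gives exactly the "no excuse for not knowing" property: a release simply cannot ship while a known-vulnerable version is in the dependency list.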
The above would almost get you through an ISO27001 audit if you do some extra work.
Here open source has a significant advantage, because all these steps are public and to an extent automated. If you're using some proprietary library you have to cross your fingers that they're telling you about any issues they have.
By the way, I disagree EU regulations are mostly for big businesses. If you look at Brexit, it's the small businesses being driven to the wall, not the big ones.
Posted Nov 20, 2022 11:12 UTC (Sun)
by farnz (subscriber, #17727)
[Link]
One thing people don't take into account when looking at regulations is that they result in it being simpler to sell to people with lots of choices. If you want to sell to Intel, or Sony, or Apple, or any other large buyer, they will impose "standard terms" on you that you must meet in order to sell to them. In the absence of regulation, those standard terms, while having the same objective, will each impose a different compliance burden on the seller to meet the buyer's conditions.
Regulations change that - everyone has to meet the regulations, and so big buyers replace pages and pages of requirements that you must meet with "you will comply with this regulation, and you will provide this indemnity against any costs we incur as a result of your non-compliance". As a small business, you're now able to compete much more easily - the job you do to comply with the regulations so that you can sell to Intel means that if the Intel deal falls through, you can reuse most of that work as part of trying to sell to Sony, or Apple, or another large buyer.
You see similar in hardware - if I have to write a driver for a chip for every OS out there (FreeBSD, OpenBSD, NetBSD, Linux, Android, macOS, iOS, Windows, QNX, VxWorks, Nucleus, and all the others) in order to sell it, it's hard to compete with established players who've already done all that work. If I just have to supply a source-form driver for Android and can rely on my customers porting to whatever OS they care about, it's a lot easier.
Posted Nov 17, 2022 17:00 UTC (Thu)
by esemwy (guest, #83963)
[Link]
Posted Nov 18, 2022 3:43 UTC (Fri)
by GNUtoo (guest, #61279)
[Link]
So it makes me wonder whether the proposed regulation takes (or will take) threat models and use cases into account.
In many cases security is very subjective or dependent on the context.
For instance Damn Vulnerable Linux ( https://en.wikipedia.org/wiki/Damn_Vulnerable_Linux ) has been made vulnerable on purpose but it's not a threat for users unless it's misused.
Another example is with booting: Having a bootrom exploit is often a good thing for free software as we can then have free software bootloaders on devices[1], etc. Restricted boot is also not desirable from a user freedom point of view.
If I understood right, it may apply in these cases because the software "is designed to run with elevated privilege or manage privileges;", and "commercial activities" isn't clearly defined. The question here is: what are these "essential requirements"?
I also wonder if they (will) have exceptions for the second-hand market, because there are often known security vulnerabilities in older devices. For instance, for smartphones there are many security flaws in the modems of older devices, like remotely exploitable bugs (such as a bug in some ASN.1-related software[2]), and you could also consider the support of older protocols broken because their encryption is too weak from today's point of view. And that assumes that there is some community distribution that is up to date when the devices are sold. If there isn't (which is probably the case most of the time), then the devices are vulnerable.
And here too proper funding can be part of the solution. For instance the EU has funds to improve privacy and work on free software projects with that through organizations like NLnet (https://nlnet.nl/)[3]. So here funding people to upstream support for devices in Linux and various other projects could help.
References:
Denis.
deater is doing the usual trick of writing about a different thing than you are: he writes about side-channel attacks in general (which I was taught about as a student in the 1980s, and which we deal with by writing software that handles secret keys in a special way), while you are talking about speculative side-channel attacks (which were discovered in 2017, and for which the mitigation mentioned above is insufficient).
What you describe sounds like Meltdown, not Spectre.
But on an embedded device that only runs software provided by the manufacturer it's not really an issue.
> We don't have the resources to assess the security properties of this 'base image bundle'. Is the buildroot project able to provide some kind of security assessment?
The CVE list, on the other hand, changes every day.
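That churn is why a one-off assessment is of limited value: the check of shipped components against known CVEs has to be repeated on every rebuild, against a current advisory feed. As a rough illustration (the component list and advisory data below are invented for this sketch), the matching itself is mechanically simple:

```python
# Hypothetical sketch: re-check a shipped component list against an
# advisory feed. The advisory entry below is invented; a real build
# would pull a fresh feed (e.g. from OSV or the NVD) on every rebuild.

def parse_version(v):
    """Turn a dotted version string like '1.2.11' into a comparable tuple."""
    return tuple(int(x) for x in v.split("."))

def affected(component, version, advisories):
    """Return IDs of advisories whose affected range covers this version."""
    hits = []
    for adv in advisories:
        if adv["package"] != component:
            continue
        if (parse_version(adv["introduced"])
                <= parse_version(version)
                < parse_version(adv["fixed"])):
            hits.append(adv["id"])
    return hits

# Invented example data: one outdated library in a firmware image.
ADVISORIES = [
    {"id": "EXAMPLE-0001", "package": "zlib",
     "introduced": "1.2.0", "fixed": "1.2.12"},
]
SBOM = {"zlib": "1.2.11", "busybox": "1.35.0"}

for name, ver in SBOM.items():
    for adv_id in affected(name, ver, ADVISORIES):
        print(f"{name} {ver}: affected by {adv_id}")
# prints: zlib 1.2.11: affected by EXAMPLE-0001
```

The hard part is not this loop but keeping the component inventory accurate and the feed current, which is exactly the ongoing work the question above is asking about.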
* For any line of code, you can work out who wrote it and when, and who reviewed it
* Bugs that are reported are tracked, and fixes can be linked to them
* For all code you import from elsewhere you have an identifiable source
* For each of those sources you track any notices of security related issues
* You don't ship known obsolete software
* If you see one of the components you use has a published security issue, you fix it or determine it's not relevant.
* For your released product you provide a way to deliver timely updates to your customers
* (Bonus points) You've done an architectural review to identify the risky parts of your product and spent some extra effort securing those.
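Most of that list is process rather than code, but the bug-to-fix linkage in particular is cheap to audit mechanically: if every fix records the issue it closes, a script can list the reported bugs that still have no linked fix. A toy sketch (all data structures and names here are invented for illustration):

```python
# Minimal sketch of auditing bug/fix traceability: each change records an
# author, a reviewer, and (for bug fixes) the issue it closes, so reported
# bugs can be checked against delivered fixes. All data is invented.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Commit:
    sha: str
    author: str
    reviewer: str
    closes: Optional[str]  # issue ID this commit fixes, if any

def unfixed_issues(reported, commits):
    """Return issue IDs with no commit claiming to close them."""
    closed = {c.closes for c in commits if c.closes}
    return sorted(set(reported) - closed)

REPORTED = ["BUG-1", "BUG-2", "BUG-3"]
COMMITS = [
    Commit("a1b2c3", "alice", "bob", "BUG-1"),
    Commit("d4e5f6", "carol", "alice", None),   # feature work, no bug link
    Commit("789abc", "bob", "carol", "BUG-3"),
]

print(unfixed_issues(REPORTED, COMMITS))  # prints: ['BUG-2']
```

In a real repository the same check would be driven from the version-control history and the bug tracker rather than hand-built records, but the audit logic is no more complicated than this.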
EU regulations versus business size
Scope and use cases
-------------------
[1] https://switch.homebrew.guide/hacking/fuseegelee/sdsetup
[2] https://neo900.org/news/about-the-asn1-vulnerability
[3] I've no idea if that's related to NLnet Labs or not.
