
The European Cyber Resilience Act

September 19, 2023

This article was contributed by Marta Rybczyńska

The security of digital products has become a topic of regulation in recent years. Currently, the European Union is moving forward with another new law which, if it comes into effect in a form close to the current draft, will affect software developers worldwide. This new proposal, called the "Cyber Resilience Act" (CRA), imposes mandatory security requirements on all digital products, both software and hardware, that are available in Europe. While it aims at a worthy goal, the proposal is causing a stir among open-source communities.

There is a reason why the open-source world has concerns: the legislation indirectly defines who is responsible for the security of open source and who should pay to improve the current state. In addition, it puts the responsibility on individual developers and foundations hosting open-source projects instead of the manufacturers of goods embedding the software. It could have important consequences for open source across the globe.

The original proposal of the CRA (the main text and the annexes) sets out several requirements (enumerated in Annex I) for "products on the market" (more on definitions a little later). Most requirements are generally accepted best practices, such as a secure default configuration, providing security updates, protection from unauthorized access, and releases free of known vulnerabilities; others could be considered somewhat vague, like "(g) minimise their own negative impact on the availability of services provided by other devices or networks".

Each product (which means every release for software) would need to provide "an assessment of the cybersecurity risks" along with the release-related documentation (listed in Annex II). The security assessment covers the software itself and all its dependencies; the way to perform the assessment falls under one of three categories, from self-assessment to the involvement of a third party. Self-assessment is the default. However, the regulation requires a stricter approach when the product's "core function" falls into the category of a "critical product with digital elements" (listed in Annex III), which is further divided into Class I (less critical) and Class II (more critical). Depending on their class, products must undergo a mandatory external security assessment: all Class II products are required to do so, while Class I products must do so only if they do not follow the (not yet defined) "harmonised standards, common specifications or cybersecurity certification schemes".

The release-related documentation needs to cover (among other items) the product's expected uses, its support time frame, and the way to install security updates. All manufacturers must have a vulnerability-reporting procedure and release security updates free of charge for users. If a manufacturer learns about an "actively exploited vulnerability", it is expected to notify authorities rapidly (within 24 hours in many cases, followed by a complete analysis one month later). Finally, a party not complying with the requirements may be subject to fines of up to €15 million or up to 2.5 percent of worldwide annual turnover (gross revenue), whichever is higher.
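To make the "whichever is higher" fine ceiling concrete, here is a minimal sketch in Python (purely illustrative; the function name and the turnover figure are invented for the example):

    def cra_fine_ceiling(worldwide_annual_turnover_eur: float) -> float:
        # Upper bound on a CRA fine: EUR 15 million or 2.5% of
        # worldwide annual turnover (gross revenue), whichever is higher.
        return max(15_000_000.0, 0.025 * worldwide_annual_turnover_eur)

    # Hypothetical company with EUR 2 billion in annual turnover:
    # 2.5% of EUR 2 billion is EUR 50 million, which exceeds EUR 15 million.
    print(cra_fine_ceiling(2_000_000_000))  # 50000000.0

Note that the turnover percentage dominates only above €600 million in annual turnover; below that, the €15-million figure is the binding ceiling.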

Commercial activity

As the devil is in the details, definitions have an important impact on understanding the CRA. The most important term is, without a doubt, "commercial activity" as found in the following sentence:

In order not to hamper innovation or research, free and open-source software developed or supplied outside the course of a commercial activity should not be covered by this Regulation.

The definition for "commercial activity" comes from a document called "The Blue Guide", which has examples on page 21 that provide a little more explanation. The text states that commercial activity means providing goods in "a business related context" and the decision if a particular activity is commercial or not should be done:

on a case by case basis taking into account the regularity of the supplies, the characteristics of the product, the intentions of the supplier, etc. In principle, occasional supplies by charities or hobbyists should not be considered as taking place in a business related context.

Without further clarification, one might consider many open-source projects as commercial activities, especially mature ones that are used widely and make regular releases. Those with developers hired to work on the project might also qualify.

The definition of commercial activity is the main point affecting open-source projects. It will not affect hobby projects run by unpaid volunteers only. It could (depending on any future judgments) affect all others. If that wording is not changed in the final version of the CRA, it could cause uncertainty, leading to reduced involvement in open source.

Who is the manufacturer?

The discussion of commercial activity brings us to the other primary term of the CRA: the "manufacturer". Again, according to the "Blue Guide" (page 34): "The manufacturer is any natural or legal person who manufactures a product or has a product designed or manufactured, and places it on the market under his own name or trademark". The CRA also defines other roles like the distributor or importer, but "manufacturer" will be the most important one in our case. The manufacturer is critical because they are responsible for fulfilling requirements and could face the fines mentioned above.

Let us take the example of a project hosted by an open-source foundation. That foundation could be considered the manufacturer, even if it might have limited impact on the development process and concentrates on governance and organizational aspects. The definition of the manufacturer might be even more complicated when there is no formal legal entity. Is the person tagging a release the manufacturer in this case? The person with the most commits? All the maintainers together? Their employers?

If foundations or other supporting organizations are to be classified as manufacturers, they will be required to put additional constraints on projects and how they develop and release. That could cause significant tension, especially for projects with no established security culture.

Another point is the need for a budget in case an external assessment is required; currently, operating systems and networking software both require one. A typical budget for a security assessment is in the tens of thousands of dollars (or euros). The exact scope of the audit for CRA requirements has yet to be defined (the general description appears in Annex VI), but we might assume a need for a similar budget for every non-bugfix release. It will be vital for many projects to figure out who will pay for assessments. Many of them, including virtually all without an organization backing them, will not be able to pay that fee.

Even if a project falls under self-assessment only, its release must be accompanied by several documents, including a risk analysis. Writing and verifying this information would mean additional work for projects, though the exact content of this documentation still needs to be fully defined.

Vulnerability and exploitation reporting

Under the CRA, each manufacturer needs to have a vulnerability-reporting process and, in most cases, provide security updates. Logically, they would also have to provide updates of all dependencies, especially when they are linked into the main product. A recent discussion in the Linux kernel community showed that timely delivery of security fixes is not yet a solved problem; regulation like the CRA might actually push device vendors to publish fixes more rapidly.

One clause also requires all manufacturers to report each "actively exploited vulnerability" whenever they learn about one. The report must happen within strict time limits, such as a first notification within 24 hours (though it is not required for small companies); the company should provide a detailed analysis, with a fix, within one month. These reports are sent to the European Union Agency for Cybersecurity (ENISA). With that pile of 0-days, ENISA will become a juicy target for attacks (though one might argue that a service like GitHub's private advisories is already such a target).

The obligation to notify about all issues also breaks normal disclosure processes. These days, vendors disclose vulnerabilities only after a fix is available. Also, the one-month limit for a complete analysis might sometimes be hard to meet. The industry typically uses 90 days, but some vulnerabilities (notable examples include hardware issues like speculative execution bugs) take months from the discovery to the fix.

The reaction

After the original proposal was published in September 2022, the open-source community rapidly started responding to it. One of the first reactions came from a foundation, NLnet Labs, in a blog post describing the impact on its domain: network and routing protocols. Many of the projects the foundation works on fall into the "critical products" category and would require significant additional work, possibly including an external security audit. It also notes that some of these tools, which may have been available for decades, could be considered "commercial", so they would fall under the regulation even though the organization gets no income from that software.

Other organizations have done their own analyses as well. A blog post from Mike Milinkovich of the Eclipse Foundation lists all of the documentation and assessment that the Foundation would have to do for each released software version; he also mentions the uncertainty about which tools would be classified as critical or highly critical. During FOSDEM 2023, a panel (WebM video) took place where European Commission representatives answered questions from the community. This session mostly concentrated on the impact of the definition of "commercial activity", which is the condition for a product to fall under the scope of the CRA. It was said that, even for charities, it is going to be a case-by-case determination. Also, a Commission representative said that the goal of the regulation is to force manufacturers to do more due diligence on the components that they include; they will be obligated to do so as part of their security assessment.

In April 2023, multiple organizations released an open letter asking the Commission to "engage with the open source community and take our concerns into account". In addition, some specialized communities such as Content Management Systems (CMSes) wrote their own open letter, which was signed by representatives of WordPress, Joomla, Drupal, and TYPO3. Interested readers may also want to look at the list of reactions that is maintained by the Open Source Initiative (OSI).

Possible response

If the regulation is put in place in its current form, some projects may not want to risk being classified as a "commercial activity" and may decide to state that their code will not be available in the EU (whether enforced by technical means or not). That also raises interesting licensing questions; for example, it is not clear that GPL-covered code can have such a restriction. When code is restricted in that way, the security assessment for any downstream projects using that code in the EU will become more complicated; it could even mean that each downstream project needs to perform that audit independently, or remove the dependency.

Some projects might decide to stop accepting contributions from developers employed by companies, or to stop taking donations from companies, in order not to be classified as commercial. That could seriously impact those projects, reducing both their funding and their development base. And, even if such a project falls under the non-commercial category, downstream users might still be using it commercially.

Finally, there may be an impact on the number of new open-source projects. Convincing a (big) company to open-source new work is a daunting task; if there is more burden related to liability and documentation, fewer companies will release projects that are not crucial to their goals. This could affect tools for testing, continuous integration, programming embedded devices, and so on. An increase in the number of forks is also likely; companies may want a version of some project's code with changes for CRA compliance. That could in fact decrease overall security, instead of improving it.

The current state

In addition to the initial version, as of August 2023 there are two sets of amendments: one from the EU Council and another from the EU Parliament, which resulted from the work of Parliament committees. The most recent vote in the EU Parliament committees took place in July 2023. The next step is negotiations (called a "trilogue") between the Council, Commission, and Parliament to come up with a final version.

The two sets of amendments change certain details, but not the general thrust of the regulation. Both of them change the lists of "critical products with digital elements" (those that might require an external audit). The amendments from the Council shorten both lists, while those from the Parliament move all routers to Class II and add new categories to Class I (home automation, smart toys, etc.).

They also both modify the "open-source exception". The Parliament version seems to move requirements to companies and gives a set of examples of commercial and non-commercial activities. An indication of a non-commercial activity is a fully distributed model and lack of control by one company. On the other hand, if "the main contributors to free and open-source projects are developers employed by commercial entities and when such developers or the employer can exercise control as to which modifications are accepted in the code base", that is an indication of commercial activity, the same as regular donations from companies. It also states that individual developers "should not be subject to obligations pursuant to this Regulation". The Council's version seems to cover more products by the exception:

this Regulation should only apply to free and open-source software that is supplied in the course of a commercial activity. Products provided as part of the delivery of a service for which a fee is charged solely to recover the actual costs directly related to the operation of that service [...] should not be considered on those grounds alone a commercial activity.

The start of the negotiation on the final version is likely to happen after the summer break (which means in September 2023). Note that European elections will happen in early June 2024, which means that the process is likely to be rushed to completion before that date.

The outcome?

Improving the security state of the digital world is a worthy goal, and many ideas the CRA brings are reasonable best practices. However, the impact of the current form of the regulation is difficult to predict. In the open-source world, it could put all of the burden on upstream projects. These projects are frequently underfunded, so they might not have the resources to perform all of the required work; that analysis and documentation work is worth doing, but funding has to be available in order to make it happen. FOSS developers, especially those working in the embedded space, should be paying attention to this legislation, as there is more to it than our summary above covers. Readers in the EU may want to contact their representatives about the CRA as well.





The European Cyber Resilience Act

Posted Sep 19, 2023 19:44 UTC (Tue) by kleptog (subscriber, #1183) [Link] (6 responses)

Out of interest, is this written by someone with a background in EU law, or by an interested lay person?

Because in this whole discussion I get the feeling that there's a lot of reaching for worst case scenarios and looking for interpretations of words that seem pretty far-fetched. The scope here is severely limited by treaty and other acts.

I appreciate the update though, and the article is a nice summary of the current situation.

The European Cyber Resilience Act

Posted Sep 19, 2023 19:57 UTC (Tue) by pizza (subscriber, #46) [Link] (2 responses)

> Because in this whole discussion I get the feeling that there's a lot of reaching for worst case scenarios and looking for interpretations of words that seem pretty far-fetched.

Eh, think of the law as being written in an ambiguous, badly-scoped programming language. Sure, there's the nominal intent of the law, but there may be unintentional (or, for the cynical, intentional) side effects that might only crop up in some specific corner cases. It just so happens that developers of F/OSS seem to heavily intersect them this time.

Just because the current crop of politicians/courts/etc can be trusted and is "reasonable" doesn't mean their successors will be. History has shown us over and over that bad actors can and will take advantage of these fuzzy grey areas, and it's always the relatively powerless that pay the price.

The European Cyber Resilience Act

Posted Sep 19, 2023 21:20 UTC (Tue) by kleptog (subscriber, #1183) [Link] (1 responses)

If politicians/courts/etc want to be evil, they're not going to let themselves be held back by the precise meanings of words. They'll simply say the meaning they want was the correct meaning all along and do what they wanted to anyway. The fact that everyone in this process recognises the unusual position of open source is far more important than the precise words used to describe it. The best defence is to prevent power accumulating in the first place so that what a single person says doesn't have significant impact.

The European Cyber Resilience Act

Posted Sep 21, 2023 8:44 UTC (Thu) by smurf (subscriber, #17840) [Link]

> If politicians/courts/etc want to be evil, they're not going to let themselves be held back by the precise meanings of words

Yeah, but if they are (or act) merely incompetent or clueless, words with precise meaning are the last resort of the competent.

The European Cyber Resilience Act

Posted Sep 19, 2023 22:09 UTC (Tue) by ghodgkins (subscriber, #157257) [Link]

I would assume from the sources that many of the statements in the linked "list of reactions" [1] were made or approved by someone familiar with EU law.

[1] https://blog.opensource.org/the-ultimate-list-of-reaction...

The European Cyber Resilience Act

Posted Sep 20, 2023 4:42 UTC (Wed) by mrybczyn (subscriber, #81776) [Link] (1 responses)

The author of the article here. I have no background in EU law, but such people proofread the piece and made suggestions beforehand. Without that help, this would have been so much harder!

The European Cyber Resilience Act

Posted Sep 21, 2023 15:52 UTC (Thu) by jajpol (subscriber, #8044) [Link]

Appreciated

The European Cyber Resilience Act

Posted Sep 19, 2023 20:59 UTC (Tue) by florianfainelli (subscriber, #61952) [Link] (1 responses)

Having gone through the EU CRA months ago for the purpose of understanding how it will impact some of my employer's deliverables (Linux kernels, toolchains, build systems), I am left with a few thoughts. The general feeling was that this is going to be a mess, far-reaching and definitively increasing the cost of doing business.

The one positive outcome that I see is that if you had a company that was just aggregating open-source software and packaging it as a product, possibly on top of hardware, it now has a responsibility to address security issues. Before this point, only its reputation was at stake and it could have been largely a freeloader. Now it needs to take a more active role, realize the cost of using software components, open-source or not, in its products, and take some amount of ownership in ensuring they are somewhat secure.

There is a big unknown with the timing and boundary aspects of "shipping free of vulnerabilities": by the time the product is out the door, there might already be a vulnerability discovered in the version of the Linux kernel that you just shipped. Products can stay on shelves with their manufacturing firmware for weeks, maybe months, until inventory is sold.

I cannot shake the feeling that this act has been heavily lobbied for by European companies in the certification/compliance business, under the flawed pretence that consumer experience will improve and the world (or the EU at least) will be a better place for people to use their electronics. This appears to be a response to the EU having seen a sharp decline in manufacturing popular consumer-electronics products (specifically phones, tablets, computers). As such, making certification harder for other economies (China, US, etc.) to sell into the European market is perceived as a way to grab a share of the pie that has escaped the EU for years. What I suspect will happen is that some companies may skip a version or two of their products, since the cost of doing business in the EU will be judged too high.

So all in all, if you look at it, this is going to be a form of tax imposed on manufacturers, but paid by the consumer, because no company will ever eat into their profit and give you the certification cost for free.

It seems to me we would have a better chance at improving the overall industry if we were to fine companies for failing to address security vulnerabilities that they had been made aware of. It teaches a lesson to anyone in the industry and reminds you that you cannot just accumulate profits and walk away.

Products that are absolutely mission-critical should hopefully already be under similar legislation; if not, then clearly they should be the prime target, because nobody needs their ultrasound or MRI machine to be ground zero for a hospital hack.

My 0.02€

The European Cyber Resilience Act

Posted Sep 20, 2023 8:26 UTC (Wed) by farnz (subscriber, #17727) [Link]

I would just note that it's meant to "increase the cost of doing business". Right now, businesses get to push the cost of dealing with security onto everybody else, including people who aren't their customers; for example, if your business shipped an industrial tool 5 years ago that's now out of support and that has a security hole that's used to attack my home systems, that's my problem, even though I didn't buy anything from you.

The goal of the CRA is to push those costs onto the businesses that have a way to prevent them. In my example, you'd be on the hook for fixing that problem. For businesses that already care, this imposes a small cost in the bureaucracy of showing compliance; for businesses that don't care, you've now got to spin up the infrastructure to support in-the-field software updates, the engineering team(s) to provide such updates on a timely basis, and the additional bureaucracy.

The European Cyber Resilience Act

Posted Sep 19, 2023 21:03 UTC (Tue) by flussence (guest, #85566) [Link]

The tight time limit on reporting vulns and publishing fixes for them seems like a great way to make "bug-compatible" distro forks unviable, especially when upstream already has a well-established apparatus for obfuscating changes…

The European Cyber Resilience Act

Posted Sep 19, 2023 21:07 UTC (Tue) by kleptog (subscriber, #1183) [Link]

So I looked through all the amended texts by the Council and the Parliament and am impressed by the breadth and scope of some of the changes. It's also interesting to see the different focus: the Council has more amendments that attempt some refactoring, and spends more time on the powers of the Commission, while the Parliament focussed on many of the smaller (though important) details, and worked on the impacts on SMEs and how the money is spent. The Expert Group proposed by the Parliament is an interesting touch. I'm not sure if there is precedent for this kind of new committee, but it seems reasonable. The Council added the ability for collective actions, which is something I hadn't thought of.

There doesn't seem to be much disagreement between the two texts, but it's going to take a lot of word-smithing to get this down to a single document. Both the Council and the Parliament drastically changed the paragraphs relating to free software in largely compatible ways, but someone needs to work them down to a single version both agree with.

The European Cyber Resilience Act

Posted Sep 19, 2023 21:48 UTC (Tue) by dullfire (guest, #111432) [Link] (11 responses)

The GPL does not permit additional requirements. Furthermore, it explicitly says that if you can't comply with the GPL AND the law, then you can not redistribute the covered work.

That being the case, I wonder if that would effectively make use of GPL'd projects that originate outside the EU (where these requirements do not exist), in works by parties inside the EU a breach of the GPL.

PS. I am not a lawyer. And definitely not one for EU-related matters.

The European Cyber Resilience Act

Posted Sep 19, 2023 22:10 UTC (Tue) by pizza (subscriber, #46) [Link] (6 responses)

> That being the case, I wonder if that would effectively make use of GPL'd projects that originate outside the EU (where these requirements do not exist), in works by parties inside the EU a breach of the GPL.

Why wouldn't this apply equally to GPL'd projects written from within the EU?

After all, it's not the national origin of the software that matters; it's whether or not it comes with the necessary paperwork?

The European Cyber Resilience Act

Posted Sep 19, 2023 22:25 UTC (Tue) by dullfire (guest, #111432) [Link] (5 responses)

Because for a project originating inside the EU, the law constrains the author before the copyright exists. Whereas an "imported" work could only be constrained after, and under the terms of the GPL (since that defines the terms under which the subject can use the work).

To put it another way: the reason I don't think it would be a GPL violation for an in-EU author is that the author would directly (potentially) be held liable, whereas an external project only has a nexus to the author via their GPL license.

Anyhow. Not a lawyer. Just musing that I think that goes against at least one part of the GPL's terms (the prohibition on adding terms)

The European Cyber Resilience Act

Posted Sep 20, 2023 13:43 UTC (Wed) by Wol (subscriber, #4433) [Link] (4 responses)

> Anyhow. Not a lawyer. Just musing that I think that goes against at least one part of the GPL's terms (the prohibition on adding terms)

Or it could be a simple case of "intersection of requirements". If the GPL imposes one set of requirements (that you pass on everything you receive) and the law imposes a different set of requirements (if you give a product to someone, you must provide a warranty), then there is not necessarily any conflict. Just because the GPL says "this software comes without warranty" doesn't mean it conflicts with "the law says you must provide a warranty". The legal warranty you provide is totally irrelevant to the fact that software has no warranty.

It's actually very similar to the copyright/patent situation. Just because patent law may say "you can't use this software", it has no impact on the GPL saying "you may freely share AND USE this software". The software authors have given you the right to use the software, the fact that the law says exercising that right is illegal under a different (patent) legal code is irrelevant to the GPL. v2 at least, v3 attempts to address this.

Even with ITAR and arms regulations etc etc, if the GPL allows you to freely distribute "illegal" software, you're in the clear as far as the authors of the software are concerned. Doesn't stop the government coming after you for distributing "illegal munitions", but it's nothing to do with the GPL.

Cheers,
Wol

The European Cyber Resilience Act

Posted Sep 20, 2023 13:44 UTC (Wed) by paulj (subscriber, #341) [Link] (3 responses)

You mean union of requirements, right?

The European Cyber Resilience Act

Posted Sep 20, 2023 14:05 UTC (Wed) by Wol (subscriber, #4433) [Link] (2 responses)

Union? Intersection?

I was thinking of the case where requirements collide.

Actually, I think I can now word it far better. The GPL places requirements on the GIVER. The law places restrictions on the RECIPIENT. Where this is the case there can be no GPL violation. And, in this particular case, I think this is the actual state of affairs.

Americans are free to distribute GPL software into Europe, CRA or no CRA. If it's not been advertised in Europe then there is no "placed on the market", and it's a grey import.

Europeans are then free to distribute it, provided they comply with the extra legal burden of the CRA. And the GPL has no say here, because the "additional requirement" of complying with the CRA is not being passed on by the giver, but is imposed (or not) by the law.

Cheers,
Wol

The European Cyber Resilience Act

Posted Sep 20, 2023 14:23 UTC (Wed) by pizza (subscriber, #46) [Link] (1 responses)

> Americans are free to distribute GPL software into Europe, CRA or no CRA. If it's not been advertised in Europe then there is no "placed on the market", and it's a grey import.

As currently drafted, simply being *made available* (even for zero cost) to an EU citizen is sufficient to be considered "placed on the market" for purposes of the CRA.

When the various proposed changes are reconciled together, we shall see what the new text says... But until then...

The European Cyber Resilience Act

Posted Sep 20, 2023 15:26 UTC (Wed) by Wol (subscriber, #4433) [Link]

> As currently drafted, simply being *made available* (even for zero cost) to an EU citizen is sufficient to be considered "placed on the market" for purposes of the CRA.

Hmmm ... that's scary.

Because as I understand it, "placed on the market" is a term of art defined elsewhere in other (consumer protection?) legislation, and if the CRA is re-defining it, then that is a massive change - far bigger than just cyber-security and what-not.

Cheers,
Wol

The European Cyber Resilience Act

Posted Sep 19, 2023 22:44 UTC (Tue) by jkingweb (subscriber, #113039) [Link] (2 responses)

With the caveat that I have read little beyond the summary in this article, I don't think you're articulating an actual problem. The GPL forbids licensees from imposing additional requirements on sub-licensees. The GPL does not (cannot) forbid the state from separately imposing requirements on software distributors. Don't we already have such restrictions with e.g. frequency and power requirements for software defined radios?

The European Cyber Resilience Act

Posted Sep 20, 2023 11:12 UTC (Wed) by mrybczyn (subscriber, #81776) [Link] (1 responses)

The author here again: I will clarify the GPL example here. If a project has a license saying it is not expected to be used in the EU, it isn't covered by the CRA, because it is not placed on the (EU) market. Adding such a restriction to a GPLed project looks problematic, at least.

I haven't checked that interpretation with lawyers.

The European Cyber Resilience Act

Posted Sep 20, 2023 12:31 UTC (Wed) by pizza (subscriber, #46) [Link]

> Adding such a restriction to a GPLed project looks problematic, at least.

I don't see why; nothing forces the software author to make the software available to everyone in the world. If the author blocks EU IP addresses from their download site, that's not a GPL restriction.

Now if someone _else_ distributes the software within the EU -- say, Debian -- will Debian face liability?

(FWIW, I have a really hard time seeing how any major Linux distribution could continue to exist under the CRA. Or should only RHEL or SLES face liability because those are commercial products, but CentOS/Fedora or OpenSUSE/Tumbleweed not be, as they're provided Gratis, As-Is?)

The European Cyber Resilience Act

Posted Sep 20, 2023 8:23 UTC (Wed) by farnz (subscriber, #17727) [Link]

There are already additional requirements in the USA on software that's distributed there under the GPL (notably FCC spectrum requirements for WiFi chips). If those restrictions qualified as "you cannot redistribute the covered work", then you could never have distributed a WiFi driver in the USA for any chip that allows software to have some say in which frequencies it uses (which covers all SoftMAC WiFi chips, for example).

Secondly, the text of the GPL does not prohibit you from distributing if additional requirements apply to you; it prohibits you from imposing additional requirements beyond the GPL on downstream recipients. It also says that you are only blocked from distributing if the additional requirements conflict with the GPL. In this case, you are not imposing a liability requirement on downstream (that's coming from the law), so you're not blocked by the "no additional requirements on downstream". You're also not facing requirements that conflict with the GPL; the GPL does not say "you must not accept liability for software you distribute", and thus you're safe on that count.

The European Cyber Resilience Act

Posted Sep 20, 2023 3:58 UTC (Wed) by wtarreau (subscriber, #51152) [Link] (23 responses)

My concern with this requirement of extra bureaucracy inflicted on developers is that it will further speed up the migration from distributed software to hosted software; software will further be replaced by centralized services such as Google Docs and so on, so that there's no vendor anymore and no notification to be done. What a mess, really! Those working on such laws should have put their hands on a keyboard before writing their crap, and dealt with bugs and backports to learn a bit about the subject they're speaking about. They would particularly discover that the "analysis of impacts" and so on is commonly unknown, because you fix a bug and move on to something else. Developers cannot afford to spend one month on a bug to figure out if there's anything security-related behind it.

Software development in EU might be living its last few years.

The European Cyber Resilience Act

Posted Sep 20, 2023 12:16 UTC (Wed) by corsac (subscriber, #49696) [Link] (12 responses)

SaaS like Google Docs is under the scope of the CRA. You can't “escape” it by selling a service instead of a product.

The European Cyber Resilience Act

Posted Sep 20, 2023 22:51 UTC (Wed) by neggles (subscriber, #153254) [Link] (11 responses)

Actually, you can:

> Products provided as part of the delivery of a service for which a fee is charged solely to recover the actual costs directly related to the operation of that service [...] should not be considered on those grounds alone a commercial activity.

That sounds like an out for SaaS to me.

The European Cyber Resilience Act

Posted Sep 21, 2023 6:15 UTC (Thu) by znix (subscriber, #159961) [Link]

Doesn't that wording preclude turning a profit from said service?

The European Cyber Resilience Act

Posted Sep 25, 2023 15:44 UTC (Mon) by Wol (subscriber, #4433) [Link] (9 responses)

No I don't think it does. *Products* provided as part of a *service*.

In other words, if I provide a bunch of download servers, which distribute stuff (product) other people supply, no liability attaches to me.

Think a delivery operation, like eg Federal Express. Okay, I guess they have a duty of care to avoid distributing illegal stuff like drugs, but for the most part they are the courier with no obligations beyond safely moving stuff around. Said "stuff" is for them an opaque box.

So if a distro merely packages upstream (yes I know distros typically do more :-) and distributes it, then they are free to run that as a business without incurring liability.

So what we need is the clear chain that says the writers are under no obligation because they provide it with no warranty. The distributors are under no obligation because they are merely aggregating everything into a convenient package.

Liability needs to start at the point a "manufacturer" includes this stuff in a PHYSICAL product. Because it's at that point the RISK also really starts.

Who cares if my pet project is vulnerable as hell? So long as it's just me, it's the same liability as the lone inventor tinkering in his shed with things like gas bottles. Any disaster will be localised, and I'll bear the brunt of it.

Even if I start distributing it, the user-base will be small, and the blast radius insignificant. But if a manufacturer spots my product and "places a product on the market" (as per the blue book, I think it is), this is where the blast radius becomes significant, and this is also where the CRA needs to bite.

One of the simplest ways to protect the small-time coder would be the rule "No contract? No transfer of liability!" Then in, let's say, pizza's case he can warrant that his software will behave to spec, and if it doesn't he'll fix it. If the spec is wrong, not pizza's problem. If there is a vulnerability, pizza gets to fix it. It's the product manufacturer's responsibility to get that bugfix to their customers in a timely manner - indeed - to have a manner of issuing bug-fixes!

(Oh, and I don't think pizza needs to worry about manufacturers saying "go download this software from over there". If the product needs the software to function, and the software malfunctions, that's "not fit for purpose", which brings a world of hurt on its own.)

If manufacturers want to graze the software commons, they accept the liability that comes with accidentally picking poison mushrooms ...

Cheers,
Wol

The European Cyber Resilience Act

Posted Sep 25, 2023 15:59 UTC (Mon) by Wol (subscriber, #4433) [Link] (2 responses)

Having responded to a post about SaaS, I forgot to address that issue.

SaaS *would* be covered by the CRA. They aren't just providing you with a copy of a piece of software that they got from someone else; they're providing you with a whole lot more. It's not a case of "you can get that from anywhere, we just provide it all neatly packaged".

Think of the difference between an editor and a printer. Give your book to a printer, and you get back multiple copies of your book, all identical copies of the original you handed over. Give it to an editor, and you get *roughly* the same thing back, with (hopefully) the grammar and punctuation cleaned up, the layout redone, a nice professional cover, etc etc.

One company learned the difference between a printer and an editor the hard way. My English master sent the school magazine to them to PRINT. They promptly reset it in their own house style, and then expected to be paid. The English master had a fit and refused point blank. When the printers threatened to sue, he pointed out they didn't stand a hope in hell of winning because, if they were PRINTing what he'd sent them, the goods were not what he asked for and not fit for purpose. And if - as they had - they took it upon themselves to edit it, they took upon themselves the risk it would be a flop, as it was, since the English master refused to pay.

Oh - and not fit for purpose. It was a SCHOOL magazine. And the School's definition of "correct" English was the relevant Local Exam Board - Oxford, as it happens. The printer's house style was, according to the exam board, "wrong", and would have cost the students marks had they copied it.

Cheers,
Wol

The European Cyber Resilience Act

Posted Sep 25, 2023 22:51 UTC (Mon) by Wol (subscriber, #4433) [Link] (1 responses)

following up to myself, having actually gone to the trouble of a Google search to find out what the CRA actually says ...

https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:52022PC0454

You need BOTH documents on that page, they work together.

SaaS is NOT covered. The CRA defers to a new, yet-to-be-written, similar Regulation or Directive. That Regulation/Directive will be similar in spirit to the CRA.

Cheers,
Wol

The European Cyber Resilience Act

Posted Sep 28, 2023 11:38 UTC (Thu) by kleptog (subscriber, #1183) [Link]

> following up to myself, having actually gone to the trouble of a Google search to find out what the CRA actually says ...

I do encourage people to do this. The original article above actually links directly to the original version (first link) and the amended versions by the Council[1] and Parliament[2] (under current state), so you can get an idea of where this is heading.

If nothing else, read just the recitals that come before the articles. While not explicitly legally binding, they do provide guidance as to the how and why of the Act. They are written in straightforward English and, in the case of any ambiguity, the recitals make the difference. They also go into way more detail about how this is intended to work.

If you're into software design, the recitals are the equivalent of requirements & use cases.

Repeating the links in case people missed them.

[1] https://data.consilium.europa.eu/doc/document/ST-11726-20...
[2] https://www.europarl.europa.eu/meetdocs/2014_2019/plmrep/...

The European Cyber Resilience Act

Posted Sep 25, 2023 16:13 UTC (Mon) by pizza (subscriber, #46) [Link] (5 responses)

> Liability needs to start at the point a "manufacturer" includes this stuff in a PHYSICAL product. Because it's at that point the RISK also really starts.

So in other words, Microsoft will face zero liability for defects found in Windows, as long as they don't supply any hardware.
And hardware makers face zero liability as long as it's the end-user that installs the software.

>Who cares if my pet project is vulnerable as hell? So long as it's just me, it's the same liability as the lone inventor tinkering in his shed with things like gas bottles. Any disaster will be localised, and I'll bear the brunt of it.

Are you really sure that you don't care that you "bear the brunt of it"? After all, we're talking about potential financial ruin here, with no upside.

> (Oh, and I don't think pizza needs to worry about manufacturers saying "go download this software from over there". If the product needs the software to function, and the software malfunctions, that's "not fit for purpose", which brings a world of hurt on its own.)

Are you sure about that? That just means the hardware "purpose" will get dialed way back until all it's legally "fit" for is along the lines of "it takes up space, blinks a couple of lights, and won't electrocute you." And if you want it to do anything more, you'll have to look elsewhere for the software. Not unlike today when you can buy a motherboard or even a complete barebones PC without any OS. It's no longer a "complete product" but a "kit" that requires the user to assemble or otherwise complete. And the user bears all responsibility for the consequences.

The European Cyber Resilience Act

Posted Sep 25, 2023 17:17 UTC (Mon) by farnz (subscriber, #17727) [Link]

Liability starts at the point something is included in a physical product today. The point of the CRA is to push liability backwards down the chain until either you get to a creator, or you leave the world of commercial relationships, and to add liability for things that occur "in cyberspace".

So, in a CRA world, Microsoft is liable for defects in Windows, because you can't get Windows legally via any route other than a commercial deal with Microsoft. Similarly, hardware makers are liable for the software they include with the hardware, and for ensuring that the hardware is supplied with sufficient software to make the hardware useful for the claimed purposes of the system. This means that if I sell you a "car", it's got to function as a car - and if you have to install software to make it function as a car, then I'm required to bundle that software with the car (and thus be liable for it). I could sell you a "kit of parts" that you need to assemble, but then I can't do anything that might make you think I'm selling you a car, just a set of parts that don't work for any purpose.

Similarly, in the CRA as it currently stands, me putting my pet project up on GitHub, or publishing it on my website, or selling tapes with the source code on at the cost of the tape plus shipping, does not make me liable for flaws in it. Only selling tapes is a commercial transaction, and the CRA as it stands has an exception where I'm doing that at cost. If it's awful, and blows up in my face because it's so vulnerable, that's my problem to deal with - I'm liable for the damage my systems do, and it's my code on my systems. But because you have no relationship with me that would make me liable to you for flaws in my code that I gave away for free, without so much as a service contract, if you take a copy of my code and run it, you're still liable for the damage your systems do, and you don't have a relationship that lets you push that liability back to me.

The European Cyber Resilience Act

Posted Sep 25, 2023 17:50 UTC (Mon) by Wol (subscriber, #4433) [Link] (3 responses)

> > Liability needs to start at the point a "manufacturer" includes this stuff in a PHYSICAL product. Because it's at that point the RISK also really starts.

> So in other words, Microsoft will face zero liability for defects found in Windows, as long as they don't supply any hardware.
> And hardware makers face zero liability as long as it's the end-user that installs the software.

So maybe I shouldn't have said "physical". I think MS is a manufacturer and Windows is a "product placed on the market" according to the Blue Book, so as things stand they're liable. And are you sure you don't want things to change? Because that could kill MS' pre-installed monopoly just like that, so that actually could be a good thing. It would mean all of a sudden people would start installing linux left right and centre because they would see the real cost of Windows. :-)

> >Who cares if my pet project is vulnerable as hell? So long as it's just me, it's the same liability as the lone inventor tinkering in his shed with things like gas bottles. Any disaster will be localised, and I'll bear the brunt of it.

> Are you really sure that you don't care that you "bear the brunt of it"? After all, we're talking about potential financial ruin here, with no upside.

And that's different to the current situation how? Given the current propensity of Americans to sue anybody (including foreigners) for anything, I think having a rich American take a dislike to you will leave you facing potential financial ruin whether you're a saint or a devil.

No I'm not downplaying the risks. But life is a game of Russian Roulette, and given that the CRA is a gun that probably won't fire, I've got rather more important things to worry about. Life is a fatal disease, don'cha'no?

> > (Oh, and I don't think pizza needs to worry about manufacturers saying "go download this software from over there". If the product needs the software to function, and the software malfunctions, that's "not fit for purpose", which brings a world of hurt on its own.)

> Are you sure about that? That just means the hardware "purpose" will get dialed way back until all it's legally "fit" for is along the lines of "it takes up space, blinks a couple of lights, and won't electrocute you." And if you want it to do anything more, you'll have to look elsewhere for the software. Not unlike today when you can buy a motherboard or even a complete barebones PC without any OS. It's no longer a "complete product" but a "kit" that requires the user to assemble or otherwise complete. And the user bears all responsibility for the consequences.

And nobody will buy it. Which might be a bloody good thing.

You said something about not being European. Well, Europe doesn't tend to indulge in all these nasty extra-territorial shenanigans that some other countries hint hint like to do. Keep out of Europe, and the CRA can't touch you. Even if it touches your software, the authorities will go after the people who recklessly imported it, not you.

Something I'd like to know. You said the CRA defines you as "offering a product" which makes you a "manufacturer". Where on earth did you find that in the CRA? Because when I was looking, I've mentioned the "Blue Book" before (which someone else here pointed me at), and if you're right and the CRA is trying to redefine the term, I think we're in for a far bigger world of hurt than just computer security. You're looking at this far too much in isolation. As others have said, this interacts with tax law. I've made repeated references to the Blue Book and Consumer Protection legislation.

Yes, there is no upside to being sued. And no there is no defence against some random guy trying to sue you. For ANYTHING. But in any jurisdiction where Equity is important, like the UK, like I suspect most of Western Europe, you'll get off much more lightly than anywhere else in the world.

If some random company sued me under the CRA, my reaction would be to ask the Judge what grounds do they have to sue me, because their suit should be illegal under the slavery acts. So what if that argument doesn't legally fly, if the Judge thinks yes I have a real case, that argument makes sense, said random company is pretty much certain to lose provided I line my ducks up properly. What's worse, they'll probably end up paying for my target practice.

You said your side business enabled you to keep a roof over your head when $dayjob let you down? In other words, your side business saved you from financial ruin? So there's a hell of a lot of historic upside - it's proven it can save you from financial ruin - TWICE. Don't you think a current score of 2-nil says it's worth the risk?

Or are you - like a lot of people actually - so focussed on the pessimistic downside that you'll actually happily ruin your life trying to avoid a possible imaginary disaster? What a STUPID waste.

I can't say there's no risk. But as it says in the Flanders & Swann song ... "here in a nuclear testing ground, is no place to bury your ..." Live for NOW, write PROPERLY DESIGNED software, and try and keep your nose clean. Unfortunately, we all have no control over being dragged into the legal sausage machine, so don't worry about what MIGHT happen (other than to watch out for really stupid laws).

Cheers,
Wol

The European Cyber Resilience Act

Posted Sep 25, 2023 20:55 UTC (Mon) by pizza (subscriber, #46) [Link] (2 responses)

> And are you sure you don't want things to change? Because that could kill MS' pre-installed monopoly just like that, so that actually could be a good thing.

I don't consider it a good thing if it also destroys the ability for F/OSS projects to accept any sort of commercial funding.

>And that's different to the current situation how?

In my documentation, I explicitly state: "This free software comes with ABSOLUTELY NO WARRANTY and is licensed under the GNU GPL (v3 or later); see the 'COPYING' file for more details." And later on, "[this software] may have half-developed features that don't quite work, with giant bugs that come out at midnight to eat your cat."

The social contract underpinning this is that I give this software away freely, but at the same time, you have no recompense if it doesn't work. The CRA changes this implicit contract so that you (and everyone else who downloads that software) *do* have grounds to demand recompense, without giving *me* some sort of proportional benefit.

> Well, Europe doesn't tend to indulge in all these nasty extra-territorial shenanigans that some other countries hint hint like to do.

I assume you've heard of the GDPR? That is explicitly extraterritorial. This will need to be as well, or it won't be worth the paper it's printed on.

(Meanwhile, my previous two employers were subsidiaries of European companies, and one involved a lot of travel to the EU. My current employer will probably be acquired by an EU company. So I'm probably not going to escape from this without ditching _something_)

> You said the CRA defines you as "offering a product" which makes you a "manufacturer". Where on earth did you find that in the CRA?

There are several blanket exemptions in the CRA for F/OSS activities, but the act of money changing hands eliminates most of them. Accepting donations to cover costs? That's fine, unless that donation is from a recurring corporate sponsor (which hits quite a few larger F/OSS projects!). Accepting funds for anything other than covering costs, e.g. a feature or bug bounty? You're now commercial. Run an actual business that provides service, support, and other such things? Bingo! It doesn't matter if the money is for software or services, if it's directly related, or even _how much_ revenue you take in that's related to that software. It only matters that you're conducting some sort of commercial activity while providing software, which makes you into a manufacturer with obligations enforced with massive punitive penalties.

The European Cyber Resilience Act

Posted Sep 25, 2023 21:59 UTC (Mon) by kleptog (subscriber, #1183) [Link]

> The CRA changes this implicit contract so that you (and everyone else who downloads that software) *do* have grounds to demand recompense, without giving *me* some sort of proportional benefit.

Eh, no. You're talking about liability, and you can only have product liability with respect to someone you have a direct commercial relationship with. Just because you receive some money from someone doesn't mean you suddenly have a commercial relationship with everyone in the world. If you don't want to provide any warranty, that's fine. Just tell them that any version other than the latest may have security bugs and they must upgrade immediately when there's a bugfix. As an open-source project this is trivial to arrange.

> That's fine, unless that donation is from a recurring corporate sponsor. (which hits quite a few larger F/OSS projects!)

The thing is, most F/OSS projects are not delivering a product to market. They're publishing source code, which isn't a product by itself.

> It doesn't matter if the money is for software or services

Of course it matters. Services are not covered by the CRA; products are. The normal approach with F/OSS is that the software is free and the services are not. So the software is delivered non-commercially as an open-source download, together with commercial services. It's probably a good idea to make that clear in your contract.

Now, the Apache Project selling OpenOffice online, that sounds like it might be a problem. Isn't that supposed to be super buggy?

The European Cyber Resilience Act

Posted Sep 25, 2023 21:59 UTC (Mon) by Wol (subscriber, #4433) [Link]

> > Well, Europe doesn't tend to indulge in all these nasty extra-territorial shenanigans that some other countries hint hint like to do.

> I assume you've heard of the GPDR? That is explcitly extra-terroritoral. This will need to be as well, or it won't be worth the paper it's printed on.

Extra-territorial in what sense? How is the EU going to prosecute an American company, or American people? They can prosecute them if they are based in the EU. And it's the EXPORT of the data that matters. (Which is why and how American companies get clobbered, if they export the data of the European employees / customers.) And even for American companies like Google, they are operating in the EU, therefore they have to abide by European law.

In order to break the law, you HAVE to be in possession of personal data belonging to Europeans.

It's like some British sexual abuse laws - they apply to all Brits, no matter where the offence took place. But you need that connection with Britain.

This is very unlike America, where they will extradite Brits for doing stuff - in Britain - that is perfectly legal under British law.

> (Meanwhile, my previous two employers were subsidiaries of European companies, and one involved a lot of travel to the EU. My current employer will probably be acquired by an EU company. So I'm probably not going to escape from this without ditching _something_)

> > You said the CRA defines you as "offering a product" which makes you a "manufacturer". Where on earth did you find that in the CRA?

> There are several blanket exemptions in the CRA for F/OSS activities, but the act of money changing hands eliminates most of them. Accepting donations to cover costs? That's fine, unless that donation is from a recurring corporate sponsor. (which hits quite a few larger F/OSS projects!) Accepting funds for anything other than covering costs, eg a feature or bug bounty? You're now commercial.

> Run an actual business that provides servive, support, and other such things? Bingo! It doesn't matter if the money is for software or services, if it's directly related, or even _how much_ revenue you take in that's related to that software. It only matters that you're conducting some sort of commercial activities while providing software, which makes you into a manufacturer with obligations enforced with massive punitive penalties.

But here you are taking *payment*, YOU HAVE A CONTRACT, because there is a mutual exchange of money for agreed benefits. Whereas if you were taking donations, where on earth does it say that's a commercial relationship? If there are no goods or services agreed going in the other direction, how on earth are they going to pin anything on you?

And you STILL haven't pointed at where - in the draft CRA - all this stuff is. Is it because it isn't actually there?

I still can't believe the CRA is trying to redefine the meaning of "commercial transaction" or "offering a product". And if it isn't, then most of what you're worried about is scaremongering.

Cheers,
Wol

The European Cyber Resilience Act

Posted Sep 20, 2023 12:40 UTC (Wed) by pizza (subscriber, #46) [Link]

> My concern with this requirement of extra bureaucracy inflicted on developers is that it will further speed up the migration from distributed software to hosted software, and software will further be replaced by centralized services such as google docs and so on, so that there's no vendor anymore and no notification to be done.

This will be the most likely outcome, along with completely gutting F/OSS creation and general purpose computing in the EU.

This will increase software (and thus product) development costs by at least an order of magnitude, and will also effectively create a giant step from purely-as-a-hobby-with-no-obligations to manufacturer-under-the-CRA-with-full-obligations, with no middle ground.

The European Cyber Resilience Act

Posted Sep 20, 2023 15:19 UTC (Wed) by shemminger (subscriber, #5739) [Link]

Looks like one of those problems where "if the only tool you have is a hammer, every problem looks like a nail". The EU has one tool: more bureaucracy.

I remember when ISO9000 was going to solve all software quality problems through more process. And see how well that worked.

There is a real problem here, but this is not a solution.

The European Cyber Resilience Act

Posted Sep 20, 2023 21:56 UTC (Wed) by kleptog (subscriber, #1183) [Link] (7 responses)

Hosted software has to comply with the NIS (and now the NIS2) Directive which has similar goals to the CRA but for services rather than products. The world did not end when that was introduced.

Notifications only apply to security issues found in deployed products, so not every bug needs notification. This actually ties into the discussion that was here recently about when CVEs should be allocated. This Act cannot solve this problem (nor does it try). If you want to reduce the impact of this Act, then it would be a good idea to stop assigning CVEs for issues that aren't important.

The European Cyber Resilience Act

Posted Sep 21, 2023 2:25 UTC (Thu) by wtarreau (subscriber, #51152) [Link] (6 responses)

> Notifications only apply to security issues found in deployed products, so not every bug needs notification.

The problem is that it's not the developer's job to try to figure out whether a bug may have security ramifications or not. The developer makes a bug and fixes a bug.

> If you want to reduce the impact of this Act, then it would be a good idea to stop assigning CVEs for issues that aren't important.

That's one of the likely outcomes: everyone will strongly refuse to assign any CVE, on the grounds that "no, it's not a security problem; in the worst case it's a misuse", at least to try to escape the CRA.

The European Cyber Resilience Act

Posted Sep 21, 2023 11:35 UTC (Thu) by kleptog (subscriber, #1183) [Link]

> The problem is that it's not the developer's job to try to figure out whether a bug may have security ramifications or not. The developer makes a bug and fixes a bug.

Correct, which is why notifications are only required for vulnerabilities that are known to be actively exploited. Potential vulnerabilities are of no interest here because, as you point out, you just need to push the bug fix out to people in a reasonable time frame and spend no more time on it. The goal here is to prevent suppliers from sitting on information about actively exploited vulnerabilities. You can voluntarily notify about other things your customers may be interested in. I don't see how this is a big change compared to what people are doing now.

The European Cyber Resilience Act

Posted Sep 22, 2023 5:10 UTC (Fri) by ebee_matteo (subscriber, #165284) [Link] (4 responses)

> The problem is that it's not the developer's job to try to figure out whether a bug may have security ramifications or not.

I strongly disagree and I think this is at the root of the problem.

The CRA is written *exactly* because we see a severe uptick in exploitable vulnerabilities: developers do not care enough about security as part of project development.

In turn these are widespread enough that they are very juicy for a state actor or rogue terrorist.

The spirit of the CRA is: you *need* to care about this. If you do not directly have the resources as a hobbyist, at least the companies commercializing those products need to allocate proper resources and FTEs, or face the consequences.

The free ride has ended.

You do not like this? Keep unreviewed software out of commercial products that can have repercussions on the lives of millions of people.

Unfortunately it is currently very visible to people in the cybersecurity area how open source is becoming an easy target for a plethora of attacks, especially socially engineered and supply-chain related.

Big companies certainly have resources to foot the bill for code reviews and the paperwork needed.

Now, the language in the CRA needs fixing, I agree. But the spirit is correct.

The European Cyber Resilience Act

Posted Sep 22, 2023 21:00 UTC (Fri) by Wol (subscriber, #4433) [Link] (3 responses)

> > The problem is that it's not the developer's job to try to figure out whether a bug may have security ramifications or not.

> I strongly disagree and I think this is at the root of the problem.

I'd put it rather differently, but yes, it's because developers don't do their job properly.

> The CRA is written *exactly* because we see a severe uptick in exploitable vulnerabilities: developers do not care enough about security as part of project development.

> In turn these are widespread enough that they are very juicy for a state actor or rogue terrorist.

> The spirit of the CRA is: you *need* to care about this. If you do not directly have the resources as a hobbyist, at least the companies commercializing those products need to allocate proper resources and FTEs, or face the consequences.

If you cared about doing a good job, a lot of this grief would just go away. What's the quote? "If they built buildings like we build software, the first woodpecker to come along would destroy civilisation"? Most software is held together with duct tape, string, and sealing wax.

I'm very much with Linus here - "a bug is a bug is a bug. It should be fixed". Security considerations are secondary. But as I see it, there are two problems ... and I'm with him with his other quote too - "the best programmers are lazy programmers, they get it right first time because they can't be bothered to do it twice".

Firstly, most software today is not designed. It's thrown together, it works, and that's that. Then it gets extended and extended and the parts don't work together. Etc etc.

The second thing is, a heck of a lot of our tools suffer from the same problem. C is a massive offender, with all its undefined behaviour. Landmines everywhere. I've just been fighting conditional formatting with VBA. Badly documented, things blowing up when you think they should work, things that only make sense AFTER you've debugged the problem, ...

Again, what's that saying? "Ten minutes extra in the design phase knocks an hour off the debugging". I'm probably loved and hated simultaneously at work, because I spend so much time fixing technical debt even in the middle of a fire fight.

That's why I hate Word. That's why I hate Relational/SQL. I've worked with programs that have a clear, simple design. Things "just work". Everything I do, I try to step back and have a clear design behind it. Even if I do a half-baked implementation, so long as the design is simple, clear, AND WELL COMMENTED IN THE CODE, things are far less likely to break. If somebody tries to do something I haven't implemented, they should crash into an error message that says "not implemented, please file a bug report". They shouldn't crash into an undefined, unanticipated state that causes all sorts of grief.

How much effort is it to check a variable that says "these are the states I know about and can handle. Anything else, raise an error"? Okay, if the previous guy didn't do it you're probably into a world of hurt. But if all your code does it, you're not going to be responsible for some obscure security problem because you didn't do your job (if that other guy's code drops you in it, well sorry ...)
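To make that concrete, here is a minimal sketch of that kind of exhaustive state check, in Python; the state names and messages are hypothetical, invented for the example:

    # Minimal sketch (hypothetical states): reject anything the code
    # does not explicitly know how to handle, rather than drifting into
    # an undefined, unanticipated state.
    KNOWN_STATES = {"idle", "running", "stopping"}

    def handle(state: str) -> None:
        if state not in KNOWN_STATES:
            # Fail loudly and early, exactly as suggested above.
            raise NotImplementedError(
                f"state {state!r} not implemented, please file a bug report")
        if state == "idle":
            print("waiting for work")
        elif state == "running":
            print("processing")
        else:  # "stopping"
            print("shutting down cleanly")

    handle("running")          # fine
    try:
        handle("hibernating")  # unknown state: caught by the check
    except NotImplementedError as e:
        print(e)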

Cheers,
Wol

The European Cyber Resilience Act

Posted Sep 25, 2023 2:33 UTC (Mon) by wtarreau (subscriber, #51152) [Link] (2 responses)

> I'm very much with Linus here - "a bug is a bug is a bug. It should be fixed". Security considerations are secondary. But as I see it, there are two problems ... and I'm with him with his other quote too - "the best programmers are lazy programmers, they get it right first time because they can't be bothered to do it twice".
> Firstly, most software today is not designed. It's thrown together, it works, and that's that. Then it gets extended and extended and the parts don't work together. Etc etc.

That's exactly my point. Figuring out how a bug might be used as part of an exploitation chain requires a different mindset and set of skills. Many developers will not see how a missing line feed at the end of an error message might constitute a vulnerability - it's just a cosmetic error - but some upper-layer wrapper might rely on that specific delimiter and might get confused enough to be fooled. That's what I mean by "it's not the developer's job". The developer cannot know *all* possible use cases and integrations of their creation. Of course, writing a blatant buffer overflow does have immediately visible security implications, but a lot of security issues nowadays stem from a combination of small escalations, sometimes from different components.
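As an illustration of how such a cosmetic error can bite, here is a small Python sketch; the newline-delimited protocol and record format are invented for the example. The parser itself is correct, but a single missing line feed from a lower layer silently fuses two records:

    # Hypothetical protocol: each record is "LEVEL:message\n".
    def parse_records(stream: bytes) -> list[tuple[bytes, bytes]]:
        records = []
        for line in stream.split(b"\n"):
            if not line:
                continue
            level, _, message = line.partition(b":")
            records.append((level, message))
        return records

    good = b"ERROR:disk failure\nINFO:retrying\n"
    bad = b"ERROR:disk failureINFO:retrying\n"  # one missing '\n'

    print(parse_records(good))  # two records, as intended
    print(parse_records(bad))   # one fused record: the INFO entry vanishes
    # If the fused field later feeds a logging or access-control
    # decision, an attacker who can influence the message text gets to
    # smuggle content into the next record - the start of a chain.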

The European Cyber Resilience Act

Posted Sep 25, 2023 7:32 UTC (Mon) by Wol (subscriber, #4433) [Link]

> That's exactly my point. Figuring out how a bug might be used as part of an exploitation chain requires a different mindset and set of skills. Many developers will not see how a missing line feed at the end of an error message might constitute a vulnerability - it's just a cosmetic error - but some upper-layer wrapper might rely on that specific delimiter and might get confused enough to be fooled.

But this is what really pisses me off (not only with developers, but ...)

If the developer is not knowledgeable enough (I carefully didn't say "skilled") to know that there is SUPPOSED to be a line feed, this is a failure on oh so many levels. A badly designed tool (the compiler?), an inappropriate language (okay that's often management), a developer satisfied with "oh it appears to work", an analyst who didn't analyse the job properly ... the list goes on.

At the end of the day we should be aiming to do the best job we can, and that does NOT mean patching broken tactics on top of broken tactics. Banging on about Pick again, but it's noticeable with Pick that good software tends to be implemented by teams, with domain experts specifying the problem and helping in the programming, while the computer experts help in the specification and make sure the programming is done properly.

At the end of the day, far too many problems are caused by "ivory tower syndrome" - epitomised by the typical split into analysts and programmers, where the analysts understand neither the subject nor programming, and the programmers implement the spec they're given. A recipe for disaster. I hope most of us don't work in those sorts of environments, but it's those embedded attitudes that are responsible for a lot of the problem. I guess that can be summed up as "bureaucracy" :-)

Cheers,
Wol

The European Cyber Resilience Act

Posted Sep 25, 2023 10:17 UTC (Mon) by farnz (subscriber, #17727) [Link]

Right, and nothing about the CRA stops you from declaring that all commits (not just known bugfixes, but all commits, whether they're known to be bugfixes or not) are security relevant; if you do this, then your downstreams have to either keep up with development, or accept the risk of being liable for bugs in your software if you've fixed one and they haven't taken that patch. The whole thing only becomes an issue if you are (a) providing software in a commercial context, (b) not wanting to force your customers to upgrade all the time, and (c) not wanting to be liable for security-relevant bugs in the software; at that point, the CRA forces you to choose which of those three you drop.

The European Cyber Resilience Act

Posted Sep 20, 2023 6:28 UTC (Wed) by vegard (subscriber, #52330) [Link] (1 responses)

I wonder if the verbiage "actively exploited vulnerability" includes exploits shared privately with a company. In this case, it is technically not known to be exploited in the wild, but an exploit clearly does exist.

The European Cyber Resilience Act

Posted Sep 22, 2023 5:15 UTC (Fri) by ebee_matteo (subscriber, #165284) [Link]

I think the idea is that a private report only shows the vulnerability is exploitable; it has not yet been used to carry out nefarious purposes.

"Exploited" is for when an attacker is already using it for their gain.

The European Cyber Resilience Act

Posted Sep 20, 2023 7:13 UTC (Wed) by nim-nim (subscriber, #34454) [Link]

Given how pervasive software is today, it's a nice thing that the authorities are finally trying to clarify the obligations of each stakeholder.

Of course the initial drafts are going to be a mess; the legal side of software has been rotting for too long. However, it is much better to write down that mess and try to put it into shape than to dump all the misunderstandings and ambiguities on individual judges each time some bit of software is involved in loss of life or property, and hope those individual judges will magically determine the right path no one trod before.

Note that, regardless of the decision to write the Cyber Resilience Act or not, the misunderstandings, ambiguities, and FUD-ing exist today in real life and are one of the major reasons funding never seems to reach the people writing the software that prosperous commercial entities depend on.

Ultimately, if you want to depend on software (in the cyber sense or not), you need to fund the people able to fix it. And not pretend some fairies will write the fix for free every time you hit a problem.

The European Cyber Resilience Act

Posted Sep 20, 2023 11:02 UTC (Wed) by jezuch (subscriber, #52988) [Link]

FOSS needs lobbyists with bags of money.

But just bags of money would be nice too.

Anyway, we're focusing on potentially damaging impact on FOSS, but let's not forget the flipside. For example, we don't want to water down the language so much that it's trivial for companies to avoid scrutiny by falsely claiming to be doing open source. Also, as someone already mentioned, we definitely do want to increase costs for those who don't care (in order to make them care). It's the old problem of externalities, which is as old as capitalism (and probably older). The equivalent from the "old" world is a polluter who dumps waste into a community's drinking water reservoirs. This will not get fixed by markets, it can only be fixed by regulation. Yes, it increases the cost of doing business. But the health of the community needs to be more important than your business model.

Placing on the market

Posted Sep 20, 2023 12:07 UTC (Wed) by Wol (subscriber, #4433) [Link] (7 responses)

To deal with a lot of the problems, I seem to remember something about "placing on the market".

Simply put, root liability should lie with any organisation "placing goods or services on the market". This very clearly and explicitly (to my understanding) would exclude any and all developers (big, small, commercial, whatever) who effectively do a code dump and say "here's some open source code, make of it what you will". Placing on the market means you need a product, which you advertise, and presumably charge for (I don't think you have to charge, you could be giving it away in an advertising campaign, but there does have to be a clear exchange of value somewhere).

So let's say I made a router, and plonked OpenWRT on it. As soon as I advertise that router and sell it, or say "free with my ISP service", or whatever, I am accepting responsibility for that router (including the software on it). I need to get that software certified, whether I do it myself, or pay OpenWRT to do it, or employ some third party to do it.

But the crucial point needs to be that if I just download OpenWRT off the internet, NO LIABILITY transfers to the OpenWRT project. They've made a code dump available to all and sundry, and liability rests firmly with the downloader. Only if I come to some commercial arrangement with OpenWRT will liability transfer to OpenWRT (under the contract).

This will also (I think) cover Farnz's case of providing hardware and telling your customer to "download the free software off the internet". If my router only works with OpenWRT, I'm providing a product that is "not fit for purpose as supplied". Which means I'm still on the hook for any problems the customer may have.

Cheers,
Wol

Placing on the market

Posted Sep 20, 2023 12:26 UTC (Wed) by pizza (subscriber, #46) [Link] (4 responses)

> But the crucial point needs to be that if I just download OpenWRT off the internet, NO LIABILITY transfers to the OpenWRT project.

So, if you download OpenWRT and plonk it onto a router, is the original manufacturer liable, OpenWRT liable, or are you personally liable for a security flaw on the unit that gets exploited to attack $gov_facility?

It stands to reason that the manufacturer should be absolved here, unless by virtue of "allowing modifications" they then become complicitly liable. If you follow that line of reasoning, then everything is going to be heavily locked down and protected by go-directly-to-jail DRM, effectively ending general-purpose computing. It also stands to reason that OpenWRT should face _some_ sort of obligation/liability should they ship a dangerous bug that can be exploited. Or a router maker could just ship a software-less box and say "get your software from somewhere else" to avoid liability. And if the liability instead falls to the user because they're the one who "modified someone else's product", then what's to stop the hardware maker from saying "the user chose to install this software, they're responsible, not me!"?

This is why this is such a thorny problem. How do you craft an exception for F/OSS activities in a way that doesn't absolve commercial players, when the definition of "commercial" is so broad that it essentially encompasses anything that's not done on a purely "disorganized volunteer that gets ZERO funding, not even banner ads on a web page" basis?

Placing on the market

Posted Sep 20, 2023 13:53 UTC (Wed) by Wol (subscriber, #4433) [Link] (3 responses)

> > But the crucial point needs to be that if I just download OpenWRT off the internet, NO LIABILITY transfers to the OpenWRT project.

> So, if you download OpenWRT and plonk it onto a router, is the original manufacturer liable, OpenWRT liable, or are you personally liable for a security flaw on the unit that gets exploited to attack $gov_facility?

$gov_facility is liable for not properly securing their site ...

My router is quite clearly NOT a "product placed on the market". I over-wrote the supplied firmware with OpenWRT. The manufacturer of the commercial router cannot be held liable because I mod'd it.

Likewise, OpenWRT just placed their software on the internet for anyone to download. I downloaded it, installed it, configured it, and quite possibly (probably?) messed up. OpenWRT can't be held liable unless I *bought* a pre-configured setup off them.

$govt may not like the fact that they are both liable, and the victim, but if the rest of us have to suffer that on a daily basis, why not them? What needs to be driven home to the legislators is that your vision of a world of locked down hardware is a dystopian cure even worse than the disease.

If manufacturers are forced to clean up the dystopian world of the "internet of things", the background security level will rise sharply.

Cheers,
Wol

Placing on the market

Posted Sep 20, 2023 14:19 UTC (Wed) by pizza (subscriber, #46) [Link] (2 responses)

> $gov_facility is liable for not properly securing their site ...

Ah yes, they should face liability because they only paid for a 10Gbps connection instead of a 1Tbps connection capable of handling a DDoS.

> My router is quite clearly NOT a "product placed on the market". I over-wrote the supplied firmware with OpenWRT. The manufacturer of the commercial router cannot be held liable because I mod'd it.

So in other words, OpenWRT is not liable, the manufacturer of the equipment is not liable... which leaves you. But since you didn't "place this product on the market" you're not liable for any damage it causes either.

That seems... quite wrong.

What if it's not "you personally" but "you as the owner of a small business whose employee over-wrote that firmware and placed it into service with no access controls?"

> If manufacturers are forced to clean up the dystopian world of the "internet of things", the background security level will rise sharply.

Doing so by effectively outlawing general purpose computing and independent software development sounds like a pretty dystopian outcome to me.

Placing on the market

Posted Sep 20, 2023 16:00 UTC (Wed) by ebee_matteo (subscriber, #165284) [Link]

> OpenWRT is not liable, the manufacturer of the equipment is not liable... which leaves you. But since you didn't "place this product on the market" you're not liable for any damage it causes either.

No. You are privately liable under European (or specific EU-country) law for failing to properly secure your devices.

It's just not due to the CRA. It's law already in place.

Placing on the market

Posted Sep 20, 2023 16:02 UTC (Wed) by Wol (subscriber, #4433) [Link]

> > My router is quite clearly NOT a "product placed on the market". I over-wrote the supplied firmware with OpenWRT. The manufacturer of the commercial router cannot be held liable because I mod'd it.
> So in other words, OpenWRT is not liable, the manufacturer of the equipment is not liable... which leaves you. But since you didn't "place this product on the market" you're not liable for any damage it causes either.

> That seems... quite wrong.

I agree with you. BUT. How different is it to anywhere else? Not at all, as far as I can tell. What if I buy a bunch of aftermarket car mods, rally-fix my car so it's totally illegal, and go road-racing with my mates?

It only needs a fatal smash and a bunch of innocent bystanders are left with no recourse because at best I'm a man of straw not worth suing - at worst I'm dead too (many people would think that was for the best! :-) and there isn't anyone to sue. Tough. That's life. And death.

> > If manufacturers are forced to clean up the dystopian world of the "internet of things", the background security level will rise sharply.

> Doing so by effectively outlawing general purpose computing and independent software development sounds like a pretty dystopian outcome to me.

And the alternative is?

It's a "damned if you do, damned if you don't" world out there.

Stuff needs to be field-updateable. Stuff needs to have the code available for audit.

I'd like to see something along the lines of "If you lock it down you have to implement a kill switch. If a serious bug is found you implement a fix, and then you trip the kill switch. If you CAN'T implement a fix, then you trip the kill switch anyway! And unless the product is end-of-life, tripping the kill switch is a warranty failure". Although if the customer didn't apply the fix, you do get to charge them for the privilege of you doing it for them, and they get a "pay for" upgrade rather than a replacement piece of kit.

Let's take routers for example. How difficult would it be for - Netgear, let's say - to sponsor an engineer to certify OpenWRT on their hardware? (That is, Netgear employs the engineer to say "yes, I've audited it". He's not personally liable for it.) Another couple of manufacturers chip in, so they set up a compliance-testing trade association - bear in mind that sort of organisation is not allowed to be choosy about membership. So the engineers are busy fixing and improving OpenWRT, and providing value to the association members in the form of certification! Not a member? No certification!

Basically, if you provide a product, you should be responsible for making sure that (a) it works as advertised, and (b) customers and bystanders are not hurt by it working as designed. Like in the automotive industry, you should not be held responsible for unauthorised modifications, but you are responsible for the safety of the product you supplied.

And this is what I meant about the dystopian Internet World of Things, where security etc is an afterthought, if it's even a thought at all. People get hurt by commercial products acting "as designed" (as in, design consists of just throwing components together, compatible or not who cares.)

Cheers,
Wol

Placing on the market

Posted Sep 20, 2023 14:31 UTC (Wed) by nim-nim (subscriber, #34454) [Link] (1 responses)

I think part of the concern on the EU side is entities trying to avoid liabilities by allocating their devs to a shell organisation, buying the shell organisation's product for a token value, and then reselling the result for *big* *money* (while pretending they are only liable for the token value paid to the shell company).

They need to nail down the language to avoid this kind of shenanigan while avoiding hurting innocent bystanders.

And I don't think anyone here is surprised that they suspect the Googles, Amazons, Microsofts, IBMs, etc. will attempt this kind of shell game.

Placing on the market

Posted Sep 25, 2023 16:13 UTC (Mon) by Wol (subscriber, #4433) [Link]

But that's easy. If you are selling the product you are responsible to your customers.

If your supplier goes bust, that's no excuse!

It's the same as all these companies supplying TVs with GPL'd firmware whose suppliers didn't give them the software. The US may be rubbish at enforcing it, but in the EU it only takes a company getting dinged twice for repeatedly importing products they are unable to comply with the licence for, and the typical EU remedy would be "next time you import a product, you have to prove, on import, that you have the software you need to comply with your obligations".

I gather they did manage to get that through in the US, but it would have been so much easier here - "repeat offender? You need to PROVE you're compliant before you can resume business".

Cheers,
Wol

The European Cyber Resilience Act

Posted Sep 20, 2023 12:39 UTC (Wed) by karim (subscriber, #114) [Link]

Wait, am I reading that some bureaucrats have discovered that the solution to some life-problem is more pervasive use of bureaucracy? Paint me surprised.

++ on people pointing out that:
- Companies will likely skip EU for first releases
- There will be companies whose business model will be CRA compliance
- Some OSS projects will likely exclude the EU
- This will further push centralization of services (SaaS)

And, yes, this will likely weed out "cheap electronics sold on Amazon", but it'll also kill any startup playing in this space. But what does the EU know about having a vibrant startup ecosystem anyway? :P

The European Cyber Resilience Act

Posted Sep 20, 2023 20:15 UTC (Wed) by ringerc (subscriber, #3071) [Link] (13 responses)

It's pretty clear that much of the ambiguity in the proposal is there to prevent companies from evading the law using faux-open-source "projects", open-core models etc.

Hence the provisions about devs paid mainly by one company, ability to control which changes are accepted etc.

They want to stop HackMe.example.com from putting their product at arm's length from legal obligations while retaining control, most of the commercial benefits, and all the secret sauce kept in proprietary plugins.

This is a laudable goal.

Achieving it without too much collateral impact on other projects is the hard bit.

PostgreSQL, for example, is a genuinely independent open-source project. EnterpriseDB currently employs a significant proportion of the most active devs, committers, and core team (steering group) members. But this has not always been the case and may cease to be the case in future. EDB doesn't control postgres and regularly fails to get changes it wants into the product because others don't see the wider value in them, don't think they meet PostgreSQL's high quality bar, or don't think they're worth the ongoing maintenance required. EDB benefits commercially from PostgreSQL and contributes to it. Where does that place the PostgreSQL product in terms of these requirements - and where should it?

It's reasonable to expect EDB to comply with these requirements as a software vendor. But it's not especially reasonable for them to transitively apply to the whole PostgreSQL project and all its other contributors just because there's a major commercial stakeholder. The overhead and administrative burden could easily fall on all contributors and impact the project as a whole. As well as being a discouraging drag on contribution, it could give big stakeholders a greater lever for control over open-source projects, because they may become embedded in mandatory compliance processes imposed on the project to make releases or even accept changes.

I don't have any good answers here. It's a genuinely difficult balance and a blurry line. I don't want the "mycompany foundation" open core bait-and-switch-ware of the world to use this to evade what are some reasonable and justifiable expectations on quality and responsibility for the product. But I don't want to see that harm healthy projects with commercial contribution and co-operation either.

The European Cyber Resilience Act

Posted Sep 20, 2023 21:09 UTC (Wed) by pizza (subscriber, #46) [Link]

> I don't want the "mycompany foundation" open core bait-and-switch-ware of the world to use this to evade what are some reasonable and justifiable expectations on quality and responsibility for the product. But I don't want to see that harm healthy projects with commercial contribution and co-operation either.

Exactly. And it's worth mentioning that "healthy projects with commercial contribution" encompasses single-developer projects that have accepted any sort of payment from an EU entity.

The European Cyber Resilience Act

Posted Sep 20, 2023 21:39 UTC (Wed) by kleptog (subscriber, #1183) [Link] (10 responses)

> PostgreSQL, for example, is a genuinely independent open-source project. EnterpriseDB currently employs a significant proportion of the most active devs, committers, and core team (steering group) members.

Not a bad example actually. The obvious answer is that EnterpriseDB is responsible for what it sells. If there's a security issue that needs to be patched, EnterpriseDB can apply it to the version they ship. That they may not get the fix into the mainline PostgreSQL release is neither here nor there.

What responsibility does the PostgreSQL project have? Strictly speaking, none. But the people distributing PostgreSQL for money have an interest in working together with the project to make this work. In a sense this happens already, so I expect this law to formalise that process a bit more (if necessary).

> It's pretty clear that much of the ambiguity in the proposal is there to prevent companies from evading the law using faux-open-source "projects", open-core models etc.

One of the differences between US and EU regulation is that the US tends to formulate lots of precise rules, whereas EU regulation tends to be higher level, leaving room for regulators to apply the intent of the regulation to specific cases. For example, the Italian regulators banned subprime mortgages early, despite there being no explicit rule against them, on the basis that they looked like a bad idea. If people are expecting the CRA to provide detailed rules about who is or is not targeted, they're going to be disappointed. That's not the way we roll.

If you're worried this might lead to selective enforcement, the flip side of this is that if a regulator targets a single open-source project, the fact they're not also going after every project with the same issues is actually a defence. So all you need to do is do a better job than the average commercial product (which isn't hard) and you've got nothing to worry about.

I think the discussions about this all being a huge amount of work for open-source projects are exaggerated. Nothing in the Act suggests things companies shouldn't be doing already. If you're deploying a product in 2023 without even doing minimal cybersecurity checks, you ought to be shot. For a lot of open-source software there are standard GitHub CI/CD pipelines and bots which check for security issues. Tools like Coverity check lots of open-source projects for free. I honestly think the average popular open-source project is in a much better state than most proprietary software, and can trivially prove it too.

The European Cyber Resilience Act

Posted Sep 21, 2023 2:37 UTC (Thu) by wtarreau (subscriber, #51152) [Link] (9 responses)

> If there's a security issue that needs to be patched, EnterpriseDB can apply it to the version they ship.

Sure, but there's a huge difference between "having to apply a patch" and "spending one month filling in a stupid paper form describing the possible impacts of the issue and its remediation". A bug needs to be fixed, period. No need to add bureaucracy that makes development halt after the first bug and become impossible to restart.

They could put the effort on the bug reporter, for example: make it possible for a bug reporter whose report has been ignored to fill in that form, so that the EU can ask the company whether it really poses a security threat and why it's not fixed; and if the company doesn't respond in a few weeks/months, deny it the right to sell the product in the EU until it responds. That would be more effective and would limit the amount of bureaucracy inflicted on those who are already busy trying to fix the problem.

The European Cyber Resilience Act

Posted Sep 21, 2023 11:47 UTC (Thu) by kleptog (subscriber, #1183) [Link] (8 responses)

Come on, paper forms have been practically dead for a while now. I've signed two physical forms in the last decade; everything is online these days.

And if your product has an actively exploited vulnerability that's causing actual damage, a simple email to ENISA telling them about it and how to mitigate it is the absolute least you can do. You don't have to notify them for every bug; that would be silly (and they'll probably tell you off if you do).

If the bug reporter includes a working exploit, it's worth notifying about ASAP. Otherwise, you can probably just fix it and move on.

The European Cyber Resilience Act

Posted Sep 21, 2023 12:53 UTC (Thu) by farnz (subscriber, #17727) [Link] (3 responses)

Every interaction I've had with the IRS in the USA has involved paper forms, which can either be sent by "certified international mail" (whatever local service turns into certified USPS mail in the USA - in my case, "International Tracked & Signed" from Royal Mail is the relevant service) or faxed. They will not accept e-mailed copies.

But my understanding of EU law is that EU governments can't do this - if they want paper copies of a form, they must be willing to print e-mailed versions out.

The European Cyber Resilience Act

Posted Sep 25, 2023 16:20 UTC (Mon) by Wol (subscriber, #4433) [Link] (2 responses)

The problem in Europe is that all too often the government will no longer accept paper forms.

I (as of this year) now have to fill in a tax return. Most European citizens don't - PAYE has removed that burden. I signed up for paper forms (web forms are far too often the work of the devil, aka junior idiots who can't think straight and design things that are a nightmare / impossible to complete properly).

So, when my first return was due, I got an email telling me "You need to fill in the form online, we've scrapped paper". ARGGHHHH. I DON'T WANT ONLINE!!!

Cheers,
Wol

The European Cyber Resilience Act

Posted Feb 13, 2024 17:40 UTC (Tue) by nix (subscriber, #2304) [Link] (1 responses)

I'm very late, but you should note (if you're still reading this) that HMRC's online tax reporting systems are *lovely*. They do nearly all the work for you; every box is linked to help telling you in pretty clear terms what the heck it's for (and if you still don't understand the terminology, you can pop open another tab and google for it); you can usually skip nearly all the boxes, and it's usually obvious which; most of them are auto-skipped for you based on the general properties of where you get income from and never appear at all; and if you make mistakes, there is a series of summaries you are forced to see which makes it obvious you screwed up. And you can go back and change it repeatedly until the filing deadline.

IMHO in all ways the online reporting system is far preferable to the physical forms iff you can use it at all (not everyone can, e.g. people with large foreign shareholdings can't, but they probably have people to do their taxes for them anyway).

(And, of course, the UK's tax filing physical forms are massively better than the horrifying nightmare the US forces everyone to use.)

The European Cyber Resilience Act

Posted Feb 13, 2024 19:47 UTC (Tue) by Wol (subscriber, #4433) [Link]

The problem is HMRC now demands everyone file online.

I recently had to sign up for filing taxes. I said I wanted paper, and even before I got my first set of forms, I got a message saying I had to file online :-(

And my experience was that they were demanding all sorts of information (that I filled in with 0s) that the banks etc. are supposed to give them anyway. It's a complete pain ...

Cheers,
Wol

The European Cyber Resilience Act

Posted Sep 22, 2023 4:15 UTC (Fri) by wtarreau (subscriber, #51152) [Link] (3 responses)

This remains what I call paper forms. Even if they're online, it doesn't mean they're suddenly quick to fill in. And actually I'd rather write a bot to inform them of every backported patch, so that I don't have to do the extra work of figuring out which ones might be relevant to them according to their own preferences.
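For what it's worth, such a bot needs very little code. Here is a minimal Python sketch; the repository path, reporting address, sender, and message format are all invented for the example, since the CRA drafts define no submission format:

    # Minimal sketch: one e-mail per backported patch. REPO, AUTHORITY,
    # and SENDER are hypothetical placeholders.
    import smtplib
    import subprocess
    from email.message import EmailMessage

    REPO = "/path/to/stable-branch"
    AUTHORITY = "vuln-reports@example.eu"   # hypothetical reporting address
    SENDER = "maintainer@example.org"

    def backported_commits(since_tag: str) -> list[str]:
        # One "<hash> <subject>" line per commit since the last release.
        out = subprocess.run(
            ["git", "-C", REPO, "log", "--oneline", f"{since_tag}..HEAD"],
            capture_output=True, text=True, check=True)
        return out.stdout.splitlines()

    def notify(commit_line: str) -> None:
        msg = EmailMessage()
        msg["From"] = SENDER
        msg["To"] = AUTHORITY
        msg["Subject"] = f"Backported fix: {commit_line}"
        msg.set_content("This patch may be security-relevant; "
                        "downstreams should apply it.")
        with smtplib.SMTP("localhost") as smtp:  # assumes a local MTA
            smtp.send_message(msg)

    for line in backported_commits("v1.2.3"):
        notify(line)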

The European Cyber Resilience Act

Posted Sep 22, 2023 9:43 UTC (Fri) by farnz (subscriber, #17727) [Link] (2 responses)

Writing a bot to inform them of each backported patch is entirely in-scope and acceptable - one e-mail per patch, and let the authorities handle it.

The only reason you might consider being a little less eager to send such mails is that by doing so, you've ensured that commercial downstreams are legally liable if they haven't applied that patch and their install of your software is compromised. On the other hand, this might be a desirable effect - it forces them to keep close to upstream, for fear of being found liable for something.

The European Cyber Resilience Act

Posted Sep 25, 2023 2:36 UTC (Mon) by wtarreau (subscriber, #51152) [Link] (1 responses)

Actually that might be a good solution to ensure that distros finally apply *all* fixes to software, instead of cherry-picking random ones that they consider important because the stupidly irrelevant CVE tag is associated with them.

The European Cyber Resilience Act

Posted Sep 25, 2023 9:40 UTC (Mon) by farnz (subscriber, #17727) [Link]

Yes, and this is intentional on the part of the CRA; one of the concerns in making you legally liable is that you need some way to say "if you don't apply the fixes I have said are critical, then I'm not liable when your house of cards falls apart". And that's what the notification mechanism is; the idea is that if your users don't like you notifying them of a need to patch every day or so, they'll find a commercial arrangement with you that makes you less eager to notify the authorities of "required" patches.

If they don't shower you with enough money to make you behave the way they want (and take on the liability that comes with that), then as far as the EU's concerned, that's their problem to deal with, either by switching to a different source of software, or by getting used to taking all your patches, not just the ones with a CVE tag, or just accepting that they are responsible for checking all of your bugfixes for security relevance, and paying the price if they erroneously deem a bugfix "not security relevant".

One of the reasons we're seeing FUD around the CRA is that if you do decide that you're going to notify every commit as a potentially security-relevant fix (which you're entitled to do under the proposals so far), the free ride comes to an end for your downstreams; they have to either take all of your commits within a short time of you making them (which results in them having to change maintenance schedules etc to support such frequent updates), or they have to deal with liability for bugs you've fixed since they took a copy in the software you give them, or they have to persuade you to stop doing that (which will almost certainly involve giving you money).

The European Cyber Resilience Act

Posted Sep 21, 2023 10:55 UTC (Thu) by nim-nim (subscriber, #34454) [Link]

> I don't have any good answers here. It's a genuinely difficult balance and a blurry line. I don't want the "mycompany
> foundation" open core bait-and-switch-ware of the world to use this to evade what are some reasonable and justifiable
> expectations on quality and responsibility for the product.

Actually a positive outcome would be to remove the need for open core bait-and-switch-ware. Because the people engaging in this could switch to just providing (paid) liability insurance, keeping the original FLOSS licensing.

Of course, this is very bad news for the freeloaders who want to consume FLOSS software without paying a dime to make sure someone will fix things in case of problems: the EU won't let them freeload and hope someone else will fix things when needed.

The European Cyber Resilience Act

Posted Sep 20, 2023 21:22 UTC (Wed) by stybla (subscriber, #64681) [Link]

Let's hope this crap never passes. Oh wait, it's EU. It will. Never mind.

The European Cyber Resilience Act

Posted Sep 21, 2023 0:47 UTC (Thu) by ch33zer (subscriber, #128505) [Link] (7 responses)

Doesn't this just make open source really hard to use in the EU? Even if the politicians draw a perfect line between commercial and non-commercial activity, companies that want to use open source probably won't, because the open-source software doesn't come with the necessary paperwork. Inevitably they'll just buy from companies selling compliant software instead.

The European Cyber Resilience Act

Posted Sep 21, 2023 2:41 UTC (Thu) by wtarreau (subscriber, #51152) [Link] (6 responses)

Yes, it will be a real mess, both to use and to create OSS.

Also I don't see how they could draw a "perfect line" between commercial and non-commercial, because this line does not exist. Sometimes some users of your code send you some money: this can change the perception of the relationship, especially if you reach a point where you can make a living from it. And the vast majority of OSS that lives more than 3 years is backed by companies who are able to pay some developers. This really risks simply ending a large number of projects, because it will be considered not worth the hassle of continuing to sell them, so development just stops.

The European Cyber Resilience Act

Posted Sep 21, 2023 9:39 UTC (Thu) by farnz (subscriber, #17727) [Link] (5 responses)

A user giving you some money does not, in and of itself, make it a commercial setting. For it to be commercial, there needs to be an offer of product or services from you conditional on the receipt of money, followed by the money coming through. So, if you say "I'll fix this bug if people give me €10,000", and I send you €10,000, you're doing something commercial. If I see a "donate here" button on haproxy's website, and donate €10,000 because I love the project, it's not commercial - the money was a gift, and this applies even if I'm a heavy user of haproxy and also ask you to fix bugs.

Similarly, this is where the carve-out for projects where no single commercial entity has control is coming from; if Google and Facebook both pay developers on a project, but neither Google nor Facebook has overall control, it's out of scope. The only reason this isn't a more general "open source" carve-out is that we want to avoid the situation where Dodgy Products Ltd open-sources all of its code under AGPLv3 (making it "open source"), but refuses to accept outside contributions, and refuses to support any builds other than its own, but escapes liability because it's "open source". On the other hand, something like haproxy where no single company has control is out-of-scope as a project; it comes in scope if you're selling it (either on its own, or as part of a product).

The European Cyber Resilience Act

Posted Sep 21, 2023 10:05 UTC (Thu) by wtarreau (subscriber, #51152) [Link] (2 responses)

But the problem is that it's neither black nor white; it's gray, in that lots of projects progress gradually over time from totally benevolent to fully commercially backed...

The European Cyber Resilience Act

Posted Sep 22, 2023 21:27 UTC (Fri) by kleptog (subscriber, #1183) [Link] (1 responses)

> But the problem is that it's neither black nor white; it's gray, in that lots of projects progress gradually over time from totally benevolent to fully commercially backed...

Actually, it is fairly black and white, because it's basically the same calculation that needs to be done to determine whether VAT is payable on the amount. If the €10,000 is paid for a patch, VAT is payable and there must be an invoice. If the €10,000 is a donation, there's no VAT. Since VAT is usually around 20% of the amount, there's a lot of case law and rules about these situations, because it matters to the tax office. And in practice it's nowhere near as grey as you think, because in every transaction the participants agree on the commerciality of the transaction as part of the agreement.

(Note: small businesses may be VAT exempt but that doesn't change the calculations. The invoice then just says "should have VAT but seller is exempt" and there's less paperwork.)
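A worked example of the distinction, as a small Python sketch (the 20% rate is illustrative; actual rates vary by member state):

    # The same EUR 10,000 differs only in how the parties characterise
    # it, but the VAT consequences diverge.
    VAT_RATE = 0.20          # illustrative; varies by member state
    amount = 10_000

    patch_vat = amount * VAT_RATE   # paid-for patch: invoice, VAT due
    donation_vat = 0.0              # donation: no supply, no VAT

    print(f"Patch payment: invoice for EUR {amount:,} "
          f"plus EUR {patch_vat:,.0f} VAT")
    print(f"Donation:      EUR {amount:,}, no invoice, no VAT")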

You can actually play around with this in some situations. Someone else made the argument that any organisation accepting money from the EU is commercial, but that all depends. If you get an EU grant for "improving the cybersecurity status of your project" that's non-specific so no VAT. If you get an EU grant for "making a secure and audited version of your project" that's a concrete deliverable and so VAT may be payable. The former wouldn't be commercial, the latter would be. The grant will include a statement clarifying which it is.

The US doesn't have VAT, so this probably sounds very weird to Americans. And possibly even to many Europeans who don't have experience with VAT administration.

The European Cyber Resilience Act

Posted Sep 23, 2023 7:12 UTC (Sat) by Wol (subscriber, #4433) [Link]

> The US doesn't have VAT, so this probably sounds very weird to Americans. And possibly even to many Europeans who don't have experience with VAT administration.

The US has Sales Tax, though, which, although it's a bit different, follows the same principle (we had the equivalent of Sales Tax before we joined the EU, hence all our "membership wholesaler"-type businesses). When we had Sales Tax, B2B transactions were exempt but B2C transactions were taxed. VAT is similar: businesses pay the taxman the difference between the VAT they charge and the VAT they pay, and the consumer can't reclaim it.

So, assuming transactions come in three Sales Tax types, i.e. payable, exempt (e.g. B2B), and non-applicable, commerciality covers the first two but not the last.

(Membership wholesaler businesses were classed as B2B, but, in order to avoid Sales Tax, had a fair few B2C-type members - e.g. charities, people who used their cards for personal business, sole traders and partnerships, and so on. Part of the reason behind VAT was to close that loophole.)

Cheers,
Wol

The European Cyber Resilience Act

Posted Sep 21, 2023 12:44 UTC (Thu) by pizza (subscriber, #46) [Link] (1 responses)

> A user giving you some money does not, in and of itself, make it a commercial setting. For it to be commercial, there needs to be an offer of product or services from you conditional on the receipt of money, followed by the money coming through.

Does it matter if they approached you first?

Because in the F/OSS context, this "offer of product or services" is little more than "I can add that feature you asked for; it's going to take approximately $hours and I charge $rate" - and if they agree, BAM, you're now an EU "manufacturer".

...And once you gain that lofty status, how do you rid yourself of it? I've not done any business with an EU entity in several years, though I'm still ostensibly "in business" in my home (non-EU) jurisdiction. In fact, earlier this week I was helping a gentleman from Germany gather the data I need to hunt down a bug, but no money is expected to change hands.

The European Cyber Resilience Act

Posted Sep 21, 2023 13:06 UTC (Thu) by farnz (subscriber, #17727) [Link]

Who approaches whom doesn't matter. It's commercial if there's an offer of products or services, followed by an exchange of money; at that point, you're in a commercial transaction, and you are liable for the products or services you've sold.

And you only have that status for products and services sold in a commercial setting. Say I accept €10,000 from you to fix a bug in Linux kernel PPPoE support that affects your ISP; that fix is done as a service (the end product is a patch that applies to a known git tree from Linus). That patch is commercial - I offered to make the fix, you paid me for it. If I then do a later patch to Linux kernel PPPoE support to fix a different bug that affects me, or that adds a feature to kernel PPPoE support, that patch is non-commercial, because I didn't have a commercial relationship with you for that patch.

It gets trickier with a follow-on patch to the one you paid me for - if I supply a second patch that fixes a bug introduced by the patch you paid me for, that patch is considered part of the service you paid for (since it's a follow-up to our previous commercial relationship). But this is something a judge should be able to resolve; was my later patch a part of our existing commercial relationship, or was it a separate non-commercial transaction? A system update from Apple to your iPhone is probably part of the existing commercial relationship you have with Apple; a new app from Apple that's not forcibly installed probably isn't.

Risks of misinformation

Posted Sep 21, 2023 7:58 UTC (Thu) by gasche (subscriber, #74946) [Link] (23 responses)

I am worried about the risk of misinformation on this matter. The upcoming legislation is trying to force entities selling software (and hardware) to spend more effort on security. This will increase the costs (obviously) and so most companies are against it. It is to be expected that they will lobby against the regulations, try to weaken their scope and impact, and try to shape the public discourse against the proposed regulations.

The open source communities are great potential allies for companies opposing the law: the public and the regulators will be more easily convinced by groups of volunteer contributors acting for the greater good than by the companies themselves. If I were a company or industry group trying to oppose the regulations, I would invest effort in convincing open-source people that it is bad for them and encouraging them to lobby against it in their own ways.

It may be that the regulation, in addition to going against the commercial interest of software sellers, is actually bad (in intent or in wording). It is certainly a reasonable idea to discuss it. But I think that we should acknowledge the risk of manipulation here and be fairly careful in our discussions, in particular:
- avoid FUD that creates uncertainty with no actual basis (it is easy to present regulations in a way that sounds very scary to people without regulatory experience; is it actually an issue in practice?)
- be very transparent about who is participating in these discussions and what their own interests are

The present article is falling way short in this respect. Does the author have previous experience dealing with how European regulations affect open source communities? We don't know. Did they get advice from FOSS groups that do so? We don't know. Was the article produced as a purely individual initiative, in the context of an existing open source project, or due to the concerns of a specific employer or sponsor? We don't know.

Note: I am not trying to suggest that there is anything nefarious hidden behind this article -- my best guess would be that it is overreacting a bit, but in good faith. I am just pointing out that there *could* be, and that there are no mechanisms of transparency in place that would give us a chance of finding out if there were.

LWN is careful to provide transparency on travel sponsoring to avoid conflicts of interest. It should have done a better job here.

Risks of misinformation

Posted Sep 21, 2023 18:08 UTC (Thu) by corbet (editor, #1) [Link] (3 responses)

Nothing was disclosed with regard to this article because there was nothing to disclose; we have paid (OK, haven't paid the bill yet) for another article from a longtime LWN contributor, and that is all there is to it. It's kind of sad that you think we might be serving some other agenda.

Risks of misinformation

Posted Sep 22, 2023 15:03 UTC (Fri) by jschrod (subscriber, #1646) [Link] (2 responses)

Well, I read the parent comment as a critique of the author, not of LWN.net.

And on first reading, I agreed with the critique - especially after the comments of kleptog and farnz, who seem to have some inside knowledge about the EU initiative.

Reading the parent comment a second time, I can understand the interpretation that caused your reaction to a possible accusation.

So, perhaps you could take it this way: besides the author's report & opinion on the topic, some longtime LWN readers who seem to be involved in the process that created this regulation proposal have commented here. It might be of interest to present their take on it.

FWIW: I'm not involved in any EU initiatives. I'm the CEO of a German IT company that creates FOSS. IOW, I'm affected by that regulation and I'm interested in serious discussion about it.

Risks of misinformation

Posted Sep 25, 2023 6:29 UTC (Mon) by mrybczyn (subscriber, #81776) [Link] (1 responses)

The author of the article here:

I'm also concerned by misinformation and lack of information on this subject. The article is based on my reading of the proposal and amendments, using only public sources (this is the general LWN policy). If you find any errors or omissions, please let me know. This is a complex matter, so they are possible.

For now, however, I see nobody pointing out any factual errors in this discussion.

I think the CRA is too important to wait until everything is set in stone. That is the reason for writing about it now. It is also why there are so many conditionals - the impact depends on standards yet to be written.

Like probably everyone in this discussion, I have a stake in the outcome. I run a consulting business concentrating on embedded open-source security. My detailed bio is easy to find online.

And finally, if you want to discuss the subject more, there are various conferences with CFPs coming up. Submitting a panel is a possibility that could be really interesting.

Risks of misinformation

Posted Sep 25, 2023 11:09 UTC (Mon) by kleptog (subscriber, #1183) [Link]

I'd like to thank you for writing this article. I don't usually associate LWN with these kinds of topics, but for this one I think it might be exactly the right audience to reach. I didn't see any factual inaccuracies, and you make the correct point that the original draft was unclear in a number of areas. The amended versions are much better, but a process of awareness-building is also needed for people to understand what is actually intended and how the whole process is supposed to work.

I think the EP committees and Council working parties are doing their level best to make this work, and nobody is trying to kill open-source software. In fact, I feel both the Council and the Parliament went out of their way to clarify the impact on open-source projects and to reduce work for small businesses. It doesn't help that much of the text is working out who is responsible for what, and what the powers and responsibilities of the Commission are, which means that it's somewhat vague on some details; the point is that they are worked out later. The EU doesn't generally write standards (it doesn't have the manpower), it adopts them from elsewhere, which is why I think it's helpful for people to get ahead in thinking about how they'd like this to work and formalise it. Otherwise I fear we'll end up with Microsoft deploying a proprietary AI model on GitHub to determine the "security health" of projects, with no support if it produces strange results. We can do better.

Thanks again. I learned a lot from this article and ensuing discussion, and I hope others did too.

Risks of misinformation

Posted Sep 22, 2023 22:14 UTC (Fri) by kleptog (subscriber, #1183) [Link] (18 responses)

> I am worried about the risk of misinformation on this matter.

I must say I agree with you. The Apples, Googles, Samsungs, Ciscos, Microsofts, etc. of this world have a vested interest in making sure they bear no liability for any software issue in products they release. And they will likely have no problem publishing a lot of FUD about this proposal. We've seen this with previous proposals. I vividly remember when Reddit pushed a big post about the Copyright Directive, whipping up a frenzy, including links to old revisions of the act and complaining about issues that had long been amended out. The readers lapped it up, and trying to fight misinformation in that thread was terribly depressing. There were legitimate issues to complain about, but they were snowed under by the misinformation.

So far I haven't seen a lot of noise on the CRA. I think companies are waiting a bit to see what the first round of amendments delivers.

But you are right that we as a community need to put serious thought into how we want this to work. We are in a better position to come up with workable solutions, and we should, because the status quo isn't working. If we can come up with a workable structure and implement it, then the regulators will fall in behind us. If we don't, we'll get something suboptimal imposed. People are working on this, but the discussion is important.

(I saw a comment about how I may have special inside knowledge about this Act. This is not so. I have, however, read a lot of EU legislation, including drafts and amendments. I've read the EU treaties and studied how they interact with Dutch law, with its 4 levels of government and 3 supreme courts, and how those interact with each other. I've learned how the EU legislative process works by discussing it with friends who study this but don't care about IT. Helping a cooperative work through the evolving regulation of the energy market has been an interesting experience. The EU is a complex machine, but it's interesting to watch how people across a massive, diverse continent can come to a consensus (or not, as some topics show). But it also has many limits.)

With respect to your point about the context behind the article, I think what I really missed was information about the author, which was the point of my first comment. Newspapers often include a short bio of guest authors, and I think it would have been helpful here. For purely technical articles it doesn't matter so much, but for these topics I think it does.

Risks of misinformation

Posted Sep 23, 2023 0:04 UTC (Sat) by pizza (subscriber, #46) [Link] (17 responses)

> The Apple, Googles, Samsungs, Ciscos, Microsofts, etc of this world have a vested interest to make sure they bear no liability for any software issue in products they release.

Sure. But there need to be clear, consistent rules, with potential liabilities and compliance costs that are *proportionate to the income derived from that software*.

> If we can come up with a workable structure and implement it, then the regulators will fall in behind us.

...This seems wildly optimistic to the point of naivety.

Especially as, no matter how you cut it, _anything_ other than "no warranty whatsoever" is going to result in mandated unpaid labor (lest they face potentially massive punitive penalties) from F/OSS authors who have nothing to do with how widely (or for what purpose) their software is deployed.

I don't care about the Googles, Apples, or whatnot. I don't even care about the Apaches, FSFs, or other F/OSS-focused organizations that already have lawyers and accountants on retainer. I care about the long tail of one-or-two-person projects for whom about the only rational option is to just delete everything and close up shop.

Risks of misinformation

Posted Sep 23, 2023 3:57 UTC (Sat) by pizza (subscriber, #46) [Link] (16 responses)

> I care about the long tail of one-or-two-person projects for whom about the only rational option is to just delete everything and close up shop.

To put this in perspective, a basic general liability and E&O insurance policy has historically cost approximately 4x my average gross FOSS-related revenue. In a world where [as-currently-drafted-]CRA-style regulations are in place, those premiums will only go up, on top of the vastly increased costs of compliance (which are likely to eat up a significant portion of the already limited time I can put into F/OSS work to begin with!)

The options I see here are to [1] cease all commercial activity (ie, shutter my side business), [2] jack my rates up by at least 4x and effectively bring about #1, and/or [3] completely cease any independent F/OSS activities, by which I mean anything that could lead to liability falling onto me. Of course, there's also [4] roll the dice with potential financial ruin.

It's one thing to take a big risk if there's a realistic possibility of a big reward, but this is all risk and no reward. So #4 is out.
#1 would mean making myself entirely dependent on $dayjob, but I've been laid off twice in the past five years, and that side business has meant I could continue to keep a roof over my family's head. That leaves #3 as the only rational choice.

Risks of misinformation

Posted Sep 23, 2023 16:06 UTC (Sat) by kleptog (subscriber, #1183) [Link] (15 responses)

> To put this in perspective, a basic general liability and E&O insurance policy has historically cost approximately 4x my average gross FOSS-related revenue

If you're talking about E&O insurance, then it would appear you are delivering *services* in which case the CRA is not relevant to you (directly anyway). My impression is that the vast majority of people working in the open-source community are delivering a service, namely writing code for money. This makes sense, because the product itself can be downloaded for free. There are some exceptions; EnterpriseDB, Red Hat, SUSE, and Hashicorp come to mind.

I think you're putting too much stress on the liability part, since that's only really an issue if you're making promises about a product you're selling (assuming you're selling one) that lead to actual safety issues. The CRA is just the framework, which defines the terms under which a CE mark can be obtained. The actual details of what that means are not yet determined, and it will be a while before any possible certification becomes mandatory. Until then you can simply sell your product without CE marking and without warranty, and you'll be fine. You might not be able to sell your product as part of some public procurement process, but I'm thinking that's not a goal for you.

The GP suggests I'm naive in thinking that the open-source community could get together and formulate the terms of what a "well-run security-conscious open-source project" looks like, terms that the bigger projects could self-certify against. I hope I'm not because the alternative is someone like ANSI/ISO defining it and building a bureaucracy around it. Or worse, FAANG doing it. It's unfortunate, because I think we do, collectively, have a good idea of what a "well-run security-conscious open-source project" looks like; we're just unwilling to write it down.

Risks of misinformation

Posted Sep 23, 2023 17:52 UTC (Sat) by Wol (subscriber, #4433) [Link] (4 responses)

> If you're talking about E&O insurance, then it would appear you are delivering *services* in which case the CRA is not relevant to you (directly anyway). My impression is that the vast majority of people working in the open-source community are delivering a service, namely writing code for money. This makes sense, because the product itself can be downloaded for free. There are some exceptions; EnterpriseDB, Red Hat, SUSE, and Hashicorp come to mind.

Which is why, if you have a decent contract, you should hopefully be able to say "I am writing code at your request, you are responsible for making sure it does what's required". Then they are responsible for getting it certified and paying you to fix it if necessary.

Which is why, for projects of any size, I bang on about trade associations. A couple of developers get together, with a couple of clients each, form a trade association which says "for our members we will fix all known problems and certify them", and then all of a sudden we might have a decent financial basis for developers to make an income - selling certification. :-)

Cheers,
Wol

Risks of misinformation

Posted Sep 23, 2023 21:05 UTC (Sat) by pizza (subscriber, #46) [Link] (3 responses)

> Which is why, if you have a decent contract, you should hopefully be able to say "I am writing code at your request, you are responsible for making sure it does what's required". Then they are responsible for getting it certified and paying you to fix it if necessary.

Sure, and then they'll push back with an "ok, but just in case there's a problem, we want you to carry a $2 million liability policy", and then you have to inform them that this policy will cost more than the project budget and that you'll have to double your rate to cover the cost.

(This has happened to me, twice)

> Which is why, for projects of any size, I bang on about trade associations. A couple of developers get together, with a couple of clients each, form a trade association which says "for our members we will fix all known problems and certify them", and then all of a sudden we might have a decent financial basis for developers to make an income - selling certification. :-)

In other words, significantly increase the barrier to entry.

Risks of misinformation

Posted Sep 23, 2023 21:17 UTC (Sat) by Wol (subscriber, #4433) [Link] (2 responses)

But at least then you've had the conversation with them that these guarantees cost money ...

But yes, I understand what you're getting at, as a sole trader this is going to be difficult (but if you've had that sort of conversation before, why will the CRA make any difference?).

Cheers,
Wol

Risks of misinformation

Posted Sep 24, 2023 0:48 UTC (Sun) by pizza (subscriber, #46) [Link] (1 responses)

> If you've had that sort of conversation before, why will the CRA make any difference?

Because under the CRA, what used to be directly billable has been turned into general overhead that folks will expect me to provide as a matter of course.

If your operation is of sufficient scale then it's not going to be that big of a deal, but my "part time consulting/support services" operation is light years from that point.

Risks of misinformation

Posted Sep 24, 2023 5:13 UTC (Sun) by wtarreau (subscriber, #51152) [Link]

That's exactly the root of the problem: people with development skills will have to stop development to spend 100% of their time on legal matters and bureaucracy. And the EU is the champion of bureaucracy. The rules need to remain simple so that one doesn't have to fear a particular interpretation of them.

The benefits of F/OSS were recognized to the point that, in the last few years, some developers saw their software land on Mars. It would not have been imagined 20 years ago that software developed by random people around the world could be critical to a space mission's success. This is thanks to the commitment of these people to delivering the highest-quality code possible without the fear of any liability or anything else: 100% of their focus was on technical excellence. I'm afraid it might be the last time we see F/OSS software on another planet. What if the probe were hacked on its way due to an overflow bug in the deployed software, and the mission ruined? In practice, any F/OSS developer would rather decline a request for help in getting their software better integrated, because they won't know whether that's going to expose them to a legal risk.

Risks of misinformation

Posted Sep 23, 2023 20:52 UTC (Sat) by pizza (subscriber, #46) [Link] (2 responses)

> If you're talking about E&O insurance, then it would appear you are delivering *services* in which case the CRA is not relevant to you (directly anyway).

(So this means RHEL is not covered by the CRA? After all, Red Hat is selling "Services", not "software?")

"directly anyway" pretty much is my entire point! By providing *services* associated with the software I also provide, I became a "manufacturer" and thus am on the hook for compliance, liability for security issues, noncompliance, etc. The "commercial activity" is the gating factor.

> I think you're putting too much stress on the liability part, since that's only really an issue if you're making promises about a product you're selling (assuming you're selling one) that leads to actual safety issues.

I use "liability" to refer to the penalties for noncompliance with the CRA -- this can be direct penalties (eg government/court-ordered fines) or indirect penalties (the "cost" of fixing issues that may arise, and/or the cost of being sued). These penalties/liabilites/etc are the kudgel that the CRA uses to achieve its goals.

And the CRA is intended to cover far more than "safety issues" -- Anything connected to the internet needs to be kept up-to-date for security issues that might result in ransomware attacks, data breaches, botnet/etc participation, and more. And sure, potential safety issues too, but there's not a lot of things that can be done with software running on a general-purpose computer that could lead to a physical safety threat.

Risks of misinformation

Posted Sep 24, 2023 5:23 UTC (Sun) by wtarreau (subscriber, #51152) [Link]

> Anything connected to the internet needs to be kept up-to-date for security issues that might result in ransomware attacks, data breaches, botnet/etc participation, and more.

Which is another problem nowadays, with few vendors providing long-term support. Even if new software versions are provided, very often the user will experience some breakage, which is the primary reason users don't apply updates. Many of us know the feeling, with an old laptop having a nice combination of a properly tuned window manager, editor, and various settings that we don't want to lose across an upgrade. I know people still using Windows XP because it just works, while newer versions completely changed the way everything is organized and they don't understand it. The result: as long as XP works, let's not touch it. That's why I consider it important that, most of the time, the hardware dies before the software, especially for devices connected to the internet. And BTW, we should avoid connecting devices that don't strictly require it. When I see the printer we have at work, which sends requests to AWS for each button you press and sends your scans there, it's unacceptable, as it exposes the internal network to possible risks on these clients. And from a consumer perspective, who knows how long it will keep working if the vendor can shut down the service whenever they want?

Risks of misinformation

Posted Sep 24, 2023 22:01 UTC (Sun) by kleptog (subscriber, #1183) [Link]

> (So this means RHEL is not covered by the CRA? After all, Red Hat is selling "Services", not "software?")

Possibly. What would you like the answer to this question to be and why? This is the sort of discussion we as a community should be having. Does it matter that the precise source code they're shipping is not available to non-customers?

Should Firefox/Chrome be held to a higher standard than OpenOffice or Gimp? How and why?

> By providing *services* associated with the software I also provide, I became a "manufacturer"

Sorry, I don't see this supported by any text, and it doesn't make sense either. A manufacturer is someone who, among other things, markets a product under a trademark they own. Whether you provide services associated with it is not relevant. What's really relevant here is that the manufacturer is the party that can fix any problems.

> And the CRA is intended to cover far more than "safety issues"

Sure, but this is confusing two things: when it comes to liability with respect to some (security) event, that's only relevant when talking about safety issues. The CRA covers many more things, but then you're only talking about non-conformity, which is something else.

I guess the thing that surprises me most about this whole discussion is that I thought one of the big things about open source is that people publishing/distributing code did so with a sense of "I made an effort to produce good code, as free of (security) bugs as I could manage". It seems that a sizable portion of the community doesn't feel this, or at least isn't willing to state it publicly. That makes me sad, but I guess it explains the dismal state of the software industry.

It's basically the "This is fine" meme, while the building burns down around you.

Risks of misinformation

Posted Sep 23, 2023 20:58 UTC (Sat) by pizza (subscriber, #46) [Link] (3 responses)

> I hope I'm not because the alternative is someone like ANSI/ISO defining it and building a bureaucracy around it.

I'd argue that creating these certification bureaucracies is one of the goals of the CRA. It doesn't matter if it's a "FOSS organization", or ISO, or whatever; they're going to have to jump through whatever hoops the EU sets up to provide certifications (on an ongoing basis), which will of course require staff and ongoing funding.

Risks of misinformation

Posted Sep 24, 2023 22:09 UTC (Sun) by kleptog (subscriber, #1183) [Link] (2 responses)

CE certifications are mostly done on the basis of self-assessment. There are products that require a third-party audit, but most don't. I would not expect software under the CRA to require an external party to assess it, since most electrical devices (ie, the computers it runs on) don't either.

Third-party certification

Posted Sep 25, 2023 16:58 UTC (Mon) by ghodgkins (subscriber, #157257) [Link] (1 responses)

> I would not expect software under the CRA to require an external party to assess it, since most electrical devices (ie, the computers it runs on) don't either.

You seem to have more knowledge about EU law than I do, but from a plain reading of the current CRA text this seems at least slightly incorrect. Section 5, chapter 3 states that manufacturers must get external certification for "critical" class II products. Class II products are listed in Annex III; the category includes routers and CPUs along with several types of software: operating systems, hypervisors, container runtimes, public-key infrastructure, and firewalls.

The question then becomes whether the product is "critical" - I believe that's a case-by-case decision made by a panel of bureaucrats. So it seems likely that the CRA will require third-party certification for at least some software, as well as the electronic devices it runs on.

Third-party certification

Posted Sep 26, 2023 8:51 UTC (Tue) by kleptog (subscriber, #1183) [Link]

I can see how what I said may have been confusing. I should have qualified "software" with "most".

The computer/laptop you're using (if you didn't build it yourself) came with a CE mark. Many parts inside it also came with a CE mark. It's CE marks all the way down. The CE mark for the computer as a whole basically amounts to an assertion that the integrator checked all the components had a CE mark and that they were put together in accordance with the guidelines of those components. No third party checked whether they actually wired everything up correctly.

It's true that there may be components in there that had a third-party audit (the WiFi chip?), but that's not something the builder of the computer needs to worry about. They just need to check the documentation of the chip and ensure that they followed the guidelines.

It will be interesting to see which components in Annex III survive; the list was severely shortened by the Council.

> The question then becomes whether the product is "critical" - I believe that's a case-by-case decision made by a panel of bureaucrats

Hardly. The EU doesn't have the manpower, and the regulators (all 27 of them) will be chronically underfunded; regulators always are. Bureaucrats are not your problem. The way this usually works is that whether a component is "critical" is agreed by the buyer and seller when the contract is signed. The only time this will be checked is by a judge in the case of some liability suit. Then all the CE marks will be checked in the process of determining who is liable for what. If you think you can convince an uninvolved third party (aka a judge) that you're not a critical product, you're good.

If a judge determines you are a critical product and you don't have the requisite audit, even the most underfunded regulator is going to see a chance for money. The idea behind all this is to try to configure the market to regulate itself via market forces, so governments can save money on the regulators themselves.

But you are right in a sense: many products will get third-party audits... by their customers doing due diligence on their suppliers. As long as you make clear agreements with your customers about who is responsible for what, you're set.

Risks of misinformation

Posted Sep 25, 2023 8:11 UTC (Mon) by ballombe (subscriber, #9523) [Link] (2 responses)

> Or worse, FAANG doing it. It's unfortunate, because I think we do, collectively, have a good idea of what a "well-run security-conscious open-source project" looks like; we're just unwilling to write it down.

I am not sure. Consider the original IJG libjpeg library. It has not had a single serious vulnerability in 25 years. On the other hand, libwebp, dating from 2011, has had tons of serious vulnerabilities. Is it that Google programmers are incompetent and do not know how to write safe code? Certainly not. The fact is that security is not seen as a real priority even by a "well-run security-conscious open-source project", and issuing updates to fix vulnerabilities instead of writing correct code from the start is seen as acceptable if this allows for faster development or faster code, despite claims to the contrary.

Risks of misinformation

Posted Sep 25, 2023 9:45 UTC (Mon) by Wol (subscriber, #4433) [Link]

> and issuing updates to fix vulnerabilities instead of writing correct code from the start is seen as acceptable

THIS!!! IN SPADES!!!

Cheers,
Wol

Risks of misinformation

Posted Sep 26, 2023 5:01 UTC (Tue) by Cyberax (✭ supporter ✭, #52523) [Link]

> I am not sure. Consider the original IJG libjpeg library.

Uhh... Whut?

There are some CVEs for it: https://cve.mitre.org/cgi-bin/cvekey.cgi?keyword=libjpeg

Looking deeper, it looks like nobody is actually using the original libjpeg; everyone seems to be using libjpeg-turbo. Debian even uses it to provide libjpeg; they seem to have switched some time in the previous decade.

So it looks more like the case of nobody really caring about it.

The European Cyber Resilience Act

Posted Sep 24, 2023 6:55 UTC (Sun) by riking (guest, #95706) [Link]

> The obligation to notify about all issues also breaks normal disclosure processes. These days, vendors disclose vulnerabilities only after a fix is available. [..] The industry typically uses 90 days,

Note that the Google policy that set the 90-day standard uses 7 days for "actively exploited vulnerabilities".

someone is using py3k… badly

Posted Sep 26, 2023 2:37 UTC (Tue) by mirabilos (subscriber, #84359) [Link]

“ENISA - b'RSS 2.0'”

haha… (kudos to lynx for rendering those RSS links visibly)

