
Woodruff: Weird architectures weren't supported to begin with


Posted Mar 3, 2021 13:33 UTC (Wed) by farnz (subscriber, #17727)
In reply to: Woodruff: Weird architectures weren't supported to begin with by pizza
Parent article: Woodruff: Weird architectures weren't supported to begin with

And there's the thing - what "supported" means matters. There's a difference between "we won't stop you trying to use it, but it may not even compile, let alone do what it's supposed to do", "it will definitely compile to a binary, but may not work as intended and we won't fix it", and "we will make a best-effort attempt to keep it working on this platform". And there's a big difference between "we will make a best-effort attempt to keep it working on this platform" and "if it does not work for you, we will fix it".

If you're going to offer one level of support, you need to either be able to offer it yourself, unaided by upstream, or have upstream agree to limits on what they do that make it possible for you to offer that level - be it upstream being on the hook for "best-effort support for HP-PA", or "we won't use languages not supported by GCC".

So, you need to be clear on what you mean by support - and by my standards, most projects are reasonably honest about what they mean (they promise to make their best efforts to help me resolve issues, but don't promise a fix). Dishonesty comes in when you promise one level of support (e.g. "best-effort") while not putting in place the needed bits to allow you to uphold that promise, on the basis that "nothing will go wrong".

The current fuss is happening because the pyca upstream were offering "we won't stop you trying to use it, but it may not even compile" levels of support for niche platforms, while the Gentoo downstream was offering "we will make a best-effort attempt to keep it working on this platform", along with a certain amount of fuss because Alpine Linux had issues that it's working on resolving.

It all explodes because instead of the Gentoo downstream saying "right, we'll sort out what we need to support pyca on HP-PA (and S390 etc.), thanks for the advance warning that we'll need to do this", they went to "how dare you make a change that makes our support promise (one we never told you about, let alone got you to agree to) hard to meet". If this is the level of support that Gentoo wants to offer to HP-PA users for pyca, then it either needs to get upstream in on the promise (hence only using things that should work on HP-PA), or needs to do the work themselves despite upstream (forking, implementing dependencies, whatever).

And while Red Hat is good about not asking random contributors to be on the hook to their customers, not all distributions are that nice. I've had to tell one distribution company (not Red Hat) that I am not going to backport my kernel patches for them, nor am I prepared to assist their customers with the problem they had that my patch fixed for me, and that threatening me with legal action is not going to help, either.



Woodruff: Weird architectures weren't supported to begin with

Posted Mar 3, 2021 14:53 UTC (Wed) by pizza (subscriber, #46) (30 responses)

> Dishonesty comes in when you promise one level of support (e.g. "best-effort"), while not putting in place the needed bits to allow you to uphold that promise, on the basis that "nothing will go wrong".

Okay, fair enough. (Though IMO the term "best-effort" is highly subjective, and anyone who interprets that as more than "you're entirely on your own" is a naive fool...)

>And while Red Hat is good about not asking random contributors to be on the hook to their customers, not all distributions are that nice. I've had to tell one distribution company (not Red Hat) that I am not going to backport my kernel patches for them, nor am I prepared to assist their customers with the problem they had that my patch fixed for me, and that threatening me with legal action is not going to help, either.

They actually threatened you with legal action? Just... wow. That's a level of douchiness that even I've not run into.

I've found that the best way to shut up entitled folks is to apply the "charge 'em 'till you like 'em" principle -- "I estimate that what you're asking for will take me $X hours. My standard scheduled-well-in-advance rate is $Y. If you want it done faster than that, my rate will be $Y^$N, where $N is directly proportional to the turnaround time, how little I care about your problem, and your demonstrated level of douchiness."

I rarely hear back after that.

Woodruff: Weird architectures weren't supported to begin with

Posted Mar 3, 2021 15:44 UTC (Wed) by farnz (subscriber, #17727)

I'd agree that best-effort is subjective and that, absent a legal contract defining it, it is meaningless - and that for all open source, the legally required level of support is "none at all - you're on your own". That said, we're also discussing the social context, and what is and is not fair to expect of your upstreams.

In the social context, there are two lines of dependency:

  1. What upstreams explicitly promise to their downstreams. While you have no way to enforce that promise, it seems unfair to state that when making your own social promises, you can't rely on other people's social promises. For example, upstream may promise
  2. What is implicitly promised by upstream to their downstreams by upstream's behaviour. This is largely what's triggered this fuss - upstream changed behaviour by using a new programming language, and downstreams are upset.

Here, we had a mismatch between the implicit promises upstream thought they were making (to provide a Python cryptography module that has decent security, is easy enough to use correctly that it can be your first choice for cryptography, and works on an explicit set of platforms), and the implicit promises that downstream thought were being made (to provide a Python cryptography module that only depends on things that can be built for any Linux platform using GCC). The result is pain for the downstream that thought that more was promised than actually was, and the fuss is because that downstream is trying to push the pain back upstream rather than accepting that upstream never promised them what they need.

So, the legal position is clear - downstream has no way to enforce that upstream does what they want. The social position is less clear; did upstream give downstream reason to believe that upstream would behave in a way that downstream is happy with, or did downstream expect something unreasonable of upstream, given what upstream has explicitly promised? Personally, I view downstream as in the wrong here - they made assumptions about what upstream was going to do, didn't tell upstream what they expected of them, and then got upset when upstream didn't live up to their assumptions.

And yep, they got douchy. I told my employer what they wanted, my employer wrote back with a quote for my time (vastly inflated, of course, and would have resulted in a nice bonus for me and my colleagues - small firm), and they wrote back to me directly threatening to sue because I had not fixed their customer's problem, and the fact that my employer was willing to quote them for taking up my time implied that I had incurred liability for the problem that I had not caused, merely failed to fix to their satisfaction. I passed that back to my employer, we contracted a suitable lawyer, said company paid our lawyer's fees to make us go away. I assume the douchy person got thoroughly reprimanded for threatening me - it certainly meant that when one of their colleagues contacted me asking about the intention behind my fix, I replied pointing to the legal threats and saying that I would not work with them as a result.

Woodruff: Weird architectures weren't supported to begin with

Posted Mar 4, 2021 15:09 UTC (Thu) by johannbg (guest, #65743) (28 responses)

" They actually threatened you with legal action? Just... wow. That's a level of douchiness that even I've not run into. "

Well, it's getting even better, or worse, depending on how you look at it.

Corporations, with or through governments, and governments themselves, are trying to implement a real-names policy for owners of and contributors to open-source projects, for one purpose only: accountability/liability.

Take for example this fairly recent Google "security" post. [1]

" Goal: For Critical Software, Owners and Maintainers Cannot be Anonymous

Attackers like to have anonymity. There have been past supply-chain attacks where attackers capitalized on anonymity and worked their way through package communities to become maintainers, without anyone realizing this “new maintainer” had malicious intent (compromising source code was eventually injected upstream). To mitigate this risk, our view is that owners and maintainers of critical software must not be anonymous.

It is conceivable that contributors, unlike owners and maintainers, could be anonymous, but only if their code has passed multiple reviews by trusted parties.

It is also conceivable that we could have “verified” identities, in which a trusted entity knows the real identity, but for privacy reasons the public does not. This would enable decisions about independence as well as prosecution for illegal behavior. "

So individual(s) decide to create a F/OSS project, and after a while it gets picked up by some corporation or government, with or without the project's creators'/owners' knowledge.

It gets classified as a core component of said corporation's product portfolio or the government's infrastructure, or is simply being used for illicit purposes without their knowledge.

The project's creators/owners go about their usual F/OSS business: they write/remove/review/accept & commit their own code, or a contribution from someone which fixes 1 bug but introduces 3 new ones.

That "fix" introduces "a security flaw" in the project or otherwise prevents it from working "normally" in the corporations "product" or government infrastructure and suddenly the project owner(s) and or it's contributors have become liable in the process.

You, as the project owner, for a) not validating that "John Doe" was a real person, and exactly that "John Doe" on the planet, and b) not being all-knowing in the language being used in your project and committing the fix that led to things "breaking".

The contributor, for not using his "real name" but instead using a pseudonym to distinguish himself from the rest of the "John Does" on the planet ( everyone that uses a pseudonym is doing so to conduct illegal behavior, right? "pizza" must be a bad person, not a person that simply loves pizzas ), or for simply creating a PR with a fix that broke other stuff, hence you must have been doing so for illegal purposes, not because you simply knew no better, etc. etc. etc.

As things progress, accountability/liability will enter the F/OSS world regardless of any kind of "license" that the project uses.

1. https://security.googleblog.com/2021/02/know-prevent-fix-...

Woodruff: Weird architectures weren't supported to begin with

Posted Mar 4, 2021 15:59 UTC (Thu) by pizza (subscriber, #46) (27 responses)

> As things progress, accountability/liability will enter the F/OSS world regardless of any kind of "license" that the project uses.

Given that "the software is provided as-is, without any warranty whatsoever", it will be a pretty hard to convince a court that the software owners/maintainers/authors/contributors/roommates/pets/toasters are somehow liable for an end-user's choice to use that software for safety-critical tasks, much less commercial inconvenience.

Absent a contract, the standard license disclaimer applies.

(and any laws mandating strict liability to software authors will completely tank the economy from top to bottom, and such will be strongly opposed by pretty much everyone that's not a law firm seeking to cash in)

Woodruff: Weird architectures weren't supported to begin with

Posted Mar 4, 2021 16:40 UTC (Thu) by johannbg (guest, #65743) (26 responses)

Legislation ( like a real-names policy ) will just be introduced and implemented which circumvents that. One way or another, accountability ( thus liability ) will find its way to F/OSS.

I would not be surprised if software ends up being heavily regulated in the future, in a similar manner to building codes/control/regulation, and anything that does not follow that regulation could be deemed illegal; at that point, whatever license ( open/closed ) the project has becomes irrelevant.

Woodruff: Weird architectures weren't supported to begin with

Posted Mar 4, 2021 17:41 UTC (Thu) by pizza (subscriber, #46) (25 responses)

> I would not be surprised if software ends up being heavily regulated in the future, in a similar manner to building codes/control/regulation, and anything that does not follow that regulation could be deemed illegal; at that point, whatever license ( open/closed ) the project has becomes irrelevant.

I'm reminded of a saying attributed to the former ruler of Dubai:

"My grandfather rode a camel, my father rode a camel, I drive a Mercedes, my son drives a Land Rover, his son will drive a Land Rover, but his son will ride a camel"

Strict regulation/liability will pretty much end the "Software industry" overnight, and take most of the economy down with it.

You'll never get me to say that the "Software industry" is healthy, but this cure will be vastly worse than the disease, and everyone (especially those who control the levers of power) knows it.

Woodruff: Weird architectures weren't supported to begin with

Posted Mar 5, 2021 18:20 UTC (Fri) by khim (subscriber, #9252) (24 responses)

> Strict regulation/liability will pretty much end the "Software industry" overnight, and take most of the economy down with it.

Only if you introduced these strict regulations overnight. But then, if someone had introduced today's building rules 300 years ago… the economy would have crashed, too.

> You'll never get me to say that the "Software industry" is healthy, but this cure will be vastly worse than the disease, and everyone (especially those who control the levers of power) knows it.

At some point they would have no choice. The cost of programming errors grows larger and larger the more interconnected the world becomes. At some point, either new software will have to become more reliable… or we will have to stop using it. The attempt to cause the former may lead to the latter… but governments would try anyway.

Woodruff: Weird architectures weren't supported to begin with

Posted Mar 5, 2021 20:03 UTC (Fri) by johannbg (guest, #65743) (23 responses)

Parts of the industry are already regulated, and we are at the point where cars are being recalled due to failing software in their brakes. So it does not take more than a couple of the EV Hyundais of the world [1], or AI/self-driving vehicles out of control and without brakes plowing through a series of kindergartens, or someone simply placing a signal jammer at a busy intersection which causes a vehicle to lose its "connection" to the "cloud", and <bam>, the entire software industry becomes heavily regulated.

It's inevitable evolution that the industry will be regulated ( which has already gradually been happening ), and people will be held accountable for what they write, regardless of which license the program they wrote ends up having. So if people think that a F/OSS-licensed program will somehow be exempt from those eventual rules and regulations, they are sadly mistaken. The blame game ends with the person that wrote the code and the people that reviewed and committed it, as Volkswagen has already proven [2].

1. https://thedriven.io/2020/12/22/hyundai-recalls-kona-elec...
2. https://www.reuters.com/article/us-volkswagen-emissions-s...

Woodruff: Weird architectures weren't supported to begin with

Posted Mar 5, 2021 20:50 UTC (Fri) by pizza (subscriber, #46) (19 responses)

> It's inevitable evolution that the industry will be regulated ( which has already gradually been happening ), and people will be held accountable for what they write, regardless of which license the program they wrote ends up having. So if people think that a F/OSS-licensed program will somehow be exempt from those eventual rules and regulations, they are sadly mistaken.

"Held accountable" how, exactly, and to what extent? Criminal liability resulting in jail time or fines? Or some sort of civil liability? Either way, where does this "accountability" end?

> The blame game ends with the person that wrote the code and the people that reviewed and committed it as Volkswagen already has proven [2].

What VW did wasn't a "coding mistake" - it was outright fraud, willfully perpetrated all the way up to the highest executive levels, across at least two continents.

The "engineer" in the article didn't write a single line of code; he instead he was acting in a managerial role and signed off on stuff he knew was false -- in other words, he was one of the folks who actively lied to regulators and directly committed a crime.

Woodruff: Weird architectures weren't supported to begin with

Posted Mar 5, 2021 21:43 UTC (Fri) by johannbg (guest, #65743)

> "Held accountable" how, exactly, and to what extent? Criminal liability resulting in jail time or fines? Or some sort of civil liability? Either way, where does this "accountability" end?

Some form of liability. Real-names policies ( like the one Google was advocating for ) are designed and implemented to go after person(s), not program(s).

The vehicle example I gave is not unlikely to happen in the F/OSS world [1].

I'm pretty sure that further down the line we will see some examples in court that show how far the term "coding mistake" can be stretched. I have no idea exactly what role he played within VW, but initially VW blamed a "rogue" team of engineers within the company; obviously, the entire developer team who wrote the code should have gone to jail - it's not like they did not know it was unethical and illegal. All of the persons involved could have said "no, I won't do that", "I quit", or "get someone else to do it", etc.; it's not like they can hide their heinous act behind some "management" decision.

1. https://www.autoware.org/

Woodruff: Weird architectures weren't supported to begin with

Posted Mar 5, 2021 22:36 UTC (Fri) by kleptog (subscriber, #1183) (17 responses)

> "Held accountable" how, exactly, and to what extent? Criminal liability resulting in jail time or fines? Or some sort of civil liability? Either way, where does this "accountability" end?

The question of liability is the easy part and already done: whatever entity sells the product/service to the customer is liable. They bear full responsibility. They may try to legally pass off the risk onto any of their suppliers for different parts, but for the consumer that is irrelevant, they sue the entity they bought the product/service from. If the supplier chooses to use Free Software, then they are also liable for its proper functioning, unless they can find someone to accept the risk for them.

Regulations would be required if it turns out that just apportioning liability is not enough. Since no manufacturer is ever required to use free software, I don't see how there is likely to be any issue there.

Now, software companies get away with disclaiming liability because that seems to work and not too many people complain about it. Regulations tend to be written in blood, so unless we see a situation where lots of people start dying due to dodgy software, I don't see a lot of regulation occurring outside of where it already exists, e.g. cars, planes, medical devices.

Woodruff: Weird architectures weren't supported to begin with

Posted Mar 5, 2021 23:04 UTC (Fri) by johannbg (guest, #65743)

Well, I see plenty of use for regulation of any software used in a nation's critical infrastructure sectors, and depending on the country and which sector, a lot of F/OSS software might be in use.

You see, the thing is that governments have a tendency to become very unhappy if they suddenly find out that someone, out of the kindness of their heart, was helping them back up their data, or ensuring they meet their environmental commitments by shutting down the nation's power grid, etc., despite it all being done in good faith and in the nation's best interest. :)

Woodruff: Weird architectures weren't supported to begin with

Posted Mar 6, 2021 0:23 UTC (Sat) by pizza (subscriber, #46) (15 responses)

> The question of liability is the easy part and already done: whatever entity sells the product/service to the customer is liable. They bear full responsibility. They may try to legally pass off the risk onto any of their suppliers for different parts, but for the consumer that is irrelevant, they sue the entity they bought the product/service from. If the supplier chooses to use Free Software, then they are also liable for its proper functioning, unless they can find someone to accept the risk for them.

See, I'm fine with that. There's an existing supplier<->customer relationship.

But how exactly can I, by virtue of writing a random pile of F/OSS that I published on my web site, when I have no relationship with the end-user or the company that manufactured $widget_x, be held liable for that widget failing and injuring the end-user, especially when I state up front that my pile of F/OSS comes with no warranty, little testing, and might randomly fail and kill your cat? [1]

If I'm to be held liable for <whatever> then I need to be compensated in proportion to that risk, or I'd have to be downright stupid to do it.

> Now, software companies get away with disclaiming liability because that's seems to work and not too many people complain about it. Regulations tend to be written in blood so unless we see a situation where lots of people start dying due to dodgy software I don't see a lot of regulation occurring outside of where it is already, eg cars, planes, medical devices.

Sure, but it's not "software" failing, it's "safety-critical product X (that happens to be partially implemented using software) failing".

That's the key difference. You're not regulating *software*, you're regulating *product that contain software*.

[1] Yes, I actually state that in the documentation. Should I somehow be liable because the user didn't read the documentation too?

Woodruff: Weird architectures weren't supported to begin with

Posted Mar 6, 2021 8:30 UTC (Sat) by johannbg (guest, #65743) (14 responses)

You do realize that you are in the role of the manufacturer, right?
You wrote the code.
You made the code publicly available/accessible.
You end up having to take responsibility/accountability for it.

Suppose a person creates lemonade, then makes her or his lemonade publicly available and accessible under a sign saying "Free lemonade, drink at your own risk", and all the kids in the neighborhood see "Free lemonade" on the street corner, grab themselves some lemonade, and start dying because that person decided to mix some acid into the lemonade.

Is that person free of any responsibility and accountability because she/he put up a lemonade stand with a sign that says "Free lemonade, drink at your own risk", and allowed everyone to watch as she/he poured acid into the lemonade while mixing it?

The answer to that is no, and this is one of the reasons why food regulations and the laws surrounding them exist in the world.

Woodruff: Weird architectures weren't supported to begin with

Posted Mar 6, 2021 11:00 UTC (Sat) by mpr22 (subscriber, #60784) (13 responses)

That scenario displays clear mens rea (the acid was deliberately introduced to the lemonade) and actus reus (the lemonade was offered with the intent that it would be consumed, and it would be reasonably foreseeable that serious injury or death would result from such consumption) for multiple counts of murder under the English common law (let alone modern food safety statutes and regulations), making it profoundly unsuitable for a discussion of the standard of liability where malice is not obviously feasible.

What we're talking about here is more along the lines of:

Person (not necessarily natural person) 1: "I released some software for free. I haven't prepared a formal proof of its correctness, nor an evidence package illustrating its suitability for safety-critical uses, so it's unsuitable for use in safety-critical systems unless you are willing to do all that work yourself."

Person (not necessarily natural person) 2: "Don't care, I'm using it in my driverless car anyway."

Person 2 is the natural target (and, in practice, probably the only worthwhile target) for civil proceedings.

Woodruff: Weird architectures weren't supported to begin with

Posted Mar 6, 2021 11:00 UTC (Sat) by mpr22 (subscriber, #60784)

"not obviously feasible" <---- Er, I meant "not obviously present". Brain. Fingers.

Woodruff: Weird architectures weren't supported to begin with

Posted Mar 6, 2021 13:31 UTC (Sat) by johannbg (guest, #65743) (10 responses)

What we are talking about is holding people accountable for their actions. No "license" is going to prevent that.

As software becomes a more and more integrated part of society, more rules, regulations, and laws will be built around it.

If it did not, then everybody would just F/OSS their software ( corporations and people alike ) and be free from any accountability.

It might not be what people like or want, but it's inevitable evolution, just as has happened in every other industry.

Woodruff: Weird architectures weren't supported to begin with

Posted Mar 6, 2021 15:02 UTC (Sat) by mpr22 (subscriber, #60784) (1 responses)

If you incorporate software into your product without verifying its safety, then in the first instance the natural civil liability for any defective behaviour arising out of faults in the software that you incorporated into your product lies with you, even if someone else wrote it (especially if they wrote it in their spare time, before your product even existed as a concept, and distributed it for free).

Now, if you can prove a sufficient degree of negligence, recklessness, breach of contract, sabotage, and/or fraud on the part of whoever made the software, then it would be natural for some portion (possibly as much as 100%, depending on the nature of the defective behaviour, the nature of the tortious conduct, and the adequacy of your attempts to guard against defects in the software) of the liability to be transferred to them.

Any idiot who decides it would be a good idea to grab a pile of pre-existing code written by a no-name rando with a net worth of a few thousand dollars and stick it in their widget without stopping to ask themselves "should I really trust, without further verification on my part, this software written by some no-name rando whose response to a liability claim would be to declare bankruptcy because the legal fees alone would be more than their combined net worth and gross annual salary?" needs to reconnect with reality.

Woodruff: Weird architectures weren't supported to begin with

Posted Dec 22, 2022 6:25 UTC (Thu) by mrugiero (guest, #153040)

> Any idiot who decides it would be a good idea to grab a pile of pre-existing code written by a no-name rando with a net worth of a few thousand dollars and stick it in their widget without stopping to ask themselves "should I really trust, without further verification on my part, this software written by some no-name rando whose response to a liability claim would be to declare bankruptcy because the legal fees alone would be more than their combined net worth and gross annual salary?" needs to reconnect with reality.

And that's, IMO, where the regulation should start and stop.
You provide a paid product or service? Either you can prove who provided your supplies or you're directly liable. In other words, a proper regulation wouldn't be about what's required for you to do something, but what's required for those who are supposed to "guarantee" it works correctly.
One of those requirements is have a face and name for everyone involved in the making. If you can't do that for an open source project, then don't use it for your product or you're liable for any defects it may have. The author of such code should still be one with full choice about whether to remain anonymous, it's you as downstream that's responsible of picking only the public ones. Anyone who decides to use the anonymous' code is legally responsible for doing so.

Woodruff: Weird architectures weren't supported to begin with

Posted Mar 6, 2021 16:25 UTC (Sat) by pizza (subscriber, #46)

> What we are talking about is holding people accountable for their actions. No "license" is going to prevent that.

Remember that absent the "license" you don't have the rights to use my software, period. Only that license gives you that right.

In exchange for the right to use my software, you have to agree not to hold me liable should your house burn down.

You chose to use my software, and your house burns down.

Who is "accountable" here, the software author who explicitly stated, in advance, that their stuff can't be trusted, or the person who chose to use it anyway?

Woodruff: Weird architectures weren't supported to begin with

Posted Mar 6, 2021 21:29 UTC (Sat) by farnz (subscriber, #17727) (6 responses)

Equally, though, at least in my jurisdiction, liability laws for anything only expect you to be liable for reasonably foreseeable consequences of your actions, and do allow for disclaimers of liability even for things like buildings and cars. Not total disclaimers, but (for example) the skylights for my house have a legally enforceable disclaimer of liability for faults other than manufacturing defects and undisclosed issues with the design.

The much more likely model for software liability here would be to copy consumer goods (everything from pens through to cars); in that model, the default liability for any product is limited to the price paid to you, and a full refund for the faulty goods is normally the limit of your liability. The only time where you have further liability beyond the purchase price paid to you is when you could foresee that the part you supplied would, when used correctly, cause the extra costs that the buyer has incurred; one way that you can foresee such things is if the buyer explicitly tells you about those extra costs.

So, for example, I buy a washing machine; it doesn't wash clothes and doesn't function. The seller's liability to me is limited to a full refund. I tell the seller that I need a washing machine that works because if I can't wash my clothes tonight, I'm going to have to spend £100 on new clothes tomorrow, and that is enough that the seller has to either refuse to deal with me (allowed, of course!), or is liable to me for £100 above a full refund.

And note that these only apply if the seller is acting as a business, not as a private individual; if I sell you my home-made 6m band dipole aerial, then caveat emptor applies (unless I'm producing aerials as a business).
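The refund-plus-disclosed-losses rule described above can be sketched as a toy calculation (a hypothetical function and made-up numbers, purely to illustrate the comment's reasoning; this is not legal advice and no jurisdiction's actual rules are claimed):

```python
def seller_liability(price_paid, disclosed_losses=0, seller_is_business=True):
    """Toy model of the consumer-goods liability rule sketched above.

    Default liability is capped at a full refund of the price paid.
    Losses the buyer explicitly disclosed in advance (and the seller
    implicitly accepted by going ahead with the sale) are added on top.
    Sales by private individuals carry no such liability (caveat emptor).
    """
    if not seller_is_business:
        return 0
    return price_paid + disclosed_losses

# Washing machine bought for 400; the buyer disclosed a 100 clothing cost
# that a failure would cause, so liability is the refund plus that 100.
print(seller_liability(400, disclosed_losses=100))  # -> 500

# Same machine with nothing disclosed: liability is just the refund.
print(seller_liability(400))  # -> 400
```

On this model, an open-source project's "price paid" is zero, which is why the default liability works out to nothing, while a Samsung or Tesla, which did take money and could foresee consequences, picks up the larger number.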

For a practical example of how all of that interacts, consider the engines sold by Honda for other people to build into products. If I buy an iGXV800 from Honda on their normal terms, they are liable to me if the engine does not function, or fails in use, to the cost of refunding me for the engine in full. However, if I build that engine into a motorbike of my own design and sell it to you, Honda don't acquire any additional liability; if you have a catastrophic engine failure while riding it, Honda are still only liable for the cost of the engine, as they did not foresee the extra consequences of a failure while using it in a motorbike (it's not sold for that use). Had I bought a Honda CB300R motorbike, and sold it to you, Honda could now foresee the consequences of a catastrophic engine failure while riding it, and would have the extra liability that results.

Translated to open source terms, most projects would have no liability still - the default position is that you're liable to supply a full refund if it doesn't work, but as no money has changed hands, that's a non-effect. Samsung would be on the hook, however, if the software in a Samsung phone does not work, even if it includes open source, because money changed hands; even then, the normal limit is the money I paid for the phone. Tesla, on the other hand, could be on the hook for far more money, even if their FSD software is mostly open source, and even if the failure is caused by an open source component, because Tesla could predict that it might crash. The developer of (say) a computer vision component used in the FSD software, however, is only liable to Tesla for what they agreed to (as part of a contract), or the money paid by Tesla to them for the software (probably nothing if it's open source).

Woodruff: Weird architectures weren't supported to begin with

Posted Mar 6, 2021 22:28 UTC (Sat) by johannbg (guest, #65743) [Link] (5 responses)

If we take as an example this latest hack of Microsoft Exchange Server [1][2], which has compromised several hundred thousand instances worldwide (and which seems to have been somewhat hushed up, even though it is probably a bigger hack than the SolarWinds one), I would not be surprised if the "manufacturer" of software were held liable for "faults" in the software (negligence) in the future.

Arguably, Microsoft in this case should be held accountable for their own negligence towards the US government and its taxpayers (who have probably spent billions in license fees), and of course towards the rest of the world as well.

People also need to realize that, as open source has become more widespread and used, it has also become increasingly affected by societal issues, both ethical and political (the US government + Huawei + Google case is probably a good example of such a political issue), all of which will shape the future framework (rules and regulations) surrounding it and the rest of the software sector.

https://cyber.dhs.gov/ed/21-02/
https://www.microsoft.com/security/blog/2021/03/02/hafniu...

Woodruff: Weird architectures weren't supported to begin with

Posted Mar 6, 2021 23:25 UTC (Sat) by mpr22 (subscriber, #60784) [Link] (1 responses)

The natural strategy for dealing with liability issues around Exchange is to say "this is a commercial product offered for sale, so no, you cannot in fact waive the expectation that what you provide is of merchantable quality".

That liability model breaks down with free software because identifying an entity to which you can both reasonably(1) and usefully(2) attach civil liability will frequently lie somewhere between "difficult" and "impossible".

(1) "Reasonably" meaning that it is fair and equitable to hold the identified entity responsible in tort for the incident that has occurred.

(2) "Usefully" meaning the plaintiffs have a realistic prospect of recovering a useful percentage of their damages from the defendants identified, rather than just bankrupting the defendants to the sole benefit of the lawyers.

Woodruff: Weird architectures weren't supported to begin with

Posted Mar 7, 2021 14:11 UTC (Sun) by Wol (subscriber, #4433) [Link]

And identifying the individual concerned will lie between "difficult" and "impossible".

Take sendmail (seeing as we're talking about MS Exchange Server) as a case in point.

Allman wrote it in the kinder, gentler days of the gentleman's internet. Lots of people modified it to do things Eric never thought of. Then came the crackers who abused it.

Is it Allman's fault - for not foreseeing the future? Is it the fault of the people who re-purposed it to suit themselves? Is it the fault of the distros, or the software repositories, who made it freely available? Is it the fault of the people who didn't understand how to configure it securely?

Even identifying who those individuals are is fraught with problems.

Cheers,
Wol

Woodruff: Weird architectures weren't supported to begin with

Posted Mar 6, 2021 23:44 UTC (Sat) by farnz (subscriber, #17727) [Link] (2 responses)

And, taking that Microsoft Exchange server issue as an example, Microsoft are the final vendor; by default, unless they could reasonably foresee the issue, they'd be liable to at most a full refund for the licence fees everyone has paid them for licences for those instances. Of course, they could be liable for more - but no amount of disclaimers will limit their liability below the sum paid in my local jurisdiction.

This doesn't mean that it will affect open source - Exchange is a product, but there's no liability on (e.g.) the RSGB for publishing circuit diagrams in RadCom that could be dangerous if constructed badly. That said, it will affect people like Canonical, SUSE and Red Hat - if you're selling open source software (even just as a bundle with support), you become liable to ensure it works, or to refund people.

Woodruff: Weird architectures weren't supported to begin with

Posted Mar 7, 2021 0:07 UTC (Sun) by pizza (subscriber, #46) [Link] (1 responses)

> But no amount of disclaimers will limit their liability below the sum paid in my local jurisdiction.

This is a key point -- if your jurisdiction decided to change the law to override disclaimers of warranty and/or liability waivers, it would affect far more than the likes of Microsoft or "software". I don't think it's an exaggeration to say that it would bring most of the economy to a screeching halt, and any software/products/services offered under the new regime would come with a _much_ higher price point, proportional to the heightened, un-waivable liability the seller/producer/manufacturer is potentially responsible for.

This distinction is why you can buy a device that measures your pulse for $20, but a "medical device" that does the same thing (with the same fundamental components!) costs $2000. The "medical device" is sold with explicit promises of merchantability, reliability, and accuracy, and there are major penalties if it fails, even if there was no malice or negligence involved. The development and certification process necessary to meet those requirements is quite extensive, and therefore quite expensive. Plus you have to carry significant amounts of insurance to ensure you can meet those liabilities.

> That said, it will affect people like Canonical, SUSE and Red Hat - if you're selling open source software (even just as a bundle with support), you become liable to ensure it works, or to refund people.

"works" for what purpose, exactly? That's going to have to be explicitly spelled out...

Woodruff: Weird architectures weren't supported to begin with

Posted Mar 7, 2021 20:13 UTC (Sun) by farnz (subscriber, #17727) [Link]

Disclaimers of warranty are already overridden by local law here, and have been since the 1970s (coming up to 50 years). A product has to be of a reasonable standard given the price charged, and to last a reasonable time, again taking the price into account; it is expected to function as advertised before the sale.

So, the $10 pulse oximeter I own calls out what I can expect of it on the packaging - high error margins, low reliability. There's no disclaimer of warranty, nor a liability waiver; instead, there's setting of expectations so that the company selling the device is clear that they're not liable for anything beyond the functionality of the device, and that the functionality is about what you'd expect for a $10 device. If it doesn't do what they've promised it does to a reasonable standard for a $10 device, then my options are limited to a repair, replacement or full refund at the vendor's discretion in the first instance (I get a right to a refund if they cannot repair or replace) - so $10 at most. The similar device a local hospital uses does indeed cost a lot more - but as you say, that is because they are promising a lot more.

The net effect is that companies become very clear about what functionality you can expect from a device, because full refunds are not something you want to give very often. For Canonical, Red Hat, etc, the result is that you advertise a lower expectation, because you can be held to that; so Exchange would have to be advertised as insecure to escape liability for the hack.

Woodruff: Weird architectures weren't supported to begin with

Posted Mar 6, 2021 16:18 UTC (Sat) by pizza (subscriber, #46) [Link]

> Person 2 is the natural target (and, in practice, probably the only worthwhile target) for civil proceedings.

I agree completely (though "worthwhile" in this context should mean "going after anyone else will get you laughed out of court and on the hook for the other party's legal fees").

Every single scenario suggested as a reason we need "accountability/liability" has been malicious in nature (i.e. involving mens rea and/or actus reus). I have yet to see anyone explain what sort of liability should flow from accidents, why it should extend all the way to the individual software authors (instead of the "owners" of the software), and how the software profession could possibly survive that.

Woodruff: Weird architectures weren't supported to begin with

Posted Mar 5, 2021 21:18 UTC (Fri) by Wol (subscriber, #4433) [Link] (2 responses)

> someone simply placing a signal jammer at a busy intersection which causes a vehicle to lose its "connection" to the "cloud" and <bam> for the entire software industry to become heavily regulated.

That should NOT be possible. And that's one of the big problems with this - too many people think this is the correct solution when it is provably disastrous. An autonomous vehicle should be exactly that - autonomous! If it relies on external back-up, then that backup will *inevitably* fail when it is needed. Muphrys law and all that ...

Cheers,
Wol

Woodruff: Weird architectures weren't supported to begin with

Posted Mar 5, 2021 21:54 UTC (Fri) by johannbg (guest, #65743) [Link]

Well, the industry does not like to talk about how easily you can fool or disrupt vehicles these days with a signal jammer, 3D-printed objects, etc., since that does not quite sell the idea to investors and governments. Plus, we already have fully autonomous vehicles with all the capabilities you describe: they are called taxis...

Woodruff: Weird architectures weren't supported to begin with

Posted Mar 5, 2021 22:12 UTC (Fri) by mathstuf (subscriber, #69389) [Link]

> Muphrys law and all that ...

Is this a British thing? I think you're referring to Murphy's Law. Muphry's Law seems to be:

> If you write anything criticizing editing or proofreading, there will be a fault of some kind in what you have written.

and a deliberate misspelling of the other one.


Copyright © 2025, Eklektix, Inc.
Comments and public postings are copyrighted by their creators.
Linux is a registered trademark of Linus Torvalds