Linux and automotive computing security
Posted Oct 11, 2012 8:16 UTC (Thu) by hickinbottoms (subscriber, #14798)
Being involved in this world as well, I can say that whilst testing is a considerable part of the process (the back end of the development model, if you like), the majority of the effort lies in the front end, during and before the design phase.
You can't design a safety-critical system without knowing what the safety requirements are, and they're often harder to identify than you imagine. For example, a hypothetical brake-control system might have a safety requirement that the brakes be applied within X ms of being commanded, with Y reliability, which is a fairly easy requirement to spot. Slightly harder to see is that it's also potentially hazardous for the brakes to be applied when not commanded, so you need to spot that and engineer the requirements appropriately -- if my memory serves me correctly, there have been aircraft losses during landing from exactly such failures.
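The two complementary requirements above can be made concrete in a small sketch. Everything here is illustrative -- the names, the 50 ms deadline, and the function itself are invented for this comment, not taken from any real brake controller:

```c
#include <assert.h>
#include <stdbool.h>
#include <stdint.h>

/* Hypothetical sketch of the two-sided brake requirement:
 * 1) a commanded brake must be applied within BRAKE_DEADLINE_MS;
 * 2) an uncommanded application is itself a hazardous failure.
 * Timing value is made up for illustration. */
#define BRAKE_DEADLINE_MS 50

typedef enum { BRAKE_OK, BRAKE_LATE, BRAKE_UNCOMMANDED } brake_verdict;

static brake_verdict check_brake(bool commanded, bool applied,
                                 uint32_t ms_since_command)
{
    if (applied && !commanded)
        return BRAKE_UNCOMMANDED;   /* hazard: spurious braking */
    if (commanded && !applied && ms_since_command > BRAKE_DEADLINE_MS)
        return BRAKE_LATE;          /* hazard: missed deadline */
    return BRAKE_OK;
}
```

The point is that the second branch (spurious braking) is the one that's easy to miss when writing the requirements, even though it's just as hazardous as the first.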
It's this identification of the requirements and the associated safety analysis process -- involving tools such as fault trees, event trees, FMEA/FMECA, hazard analysis/logs, SIL analysis and so on -- that makes safety-critical development really hard and expensive. It is, however, critical to get this right before diving into coding and testing, since, as we know, changing the requirements of a system after it's built is difficult and often leads to unexpected behaviours being implemented. The high-integrity world is littered with examples of failures caused by changed requirements, or by systems being used to fulfil requirements that were never identified.
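To give a flavour of the fault-tree side of that analysis: under the (strong) assumption of independent basic events, gate probabilities combine very simply. The failure rates below are made up purely for illustration:

```c
/* Illustrative fault-tree arithmetic, assuming independent basic
 * events.  An OR gate fires if either input does; an AND gate
 * requires both. */
static double or_gate(double pa, double pb)  { return pa + pb - pa * pb; }
static double and_gate(double pa, double pb) { return pa * pb; }

/* Hypothetical top event: brakes fail to apply if the sensor fails,
 * OR both of two redundant actuators fail. */
static double top_event(double p_sensor, double p_act1, double p_act2)
{
    return or_gate(p_sensor, and_gate(p_act1, p_act2));
}
```

Even this toy example shows why redundancy pays off: two actuators each failing with probability 1e-3 contribute only 1e-6 to the top event, so the sensor at 1e-4 dominates -- which in turn tells you where the next engineering effort should go.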
Because the resulting design of the system is heavily influenced by the requirements analysis that got you there, it's also very difficult to make a convincing safety case, or to retrospectively develop a safety substantiation, for a system that hasn't been designed that way from the outset.
As the parent poster says, you can't stop non-trivial software from having bugs and crashing, but you can build a confident argument that such failure cannot lead to a hazardous condition with an intolerable frequency. The safety analysis process lets you make such statements with evidence.
It's always a little disappointing that at the end of the day you just end up with 'normal-looking' software that isn't somehow magical and better -- but what matters is the confidence that it's more likely to do what's expected, and that when it doesn't, it can't lead to situations you haven't at least considered.
Posted Oct 11, 2012 15:01 UTC (Thu) by rgmoore (✭ supporter ✭, #75)
You can't design a safety-critical system without knowing what the safety requirements are, and they're often harder to identify than you imagine.
Yes, and in this case, it turns out that one of the things the designers failed to identify is that they couldn't necessarily trust all of the other systems on the CAN. It's easy to understand why somebody might make that mistake, but the major thrust of the security researchers' article is that it is a mistake. Now they need to go back to the drawing board and design a better set of specifications for their networking component so it won't let the system be subverted by malicious messages.
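The "don't trust the bus" posture could start with something as simple as receiver-side plausibility checks. This sketch is entirely hypothetical -- the IDs are invented, and a real design would need cryptographic authentication (along the lines of AUTOSAR's SecOC, with its freshness counters and truncated MACs), which classic CAN's 8-byte payload makes awkward:

```c
#include <stdbool.h>
#include <stdint.h>

/* Illustrative receiver-side plausibility check for CAN frames.
 * Frame layout and allow-list IDs are made up for this example. */
struct can_frame_s { uint32_t id; uint8_t len; uint8_t data[8]; };

static bool id_allowed(uint32_t id)
{
    /* hypothetical allow-list of IDs this ECU expects as input */
    static const uint32_t allowed[] = { 0x0C1, 0x1A0, 0x2F4 };
    for (unsigned i = 0; i < sizeof allowed / sizeof allowed[0]; i++)
        if (allowed[i] == id)
            return true;
    return false;
}

static bool frame_plausible(const struct can_frame_s *f)
{
    if (!id_allowed(f->id) || f->len > 8)
        return false;
    /* a real design adds freshness counters, payload range checks,
     * and message authentication on top of this */
    return true;
}
```

An allow-list alone doesn't stop spoofing -- any node on the bus can transmit any ID -- which is exactly why the researchers' point stands: the specification has to assume malicious traffic, not just malformed traffic.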
Writing test cases
Posted Oct 11, 2012 11:57 UTC (Thu) by man_ls (subscriber, #15091)
Would it be safe to say that, for these systems, the majority of the actual effort is expended on writing test cases?
I also ask them to write tests to check their code, and a test harness so the testing can be done mechanically. These are useful skills that are pretty much independent of specific languages or environments.
I am just speaking about coding, but obviously it is not the only development activity. I am not surprised to learn from the above poster that analysis and design take even longer than coding.
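A test harness of the kind mentioned above doesn't need to be elaborate to let testing run mechanically. A minimal sketch (function names and conventions are my own, not from any particular framework):

```c
#include <stdio.h>

/* Minimal test-harness idea: each test is a function returning 0 on
 * success; the harness runs them all mechanically and reports. */
typedef int (*test_fn)(void);

static int test_addition(void) { return (1 + 1 == 2) ? 0 : 1; }

static int run_all(const test_fn *tests, int n)
{
    int failed = 0;
    for (int i = 0; i < n; i++)
        if (tests[i]() != 0)
            failed++;
    printf("%d/%d tests passed\n", n - failed, n);
    return failed;   /* 0 means the whole suite passed */
}
```

Returning the failure count makes the harness usable from scripts and CI: a non-zero exit status fails the build, which is what makes the testing truly mechanical.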
Posted Oct 18, 2012 18:22 UTC (Thu) by TRauMa (guest, #16483)
Posted Oct 11, 2012 14:57 UTC (Thu) by ortalo (subscriber, #4654)
Posted Oct 11, 2012 15:00 UTC (Thu) by ortalo (subscriber, #4654)
But is that enough for security (!= safety)?
Copyright © 2013, Eklektix, Inc.
Comments and public postings are copyrighted by their creators.
Linux is a registered trademark of Linus Torvalds