The Internet of criminal things
Cars, at this point, can be thought of as rolling networks of computers with some interesting peripheral devices, some of which may involve internal combustion technology. The details of an engine's operation have been under software control for a long time, and replacement ROMs that change a car's performance characteristics have been commonplace for nearly as long. Modern "trusted execution" technology makes the creation of such ROMs more difficult, but that turns out not to be an obstacle if the company wanting to subvert an engine's control software is the manufacturer itself.
Volkswagen's hack need not have been hard to implement: one could, for example, have the engine-control software apply a different set of parameters when a connection to the on-board diagnostic port is detected. No need for the attachment of a separate "defeat device" (as the press seems to like to call it) and no need for an elaborate company-wide conspiracy. A single commit by a single engineer at the behest of a single manager would suffice. In retrospect, the surprising part of this story is not that somebody at Volkswagen gave in to the temptation to engage in a bit of benchmark cheating; the surprise is that far more incidents of this nature have not yet come to light.
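To make the point concrete, here is a minimal, hedged sketch of what such a change could look like; the structure, function names, and numbers are entirely invented for illustration and are not drawn from any real engine-control code:

```c
/*
 * Hypothetical sketch only: how a "defeat device" could be nothing more
 * than a tiny change to parameter selection.  All names and values here
 * are invented for illustration.
 */

struct calibration {
	double fuel_rate;	/* invented injection parameter */
	double egr_fraction;	/* invented exhaust-gas recirculation setting */
};

static const struct calibration road_params = { 1.00, 0.10 };
static const struct calibration test_params = { 0.85, 0.35 };

/*
 * Assumed helper: returns nonzero when conditions look like an emissions
 * test, for example an active connection on the on-board diagnostic port.
 */
extern int obd_port_connected(void);

const struct calibration *select_params(void)
{
	/* The entire "defeat device" is this one conditional. */
	return obd_port_connected() ? &test_params : &road_params;
}
```

The whole cheat, in a sketch like this, is a single conditional branch; buried in a large body of proprietary control software, nothing about it would stand out.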
The consequences of this cheating are serious. Emissions testing is a key part of a strategy that has significantly improved air quality in American cities over the last several decades. Subverting that testing means more poison in the air, more health problems, and more environmental degradation. It is a criminal act on a massive scale. The penalties for Volkswagen are likely to be severe, but probably not severe enough.
As many others have pointed out, VW was certainly helped by the ease with which antifeatures can be hidden in software shipped to others. When we get into a car, we trust our lives and health to a large body of proprietary control software; the source is unavailable, so we cannot inspect it for bugs, vulnerabilities, or explicit evil. Legal regimes in much of the world make a crime out of reverse-engineering this software, so we cannot try to figure out how it operates even without the source. Digital rights management (DRM) mechanisms built into the hardware make that reverse engineering even harder; this DRM may even be mandated by government agencies fearful of individuals modifying their own engine-control software.
Those in favor of such DRM requirements should bear in mind that, by some counts, VW has shipped over 11 million cars with corrupt engine-control software. DRM has, in the end, enabled the crime it was meant to prevent, and on a far wider scale than would otherwise have been possible.
Cars are not the only vehicle (so to speak) for software that can hide user-hostile antifeatures. In the US, the Federal Communications Commission is currently pondering changes that would make it far harder to put free software onto WiFi devices. One need not even consider the damage such rules may do to free-software development, which has been the primary source of innovation and improvement in this area, to see where they could lead. We cannot expect corporations, many of which show levels of restraint inferior to that of a typical toddler, to resist the temptation to put spyware or malware into their widely distributed devices sitting in privileged positions on thousands of networks. We cannot really even trust them to adhere to the spectrum rules that are the motivation for the proposed restrictions; VW's lack of respect for emissions rules has made that clear.
Similar problems exist with voting machines, Internet-connected appliances, phone handsets, fitness monitors, set-top boxes, and more. Each of these devices is, at a minimum, in a position to spy on us. Keeping governmental fingers out of these devices is a challenge in its own right, but companies will often find a strong incentive to play games of their own. Companies that are struggling, or even those that fear a downturn in the next quarter's numbers, will often give in to that incentive; when all it takes is an easily hidden patch, why not?
This is hardly the first time somebody has pointed out that it is hard to see a solution that does not involve making those patches harder to hide. That, of course, means moving toward something that looks a lot like free software. If VW's engine-control software were open (with reproducible builds so that the software running in a specific car could be verified), it would have been far harder for the company to get away with violating the rules for as long as it did. Source availability is far from a guarantee that the code will be reviewed or that any reviewers will actually find deliberately introduced antifeatures, but it improves the odds considerably. Many a company might find the backbone to resist temptation if it knew that its code would be reviewed by sharp-eyed outsiders. Said companies might just find the wherewithal to clean up the code and fix some of their bugs as well.
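As a rough sketch of what that verification could look like, assuming a reproducible build and using invented file names, one could compare the digest of a firmware image dumped from a car against the image produced by building the published source:

```c
/*
 * Minimal sketch, assuming a reproducible build: compare the SHA-256 of a
 * firmware image read out of a car against the image built from the
 * published source.  File names are illustrative only.
 * Build with: cc verify.c -lcrypto
 */
#include <openssl/sha.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

static int sha256_file(const char *path, unsigned char out[SHA256_DIGEST_LENGTH])
{
	FILE *f = fopen(path, "rb");
	if (!f)
		return -1;

	SHA256_CTX ctx;
	SHA256_Init(&ctx);

	unsigned char buf[4096];
	size_t n;
	while ((n = fread(buf, 1, sizeof(buf), f)) > 0)
		SHA256_Update(&ctx, buf, n);

	fclose(f);
	SHA256_Final(out, &ctx);
	return 0;
}

int main(void)
{
	unsigned char dumped[SHA256_DIGEST_LENGTH], published[SHA256_DIGEST_LENGTH];

	if (sha256_file("firmware-dump.bin", dumped) ||
	    sha256_file("reproducible-build.bin", published)) {
		perror("sha256_file");
		return EXIT_FAILURE;
	}

	if (memcmp(dumped, published, SHA256_DIGEST_LENGTH) == 0)
		puts("firmware matches the published reproducible build");
	else
		puts("MISMATCH: the code in the car is not the code that was reviewed");
	return EXIT_SUCCESS;
}
```

A mismatch would not prove wrongdoing by itself, but it would be a strong signal that the code running in the car is not the code that was, or could have been, reviewed.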
A free-software mandate for safety-critical (and privacy-critical) software seems unlikely to happen anytime soon, alas. Decriminalizing research into how these systems operate might be a more achievable goal, but there are challenges there too; the Electronic Frontier Foundation has run into significant opposition in its efforts to get a ruling that investigating automotive software is not a violation of the anti-circumvention provisions of the US Digital Millennium Copyright Act, for example. Hidden, proprietary software gives a lot of power to those who control it; they will not give it up willingly. As a result, we can, unfortunately, expect to continue to be subjected to surveillance and criminal behavior from the devices that we think we own. We can't say we weren't warned.
