David Safford's talk for the 2013 Linux Security Summit was in two parts—with two separate sets of slides. That's because the US Department of Homeland Security (DHS), which sponsored IBM's work on hardware roots of trust for embedded devices—part one of the talk—was quite clear that it didn't want to be associated with any kind of device cracking. So part two, which concerned circumventing "verified boot" on a Samsung ARM Chromebook, had to be a completely separate talk. The DHS's misgivings notwithstanding, the two topics are clearly related; understanding both leads to a clearer picture of the security of our devices.
The DHS is interested in what can be done to verify the integrity of the code that is running on all types of systems. For servers, desktops, and mobile devices, there are a variety of existing solutions: crypto hardware, secure boot, Trusted Platform Module (TPM) hardware, and so on. But for low-cost embedded devices like home routers, there is no support for integrity checking. The DHS is also interested in integrity checking for even lower cost sensors, Safford said, but those don't run Linux so they weren't part of his investigations.
Because routers and the like are such low-cost and low-margin systems, the researchers focused on what could be done for "zero cost" to the manufacturer. The team looked at four devices: the venerable Linksys WRT54G router (which spawned projects like OpenWrt and DD-WRT), the Pogoplug local network attached storage (NAS) cache, and two other router devices, the TP-Link MR3020 and D-Link DIR-505. All of those typically retail for $50 or less.
The MR3020, which was the focus for much of the talk, retails for $30. It has very limited space in its 4M flash chip, so the challenge to adding integrity features is "not so much technical" as it is in "squeezing things in" to the flash. For example, the MR3020 only has 64K of space for U-Boot, 1M for the kernel, and 2.8M for the root filesystem.
Safford gave a handful of examples of router vulnerabilities over the last few years. Beyond just theoretical examples, he noted that 4.5 million home routers were actually compromised in Brazil in 2011. While the vulnerabilities he listed were at the higher levels (typically the web interface), they do show that these embedded devices are most certainly targets.
So, without increasing the cost of these devices, what can be done to ensure that the firmware is what it is expected to be? In the supply chain, the wrong firmware could be added when the system is built or changed somewhere along the way. Safford said that IBM had some (unnamed) customers who had run into just this kind of problem.
So there needs to be a way to "measure" the firmware's integrity in hardware and then to lock the firmware down so that rootkits or other malware cannot modify it. In addition, these devices typically do not have support for signed updates, so malicious update files can be distributed and installed. There is also no ability for the system to refuse to boot if the code has been changed (i.e. secure or trusted boot).
Providing those capabilities was the goal for the project, he said. He showed a table (also present in his slides [PDF] and associated paper [PDF]) outlining the abilities of each of the devices in four separate integrity categories: "Measure BIOS?", "Lock BIOS?", "Secure local updates?", and "Secure Boot?". All of the boxes were "No", except that both the Pogoplug and WRT54G had a way to measure—verify—the firmware (by reading it using SATA via an immutable boot ROM and JTAG, respectively). By the end of the talk, those boxes had all been changed to "Yes" by the work Safford and his team had done.
The traditional approaches for integrity revolve around either attestation (e.g. trusted boot) or a trusted chain of signed code as in UEFI secure boot. Attestation means that a system uses a TPM to measure everything read and executed, then sends that information to a trusted system for verification before being allowed to continue. There are several National Institute of Standards and Technology (NIST) standards that govern parts of the integrity puzzle, including trusted boot, but there are none, at least yet, that govern secure boot. Safford is working with NIST to get that process started.
Since a TPM chip is "expensive" ($0.75), it violates the zero-cost constraint. But in order to verify the firmware, it must be read somehow. The firmware itself cannot be involved in that step as it may lie about its contents to avoid malware detection. The Serial Peripheral Interface (SPI) bus provides a mechanism to read the contents of the flash for devices lacking other means (e.g. JTAG). That bus can be shared if it has proper buffering, but both the MR3020 and DIR-505 lack the resistors needed.
Enter the Bus Pirate—a device that can be used to read the SPI bus. Using it requires adding three buffering resistors to the Atheros System-on-Chip (SoC) used by the devices, but that adds less than $0.01 to the cost of the device, which is close enough to zero cost that device makers can probably be convinced. That means that users (or device makers) can verify the contents of the flash fairly inexpensively (a Bus Pirate costs around $30).
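Once a raw dump of the flash has been read out over SPI, the "measure" step reduces to comparing the dump against a known-good image. A minimal sketch of that comparison in Python (the `measure_firmware` helper and the choice of SHA-256 are illustrative assumptions, not the team's actual tooling):

```python
import hashlib

def measure_firmware(dump_path: str, golden_sha256: str) -> bool:
    """Compare a flash dump (e.g. read over SPI with a Bus Pirate)
    against a known-good SHA-256 hash of the expected firmware."""
    h = hashlib.sha256()
    with open(dump_path, "rb") as f:
        # Hash in chunks so even a multi-megabyte dump uses little memory.
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest() == golden_sha256
```

The crucial property is that the dump comes from the bus, not from code running on the device, so compromised firmware cannot lie about its own contents.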
Once the contents of the flash are verified, there needs to be a way to lock it down so that it can only be modified by those verified to be physically present. The SPI flash chips used by all of the devices have a status register that governs which addresses in the flash can be written, along with an overall write-disable bit. That register can be locked from any updates by holding the chip's write-protect (!WP) pin low. Physical presence can be proved by holding down a button at boot to drive !WP high.
Safford showed the modifications made to the MR3020 and DIR-505 to support the physical presence test. The WPS (Wireless Protected Setup) button was repurposed on the MR3020, while an unused sliding switch position was used on the DIR-505. The paper indicates that similar changes were made on the other two devices. Both the slides and paper have pictures of the modifications made to the devices. In addition, U-Boot was modified so that it locks the entire flash on each boot, but if !WP is held high when power is applied, U-Boot will unlock the flash.
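The lock-at-boot behavior just described can be illustrated with a toy model of the status register. The bit names below follow common SPI NOR parts such as the Winbond W25Q series; the exact layout varies by vendor, and this is only a sketch of the logic, not the project's code:

```python
# Toy model of an SPI NOR flash status register, showing how holding
# the /WP pin low hard-locks the register once SRP is set.
class SpiNorFlash:
    SRP = 0x80      # status-register-protect bit
    BP_ALL = 0x1C   # block-protect bits set to cover the whole array

    def __init__(self):
        self.status = 0
        self.wp_pin_high = True  # /WP high = status register writable

    def write_status(self, value: int) -> bool:
        # With SRP set and /WP held low, software cannot change the
        # register, so the block-protect bits cannot be cleared.
        if (self.status & self.SRP) and not self.wp_pin_high:
            return False
        self.status = value
        return True

    def array_writable(self) -> bool:
        return (self.status & self.BP_ALL) != self.BP_ALL
```

On each boot the firmware would set SRP plus all of the block-protect bits; with /WP pulled low by default, only someone physically holding the repurposed button (driving /WP high) can unlock the flash again.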
Adding secure boot support to U-Boot was the next step. Once the root of trust is extended into the kernel, the kernel's integrity subsystem can take over to handle integrity verification from there. So it is a matter of verifying the kernel itself. The modified U-Boot will use a public key that is stored at the end of its partition to verify the signature of the kernel. That signature is stored at the end of the kernel partition.
As mentioned earlier, the trick is in getting that code (and key) to fit into the 64K U-Boot partition. Using code derived from PolarSSL, with everything unneeded removed, the modified U-Boot weighed in at 62K. Though Safford was never very specific, the U-Boot modifications must also provide RSA signature checking for updates to the firmware. Support for signed updates is one of the integrity requirements that were successfully tackled by the project.
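The layout and check can be sketched with a toy example: the public key lives at the end of the U-Boot partition, the signature at the end of the kernel partition, and boot proceeds only if the signature verifies. The textbook RSA below uses a deliberately tiny demo key and no padding purely for illustration; the real implementation used full-size keys and PolarSSL-derived code:

```python
import hashlib

# Demo-only RSA key (n, e, d) -- hopelessly insecure, for illustration.
N, E, D = 3233, 17, 2753

def sign_kernel(kernel: bytes) -> bytes:
    """Sign the kernel hash; the result is appended to the partition."""
    h = int.from_bytes(hashlib.sha256(kernel).digest(), "big") % N
    return pow(h, D, N).to_bytes(2, "big")

def verify_kernel(partition: bytes) -> bool:
    """Check the signature stored at the end of the (toy) partition."""
    kernel, sig = partition[:-2], partition[-2:]
    h = int.from_bytes(hashlib.sha256(kernel).digest(), "big") % N
    return pow(int.from_bytes(sig, "big"), E, N) == h
```

A modified U-Boot doing this check only needs the public part of the key, so possession of the device does not allow forging new signed kernels.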
Through some effectively zero-cost modifications, and some changes to the firmware, the team was able to achieve its integrity goals. All of the devices now support all four of the integrity requirements they set out to fulfill.
Moving on to the DHS-unapproved portion of the talk, Safford showed how one can take control of a Samsung ARM Chromebook. The work on that was done in his spare time, he said, but many of the tools used for adding secure boot for embedded devices are the same as those for removing and altering a system with secure boot. The Chromebook is a "very secure" system, but the verified boot (VB) mechanism does not allow users to take control of the boot process.
However, a fairly simple hardware modification (removing a washer to change the !WP signal) will allow the owner to take control of the device, as Safford found. Beyond the hardware change, it also requires some scripts and instructions [tar.gz] that Safford wrote. Unlike the embedded devices described above, there is a full 4M flash just for U-Boot on the Chromebook, so there is "an embarrassment of riches" for adding code on those systems. VB has been added to the U-Boot upstream code, incidentally, but it is way too large (700K) for use in routers, he said.
In the normal VB operation, there is no way to write to the upper half of the SPI flash, which contains a copy of U-Boot and Google's root key. That key is used to verify two keys (firmware and kernel) stored in the read-write half of the SPI flash. The firmware key is used to verify another copy of U-Boot that lives in the modifiable portion of the flash. That U-Boot is responsible for verifying the kernel (which actually lives in a separate MMC flash) before booting it.
Holding down ESC and "refresh" while powering on the system will boot whatever kernel is installed, without checking the signatures. That is the "developer mode" for the system, but it circumvents secure boot, which is not what Safford set out to do. He wants to use secure boot but control the keys himself. In addition, developer mode must be enabled each time the system boots and you get a "scary screen" that says "OS verification is OFF".
A less scary approach is to use a non-verifying U-Boot that gets installed in place of the kernel. That U-Boot is signed, but does no verification of the kernel (installed in a different part of the MMC flash) before booting it. That way you don't have to invoke developer mode, nor do you get the scary screen, but you still don't get secure boot.
Removing the washer is the way forward as it allows read-write access to the entire SPI flash. Once that is done, Safford has a set of scripts that can be run from a developer-mode kernel to create new key pairs, sign the read-write U-Boot, sign the kernel, and verify all of the signatures. If any of that goes wrong, one may end up at the "Chrome OS is missing or damaged" screen, which actually means the device hasn't been "bricked" and can be restored from a USB device. Even in the event of bricking, one can recover the device using a Bus Pirate, as the SPI flash is properly buffered, he said (seemingly from a fair amount of experience).
As part of his demo, he wanted an easy way to show that he had gained control of the low-level boot code in the SPI flash. He decided to change the "chrome" text in the upper left of the "scary screen" to "DaveOS", which actually turned out to be one of the harder ways to demonstrate it. Because of the format of the logo and where it was stored in the flash, it turned out to be rather painful to change, he said with a chuckle.
As Kees Cook pointed out, the washer removal trick was a deliberate choice in the design of the system. Google and Samsung wanted people to be able to take control of the keys for the device, but didn't want an attacker to be able to do so quickly while the user's attention was momentarily distracted. Safford agreed that it was a reasonable compromise, but that it is important for users to be able to set their own keys.
The slides [PDF] for the second half of the talk are instructive as well, with a number of pictures of the infamous washer, the scary and DaveOS screens, the Bus Pirate in action, and so on. Seeing the problem from both angles, adding and subtracting secure boot functionality, was useful to help better understand integrity verification. Techniques like secure boot certainly can be used in user-unfriendly ways to lock down devices, but they can also provide some amount of peace of mind. As long as users can provide their own keys—or disable the feature entirely—secure boot is likely to be a boon for many.
[I would like to thank LWN subscribers for travel assistance to New Orleans for LSS.]
Copyright © 2013, Eklektix, Inc.
This article may be redistributed under the terms of the Creative Commons CC BY-SA 4.0 license
Comments and public postings are copyrighted by their creators.
Linux is a registered trademark of Linus Torvalds