
Integrity and embedded devices


By Jake Edge
October 2, 2013
Linux Security Summit

David Safford's talk for the 2013 Linux Security Summit was in two parts—with two separate sets of slides. That's because the US Department of Homeland Security (DHS), which sponsored IBM's work on hardware roots of trust for embedded devices—part one of the talk—was quite clear that it didn't want to be associated with any kind of device cracking. So part two, which concerned circumventing "verified boot" on a Samsung ARM Chromebook, had to be a completely separate talk. The DHS's misgivings notwithstanding, the two topics are clearly related; understanding both leads to a clearer picture of the security of our devices.

The DHS is interested in what can be done to verify the integrity of the code that is running on all types of systems. For servers, desktops, and mobile devices, there are a variety of existing solutions: crypto hardware, secure boot, Trusted Platform Module (TPM) hardware, and so on. But for low-cost embedded devices like home routers, there is no support for integrity checking. The DHS is also interested in integrity checking for even lower cost sensors, Safford said, but those don't run Linux so they weren't part of his investigations.

Because routers and the like are such low-cost and low-margin systems, the researchers focused on what could be done at "zero cost" to the manufacturer. The team looked at four devices: the venerable Linksys WRT54G router (which spawned projects like OpenWrt and DD-WRT), the Pogoplug local network attached storage (NAS) cache, and two other router devices, the TP-Link MR3020 and D-Link DIR-505. All of those typically retail for $50 or less.

The MR3020, which was the focus for much of the talk, retails for $30. It has very limited space in its 4M flash chip, so the challenge in adding integrity features is "not so much technical" as it is in "squeezing things in" to the flash. For example, the MR3020 only has 64K of space for U-Boot, 1M for the kernel, and 2.8M for the root filesystem.

Safford gave a handful of examples of router vulnerabilities over the last few years. Beyond just theoretical examples, he noted that 4.5 million home routers were actually compromised in Brazil in 2011. While the vulnerabilities he listed were at the higher levels (typically the web interface), they do show that these embedded devices are most certainly targets.

So, without increasing the cost of these devices, what can be done to ensure that the firmware is what it is expected to be? In the supply chain, the wrong firmware could be added when the system is built or changed somewhere along the way. Safford said that IBM had some (unnamed) customers who had run into just this kind of problem.

So there needs to be a way to "measure" the firmware's integrity in hardware and then to lock the firmware down so that rootkits or other malware cannot modify it. In addition, these devices typically do not have support for signed updates, so malicious update files can be distributed and installed. There is also no ability for the system to refuse to boot if the code has been changed (i.e. secure or trusted boot).

Providing those capabilities was the goal for the project, he said. He showed a table (also present in his slides [PDF] and associated paper [PDF]) outlining the abilities of each of the devices in four separate integrity categories: "Measure BIOS?", "Lock BIOS?", "Secure local updates?", and "Secure Boot?". All of the boxes were "No", except that both the Pogoplug and WRT54G had a way to measure—verify—the firmware (by reading it using SATA via an immutable boot ROM and JTAG, respectively). By the end of the talk, all of those boxes had been turned to "Yes" by the modifications Safford and his team had made.

The traditional approaches for integrity revolve around either attestation (e.g. trusted boot) or a trusted chain of signed code as in UEFI secure boot. Attestation means that a system uses a TPM to measure everything read and executed, then sends that information to a trusted system for verification before being allowed to continue. There are several National Institute of Standards and Technology (NIST) standards that govern parts of the integrity puzzle, including trusted boot, but there are none, at least yet, that govern secure boot. Safford is working with NIST to get that process started.

Since a TPM chip is "expensive" ($0.75), it violates the zero-cost constraint. But in order to verify the firmware, it must be read somehow. The firmware itself cannot be involved in that step as it may lie about its contents to avoid malware detection. The Serial Peripheral Interface (SPI) bus provides a mechanism to read the contents of the flash for devices lacking other means (e.g. JTAG). That bus can be shared if it has proper buffering, but both the MR3020 and DIR-505 lack the resistors needed.

[Bus Pirate on MR3020]

Enter the Bus Pirate—a device that can be used to read the SPI bus. Using it requires adding three buffering resistors between the Atheros system-on-chip (SoC) and the flash, but that adds less than $0.01 to the cost of the device, which is close enough to zero cost that device makers can probably be convinced. That means that users (or device makers) can verify the contents of the flash fairly inexpensively (a Bus Pirate costs around $30).
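
A dumped image is only useful if there is something to compare it against, and the comparison itself is simple. The following sketch (not part of Safford's toolchain; the file names are hypothetical) hashes an image dumped over SPI and compares it against a known-good "golden" copy:

    # Minimal sketch: compare a flash image dumped over the SPI bus (for
    # example with a Bus Pirate) against a known-good reference image.
    # This only illustrates the "measure" step; it is not part of
    # Safford's toolchain and the file names are hypothetical.
    import hashlib
    import sys

    def sha256_of(path):
        """Return the SHA-256 digest of a file, read in chunks."""
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(65536), b""):
                h.update(chunk)
        return h.hexdigest()

    if __name__ == "__main__":
        dump, golden = sys.argv[1], sys.argv[2]  # e.g. mr3020-dump.bin mr3020-golden.bin
        if sha256_of(dump) == sha256_of(golden):
            print("flash contents match the golden image")
        else:
            print("MISMATCH: firmware differs from the golden image")
            sys.exit(1)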

Once the contents of the flash are verified, there needs to be a way to lock it down so that it can only be modified by those verified to be physically present. The SPI flash chips used by all of the devices have a status register that governs which addresses in the flash can be written, along with an overall write-disable bit. That register can be locked from any updates by holding the chip's write-protect (!WP) pin low. Physical presence can be proved by holding down a button at boot to drive !WP high.

Safford showed the modifications made to the MR3020 and DIR-505 to support the physical presence test. The WPS (Wi-Fi Protected Setup) button was repurposed on the MR3020, while an unused sliding switch position was used on the DIR-505. The paper indicates that similar changes were made on the other two devices. Both the slides and paper have pictures of the modifications made to the devices. In addition, U-Boot was modified so that it locks the entire flash on each boot, but if !WP is held high when power is applied, U-Boot will unlock the flash.
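
The details differ from chip to chip, but the boot-time logic is roughly what the sketch below shows. The register layout here (BP0-BP2 block-protect bits plus an SRP bit) follows a common Winbond-style SPI NOR part, which may not match the parts in these routers, and the helper functions are hypothetical stand-ins for the real SPI flash driver calls in U-Boot:

    # Sketch of the lock-on-every-boot logic described above.  The bit
    # positions assume a Winbond-style SPI NOR status register and may
    # not match the actual parts; spi_write_status() and button_pressed()
    # are hypothetical stand-ins for the real U-Boot driver calls.
    SR_BP0 = 1 << 2   # block-protect bits; with all three set, the whole
    SR_BP1 = 1 << 3   # array is write-protected
    SR_BP2 = 1 << 4
    SR_SRP = 1 << 7   # with SRP set and !WP held low, the status register
                      # itself can no longer be changed from software

    PROTECT_ALL = SR_BP0 | SR_BP1 | SR_BP2 | SR_SRP

    def lock_or_unlock_at_boot(spi_write_status, button_pressed):
        """Run early in the boot loader: unlock only if the user proves
        physical presence by holding the repurposed button, which drives
        !WP high; otherwise write-protect the entire flash."""
        if button_pressed():
            spi_write_status(0)            # clear BP/SRP: flash is writable
        else:
            spi_write_status(PROTECT_ALL)  # lock everything; with !WP low,
                                           # this cannot be undone in software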

Adding secure boot support to U-Boot was the next step. Once the root of trust is extended into the kernel, the kernel's integrity subsystem can take over to handle integrity verification from there. So it is a matter of verifying the kernel itself. The modified U-Boot will use a public key that is stored at the end of its partition to verify the signature of the kernel. That signature is stored at the end of the kernel partition.
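
In outline, the check looks something like the sketch below. It uses the Python cryptography package rather than the PolarSSL-derived code in the modified U-Boot, and the signature size, padding, and hash are assumptions rather than details from Safford's implementation; the point is just the structure: split the signature off the end of the partition and verify it with the stored public key.

    # Illustration of the verification scheme described above: a public
    # key stored alongside the boot loader checks an RSA signature that
    # is appended to the kernel partition.  The 'cryptography' package,
    # the 256-byte (2048-bit RSA) signature size, and the PKCS#1 v1.5 /
    # SHA-256 choices are assumptions for this sketch, not details taken
    # from the modified U-Boot.
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives import hashes, serialization
    from cryptography.hazmat.primitives.asymmetric import padding

    SIG_LEN = 256  # assumed signature size at the end of the partition

    def verify_kernel(kernel_partition_path, pubkey_pem_path):
        with open(kernel_partition_path, "rb") as f:
            blob = f.read()
        image, signature = blob[:-SIG_LEN], blob[-SIG_LEN:]

        with open(pubkey_pem_path, "rb") as f:
            pubkey = serialization.load_pem_public_key(f.read())

        try:
            pubkey.verify(signature, image, padding.PKCS1v15(), hashes.SHA256())
            return True        # boot the kernel
        except InvalidSignature:
            return False       # refuse to boot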

As mentioned earlier, the trick is in getting that code (and key) to fit into the 64K U-Boot partition. Using code derived from PolarSSL, with everything unneeded removed, the modified U-Boot weighed in at 62K. Though Safford was never very specific, the U-Boot modifications must also provide RSA signature checking for updates to the firmware. Support for signed updates is one of the integrity requirements that were successfully tackled by the project.

Through some effectively zero-cost modifications, and some changes to the firmware, the team was able to achieve its integrity goals. All of the devices now support all four of the integrity requirements they set out to fulfill.

Breaking verified boot

Moving on to the DHS-unapproved portion of the talk, Safford showed how one can take control of a Samsung ARM Chromebook. The work on that was done in his spare time, he said, but many of the tools used for adding secure boot to embedded devices are the same as those used to remove or alter secure boot on a system that already has it. The Chromebook is a "very secure" system, but the verified boot (VB) mechanism does not allow users to take control of the boot process.

However, a fairly simple hardware modification (removing a washer to change the !WP signal) will allow the owner to take control of the device, as Safford found. Beyond the hardware change, it also requires some scripts and instructions [tar.gz] that Safford wrote. Unlike the embedded devices described above, there is a full 4M flash just for U-Boot on the Chromebook, so there is "an embarrassment of riches" for adding code on those systems. VB has been added to the U-Boot upstream code, incidentally, but it is way too large (700K) for use in routers, he said.

In the normal VB operation, there is no way to write to the upper half of the SPI flash, which contains a copy of U-Boot and Google's root key. That key is used to verify two keys (firmware and kernel) stored in the read-write half of the SPI flash. The firmware key is used to verify another copy of U-Boot that lives in the modifiable portion of the flash. That U-Boot is responsible for verifying the kernel (which actually lives in a separate MMC flash) before booting it.
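
That chain of trust can be summarized in a few lines of Python. The sketch below is not Google's vboot code, just the structure described above; rw_flash and mmc are plain dictionaries, and verify() stands in for a real signature check like the one sketched earlier.

    # Structure of the Chromebook verified-boot chain as described above.
    # This is not Google's vboot implementation; rw_flash and mmc are
    # simple dictionaries and verify(key, data, sig) is a stand-in for a
    # real RSA signature check.
    def verified_boot(root_key, rw_flash, mmc, verify):
        # 1. The root key, in the read-only half of the SPI flash, checks
        #    the two updatable keys in the read-write half.
        if not verify(root_key, rw_flash["firmware_key"], rw_flash["firmware_key_sig"]):
            raise SystemExit("bad firmware key")
        if not verify(root_key, rw_flash["kernel_key"], rw_flash["kernel_key_sig"]):
            raise SystemExit("bad kernel key")

        # 2. The firmware key checks the read-write copy of U-Boot.
        if not verify(rw_flash["firmware_key"], rw_flash["u_boot"], rw_flash["u_boot_sig"]):
            raise SystemExit("bad read-write U-Boot")

        # 3. That U-Boot then checks the kernel stored on the MMC flash
        #    before handing control to it.
        if not verify(rw_flash["kernel_key"], mmc["kernel"], mmc["kernel_sig"]):
            raise SystemExit("kernel signature check failed")

        return mmc["kernel"]   # safe to boot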

Holding down ESC and "refresh" while powering on the system will boot whatever kernel is installed, without checking the signatures. That is the "developer mode" for the system, but it circumvents secure boot, which is not what Safford set out to do. He wants to use secure boot but control the keys himself. In addition, developer mode must be enabled each time the system boots and you get a "scary screen" that says "OS verification is OFF".

A less scary approach is to use a non-verifying U-Boot that gets installed in place of the kernel. That U-Boot is signed, but does no verification of the kernel (installed in a different part of the MMC flash) before booting it. That way you don't have to invoke developer mode, nor do you get the scary screen, but you still don't get secure boot.

[ARM Chromebook washer]

Removing the washer is the way forward as it allows read-write access to the entire SPI flash. Once that is done, Safford has a set of scripts that can be run from a developer-mode kernel to create new key pairs, sign the read-write U-Boot, sign the kernel, and verify all of the signatures. If any of that goes wrong, one may end up at the "Chrome OS is missing or damaged" screen, which actually means the device hasn't been "bricked" and can be restored from a USB device. Even in the event of bricking, one can recover the device using a Bus Pirate, as the SPI flash is properly buffered, he said (seemingly from a fair amount of experience).
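
Conceptually, the owner's side of that process is just key generation plus signing, the mirror image of the verification sketched earlier. The sketch below is not derived from Safford's scripts; the file names and key size are assumptions, and it again leans on the Python cryptography package.

    # Owner-controlled signing: generate a key pair, then sign an image
    # and append the signature, matching the append-the-signature layout
    # in the verification sketch above.  Not derived from Safford's
    # scripts; file names and the 2048-bit key size are assumptions.
    from cryptography.hazmat.primitives import hashes, serialization
    from cryptography.hazmat.primitives.asymmetric import padding, rsa

    def generate_keypair(priv_path="owner_key.pem", pub_path="owner_key.pub.pem"):
        key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
        with open(priv_path, "wb") as f:
            f.write(key.private_bytes(
                serialization.Encoding.PEM,
                serialization.PrivateFormat.PKCS8,
                serialization.NoEncryption()))
        with open(pub_path, "wb") as f:
            f.write(key.public_key().public_bytes(
                serialization.Encoding.PEM,
                serialization.PublicFormat.SubjectPublicKeyInfo))
        return key

    def sign_image(key, image_path, signed_path):
        with open(image_path, "rb") as f:
            image = f.read()
        sig = key.sign(image, padding.PKCS1v15(), hashes.SHA256())
        with open(signed_path, "wb") as f:
            f.write(image + sig)   # signature appended to the image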

As part of his demo, he wanted an easy way to show that he had gained control of the low-level boot code in the SPI flash. He decided to change the "chrome" text in the upper left of the "scary screen" to "DaveOS", which actually turned out to be one of the harder ways to demonstrate it. Because of the format of the logo and where it was stored in the flash, it turned out to be rather painful to change, he said with a chuckle.

As Kees Cook pointed out, the washer-removal trick was a deliberate choice in the design of the system. Google and Samsung wanted people to be able to take control of the keys for the device, but didn't want an attacker to be able to do so quickly while the user was momentarily distracted. Safford agreed that it was a reasonable compromise, but said that it is important for users to be able to set their own keys.

The slides [PDF] for the second half of the talk are instructive as well, with a number of pictures of the infamous washer, the scary and DaveOS screens, the Bus Pirate in action, and so on. Seeing the problem from both angles, adding and subtracting secure boot functionality, was useful for better understanding integrity verification. Techniques like secure boot can certainly be used in user-unfriendly ways to lock down devices, but they can also provide some amount of peace of mind. As long as users can provide their own keys—or disable the feature entirely—secure boot is likely to be a boon for many.

[I would like to thank LWN subscribers for travel assistance to New Orleans for LSS.]



Integrity and embedded devices

Posted Oct 2, 2013 21:09 UTC (Wed) by kees (subscriber, #27264) [Link]

As mentioned, replacing the firmware with your own (or just the keys) is a specific goal of Chrome OS. Some more details on this, and a lot about coreboot, are here, from OSCON 2013:

https://docs.google.com/presentation/d/1eGPMu03vCxIO0a3oN...

Integrity and embedded devices

Posted Oct 4, 2013 11:52 UTC (Fri) by dpquigl (guest, #52852) [Link]

I agree. The Chromebooks are supposed to be open to hacking by their owners, including flashing the firmware that performs the secure boot process. The article above mentions that you need to remove a washer to change a signal state from the hardware. If you look at the document that Kees posted (I was at that workshop at OSCON and saw them rip apart one of these Chromebooks), it is not a simple matter to "bypass" verified boot. For the Asus C7 they gave us in the class, you have to rip the entire thing apart, front and back, before you can flash the firmware with a new image.

Integrity and embedded devices

Posted Oct 2, 2013 22:19 UTC (Wed) by jimparis (subscriber, #38647) [Link]

> Safford gave a handful of examples of router vulnerabilities over the last few years. Beyond just theoretical examples, he noted that 4.5 million home routers were actually compromised in Brazil in 2011. While the vulnerabilities he listed were at the higher levels (typically the web interface), they do show that these embedded devices are most certainly targets.

Is there an example of the type of attack that this work would prevent? Securely verifying the flash contents (albeit with external hardware -- how is that anything new?) certainly wouldn't have stopped any of the example attacks in the slides. Nor would write-protecting the flash, nor verifying a kernel signature. When vulnerabilities are a dime a dozen and remotely exploitable, it's not like replacing the flash contents was ever important to an attacker.

Integrity and embedded devices

Posted Oct 3, 2013 11:38 UTC (Thu) by safforddr (subscriber, #81020) [Link]

Locking the flash as shown does not fix the vulnerabilities, but it does prevent any exploits from changing any code or configuration data. The Brazil attacks changed the DNS configuration (stored in the flash) to point to malicious servers, and this would have been blocked.

Integrity and embedded devices

Posted Oct 3, 2013 13:54 UTC (Thu) by jimparis (subscriber, #38647) [Link]

But changing the DNS configuration is something that needs to be done anyway, as part of normal operation. Do you propose disabling that, and requiring the physical presence test (overriding /WP) for all reconfiguration? What's to prevent the attacker from getting in while /WP is overridden?

Integrity and embedded devices

Posted Oct 3, 2013 21:34 UTC (Thu) by jmorris42 (guest, #2203) [Link]

If you are like most people, you configure your home gateway and then don't touch it for months or years. Yes, you could be infected in that brief window while doing the initial setup, but this is still a 99% fix.

Integrity and embedded devices

Posted Oct 3, 2013 21:47 UTC (Thu) by jimparis (subscriber, #38647) [Link]

> this is still a 99% fix

But what is "this fix"? What does this work offer, that you couldn't already do? If your goal is to prevent the flash from changing, and you're OK with configuration being read-only after the initial setup, all you have to do is set the flash read-only with the software latch after configuration, and do the same at all subsequent boots. If the user wants to reconfigure, they hold the button at the back which disables the protection and clears the configuration.

I guess I just don't understand the point of this presentation. Of course you can read a flash chip with external hardware to verify its contents, that's nothing new. And sure, now u-boot can verify a kernel signature, but who cares? Which scenario had an attacker able to reflash the kernel but not reflash u-boot? Just make the kernel read-only in the same way that you made u-boot read-only. It's not like anyone updates it. Verifying a kernel signature does nothing to stop any of the real-world attacks against routers, anyway.

Integrity and embedded devices

Posted Oct 4, 2013 14:03 UTC (Fri) by safforddr (subscriber, #81020) [Link]

The point of the article is that you can't do the things you describe (lock/unlock the flash, read the flash) with the existing devices, because the vendors don't bother to make it possible, even though it would cost them nothing.

You can't physically lock and unlock the flash unless they connect the !WP pin to an appropriate switch. You can't verify the flash contents without unsoldering the flash from the board, unless they buffer the SPI bus.

I have discussed this and other security options with device vendors, and they use the excuses that it would cost too much, and no customers care. Hopefully this can stimulate some discussion there.

As for secure boot (validating the signature of the kernel), this makes updating more convenient - you can leave the kernel and rootfs writeable, and easily updated, so long as the u-boot based signature checking is locked. I have had several embedded devices which required firmware updates - it seems typical for devices with buggy firmware to be rushed out the door, followed by an update which actually works 3-6 months later. I even had a "smart" TV which frequently locked up until they shipped out a patch 4 months after I bought it. In addition, secure boot defends against malicious firmware updates, although that's not yet a problem in the wild.

