1. It doesn't need to know in the general case - if you know your target's crypto library then you just push a microcode update that recognises the specific series of instructions it executes in that path.
2. Secure Boot places the root of trust in the firmware, so yes, if you can't trust the firmware then you can't trust anything above that. But like I said, that's true even without Secure Boot.
3. There are at least 4 common firmware implementations that were independently developed. They're built with different compilers. This code is distributed to a much larger number of board vendors, each of whom then rebuilds it with their own choice of compiler.
Could a security agency compromise all of these? It's theoretically possible, but it doesn't seem like the easiest avenue of attack. The number of firmware implementations is larger than the number of operating systems that run on top of them - backdoor Windows and Linux and you have the same benefits for much less effort. That's not to say that firmware is secure and trustworthy, or that individuals won't be targeted, just that Secure Boot probably isn't the easiest avenue of attack.
4. Why would you do it that way? It'd be far too easy for a user to verify - remove the Microsoft key, check whether Windows boots. As you suggest, the obvious thing to do would be to have some additional embedded key that's checked regardless of whether or not the Microsoft key is present. So sure, removing the Microsoft key doesn't secure you against a security agency who's managed to compromise your firmware. But it *does* protect you against attacks where someone's found an exploit in something that was signed by Microsoft. Reducing your attack surface is an improvement in security.
5. So someone should just rebuild that source code and check whether it matches the binaries that Jetway ship. A match obviously doesn't guarantee the absence of a backdoor, but a mismatch would be a pretty strong indication that something's up.
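The comparison step described above is just hashing the vendor binary against a local rebuild. A minimal sketch (the filenames are hypothetical; real firmware images also embed build timestamps and vendor signatures, which would have to be normalised before a byte-for-byte comparison is meaningful):

```python
import hashlib

def sha256_of(path):
    """Return the SHA-256 hex digest of a file, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def builds_match(vendor_image, rebuilt_image):
    """Compare the vendor-shipped binary against a local rebuild."""
    return sha256_of(vendor_image) == sha256_of(rebuilt_image)
```

In practice this only works if the build is reproducible to begin with, which is exactly the point PaXTeam raises below.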
Posted Oct 1, 2013 21:14 UTC (Tue) by PaXTeam (subscriber, #24616)
1. matching the crypto library code (something microcode cannot do anyway) won't do any good because that code can occur in contexts other than your target sshd, it can be executed for arbitrary other purposes, and the CPU can't tell which is which without solving the halting problem.
second, if you have the ability to force arbitrary microcode updates on your target then you're already root there for all intents and purposes, so you have easier ways to backdoor their system.
2. but according to local wisdom here Secure Boot is supposed to give us, well, a secure boot process if only we used our own keys. turns out we're no better off than trusting trust once again.
3. have you seen the Snowden leaks? do you know how many companies got compromised and/or compelled into cooperation by the NSA and other agencies? so yes, i'm not exactly impressed by '4 common firmware implementations', it's a very *small* industry actually (and what does building with different compilers matter? nothing?). in fact, if i were the NSA, i would have my men planted there long ago and the 'how do i compromise them' question would simply become 'when and what do you want me to do?'. as for which is more numerous, i bet there're more different vmlinux and heck, even ntoskrnl images out there than UEFI firmware updates.
4. that's not how it'd work, obviously a Windows boot despite (supposedly) not having the MS key would be a dead giveaway. however it's possible to still accept code signed by the (supposedly removed) MS key if there's an additional condition - no need for an extra key at all, just some embedded secret that only the backdoor owner would know (and whoever else reverse engineers it of course). in fact MS would not even have to be complicit here, they'd just sign such an image in good faith without being aware of the secret payload that'd trigger the 'accept this despite being signed by the removed MS key' logic in the backdoored UEFI firmware.
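The hypothetical backdoor described above amounts to one extra OR-branch in the firmware's acceptance policy. A purely illustrative sketch (every name here is made up, the HMAC-tag trigger is just one of countless ways such a secret could be hidden, and real signature verification is elided - `signer` stands in for an already-verified signature):

```python
import hmac
import hashlib

# Hypothetical: a secret known only to the backdoor owner (and anyone
# who reverse engineers the firmware, as noted above).
BACKDOOR_SECRET = b"known-only-to-the-backdoor-owner"

def contains_trigger(image: bytes) -> bool:
    """Check for a secret-derived marker - here, an HMAC tag over the
    image body appended to the image. Illustrative only."""
    body, tag = image[:-32], image[-32:]
    expected = hmac.new(BACKDOOR_SECRET, body, hashlib.sha256).digest()
    return hmac.compare_digest(tag, expected)

def firmware_accepts(image: bytes, signer: str, enrolled_keys: set) -> bool:
    """Hypothetical backdoored Secure Boot policy: accept anything
    signed by an enrolled key as normal, but ALSO accept images signed
    by the supposedly removed MS key when they carry the trigger.
    'Remove the MS key and see whether Windows still boots' reveals
    nothing, because a stock Windows image carries no trigger."""
    if signer in enrolled_keys:
        return True
    if signer == "microsoft" and contains_trigger(image):
        return True
    return False
```

Note that, as the comment says, Microsoft need not be complicit: the triggering payload is in the signed image's content, which Microsoft would sign in good faith.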
as for 'an exploit in something that was signed by MS', you probably meant an exploitable bug rather than an actual exploit, as i find it unlikely that an actual exploit embedded in the to-be-signed code would pass their processes, whereas bugs (exploitable or not) slip by all the time. with this understanding it seems that the worry about the MS key is not that someone abuses it to sign something bad (which is what i was going on about before) but that otherwise well-meaning code gets signed and then exploited due to its bugs. this is a legitimate concern and removing the MS key would indeed help here... except this problem applies equally to any other key, and considering where MS stands with its SDLC and other processes in the industry (read: mostly above everyone else), i think the users are worse off trusting anybody else's keys, key signing processes and software development capabilities than those of MS. perhaps a sad piece of truth for free software but as far as i'm concerned, this is the reality. so if the advice here is still that users should become their own CA and use their own keys for Secure Boot then my bet is that the likes of the NSA (and even less resourceful actors) will still have a field day owning their systems.
5. just a few points above you were bringing up different compilers as one of the reasons why universal backdooring would be so much harder for powerful and skilled actors (it isn't, but that's not the point here), and now you're suggesting that much less resourceful end users should try to gain confidence in their firmware by second-guessing the exact toolchain and build environment their vendor used. sorry, this doesn't add up to a good argument ;).
NVIDIA to provide documentation for Nouveau
Posted Oct 1, 2013 22:23 UTC (Tue) by raven667 (subscriber, #5198)
You guys seem to be talking about two different things and I'm not sure any effective communication is going on here. On one hand you have a general statement that firmware, especially firmware that can be updated, can harbor persistent threats, backdoors, etc. EFI provides standard capabilities, like network support, which may make these kinds of threats easier to design or more useful. The risks of these kinds of threats, though, are unchanged whether SecureBoot(tm) exists or not, so roping Secure Boot into the discussion muddies the waters between two different threats/risks and two different responses.
NVIDIA to provide documentation for Nouveau
Posted Oct 2, 2013 18:28 UTC (Wed) by mjg59 (subscriber, #23239)
I do wonder if we're somehow talking past one another. We appear to agree that there's no fundamental reason to trust firmware - it's possible that it's deliberately backdoored, and it's almost certainly buggy in exploitable ways. When we build any kind of secure system we have to assume that the firmware isn't actively malicious, in the same way that we have to assume that everything else under our stack isn't actively malicious.
But Secure Boot isn't about protecting us from the firmware. It never has been. It's about limiting the set of objects that your firmware will run. Now obviously if a sufficiently powerful actor has leaned on your firmware vendor then they may be able to run arbitrary code on your firmware, but why bother? They could just have the firmware include some SMM code that'd trigger in specific circumstances and modify arbitrary addresses in your running OS.
Obviously Secure Boot does nothing to protect you against such actors, but that doesn't mean it adds nothing to security. Microsoft have signed literally hundreds of binaries. Fedora have signed significantly fewer than that, and all the ones signed by Fedora have also been signed by Microsoft. Removing the Microsoft key and only trusting the Fedora one clearly improves security, if only because you'll no longer be able to boot the Ubuntu grub that'll happily boot unsigned kernels. Perhaps you weren't aware that Microsoft is effectively the global signing authority for UEFI binaries?
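The attack-surface argument above is essentially set arithmetic: the binaries your firmware will boot is the union of everything each enrolled key has signed, and since (per the comment) everything Fedora has signed is also signed by Microsoft, trusting only Fedora strictly shrinks that set. A toy illustration with made-up counts and names:

```python
# Hypothetical signing data illustrating the argument; the counts and
# binary names are invented, not real signing records.
signed_by = {
    "microsoft": {f"ms-binary-{i}" for i in range(300)}
                 | {"fedora-shim", "ubuntu-grub"},
    "fedora": {"fedora-shim"},  # everything here is MS-signed too
}

def bootable(trusted_keys):
    """Union of all binaries signed by any trusted key."""
    out = set()
    for key in trusted_keys:
        out |= signed_by.get(key, set())
    return out
```

With only the Fedora key enrolled, the Ubuntu grub that happily boots unsigned kernels is no longer bootable, which is exactly the attack-surface reduction being described.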