LWN: Comments on "Adding a signing key to RPM"
https://lwn.net/Articles/298400/
This is a special feed containing comments posted to the individual LWN article titled "Adding a signing key to RPM".

Validity checking - should be mandatory?
https://lwn.net/Articles/298959/
Posted by kripkenstein on Thu, 18 Sep 2008 04:51:55 +0000:

That makes sense, but I think I saw installation begin even though some of the downloaded files were corrupt. That is, apt-get may verify the signatures after downloading, but it doesn't appear to first verify them all and begin installation only if they all pass (or, at least, verify that all of a package's dependencies have been verified).

That, or I really misanalyzed what was going on in that system; that's possible. But I did see that the downloaded files were corrupt, and that installation did begin and only halted somewhere during that process.

Validity checking - should be mandatory?
https://lwn.net/Articles/298926/
Posted by BenHutchings on Thu, 18 Sep 2008 02:39:28 +0000:

There are signatures on release files, which include checksums of the package lists, which in turn include checksums of the packages. apt-get verifies the signature and checksums as it downloads files, before calling dpkg. But if you have a hardware problem that corrupts files between memory and disk, then it may well pass a corrupt file to dpkg. There's little that can be done about this in software.

Validity checking - should be mandatory?
https://lwn.net/Articles/298924/
Posted by BenHutchings on Thu, 18 Sep 2008 02:34:05 +0000:

debsums is the usual tool for checking installed package integrity. "apt-get check" just checks the database.

Universal signing of source code and source packages
https://lwn.net/Articles/298568/
Posted by Nelson on Mon, 15 Sep 2008 21:40:15 +0000:

Probably once people stop responding to spam, start encrypting email, stop falling for phishing scams, etc.

Truth be told, I check all the signatures I can, maybe 20% of the time; it's a hassle even to find a public key. Then, once you get a couple thousand keys into GPG, it starts to slow down a bit on some operations. If you have it configured to automatically fetch keys from a key server, it sometimes just hangs for a while as it pulls a new key in and tries to reconcile things.

This seems like an ideal problem for Google or someone similar to help solve. It would be nice to associate a PGP/GPG key with an OpenID, and it would be nice if Gmail could somehow indicate to other Gmail users that we'd like encrypted mail where possible. Someone big needs to step in and give this kind of technology the push it needs.

Universal signing of source code and source packages
https://lwn.net/Articles/298496/
Posted by jreiser on Mon, 15 Sep 2008 16:55:01 +0000:

"building the same package ... different compilers/linkers ... libraries"

In the late 1980s, Apollo Computer (http://en.wikipedia.org/wiki/Apollo_Computer) had DSEE (the Domain Software Engineering Environment), which tracked not only source code but also tools and build scripts. DSEE did guarantee bit-identical outputs, because it tracked, and could regenerate, everything that affected a build. An executable file had exactly one timestamp, in a designated field of the header. (There was no __TIME__.)
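To make the chain BenHutchings describes above concrete, it can be walked by hand. A rough sketch follows; the mirror, distribution, architecture, and package name are placeholder assumptions, not anything taken from the comments:

    # Assumed mirror and distribution; substitute your own.
    MIRROR=http://archive.ubuntu.com/ubuntu
    DIST=hardy

    # 1. The Release file is the thing that is actually signed:
    wget "$MIRROR/dists/$DIST/Release" "$MIRROR/dists/$DIST/Release.gpg"
    gpg --verify Release.gpg Release   # needs the archive key in your keyring

    # 2. The Release file lists checksums for each package list:
    grep 'main/binary-i386/Packages.gz' Release
    wget "$MIRROR/dists/$DIST/main/binary-i386/Packages.gz"
    md5sum Packages.gz                 # must match the entry in Release

    # 3. The package list carries a checksum for every individual .deb:
    zcat Packages.gz | grep -A 20 '^Package: hello$' | grep '^MD5sum:'
    # ...which is what apt compares against each archive it downloads.

The upshot of this design is that a signature on one small Release file transitively covers every package in the archive, which is why the individual .debs don't need to carry signatures of their own.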
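And a quick illustration of the distinction in BenHutchings's other comment above; the package name here is just an example:

    # Recompute checksums of installed files and report any that changed:
    debsums --changed

    # Or check a single package:
    debsums coreutils

    # By contrast, this only checks the package database for broken
    # dependencies; it never reads the installed files at all:
    apt-get check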
Validity checking - should be mandatory?
https://lwn.net/Articles/298471/
Posted by SEMW on Mon, 15 Sep 2008 15:32:56 +0000:

The Debian and Ubuntu versions of Tiger, the security auditor (http://www.nongnu.org/tiger/), have a module for checksumming installed packages and comparing the results to what they should be (which is stored in /var/lib/dpkg/info/).

Worryingly, on my system a good 14 packages fail this test, including several Linux kernel modules (which really shouldn't have changed since I installed them). Synaptic shows them as fine, and "apt-get check" doesn't report any problems; so apparently, yes, the normal tools aren't checking properly.

(Of course, this isn't a valid way to check for rootkits, since anyone with root access could just change the stored checksums to match the compromised packages.)

Validity checking - should be mandatory?
https://lwn.net/Articles/298461/
Posted by epa on Mon, 15 Sep 2008 15:07:31 +0000:

The scarier thing is that if someone MITM'd your HTTP connection, they could change the contents of the packages, and as long as the new data decompressed without error it would silently be installed! At least, that's my deduction from what you said: that dpkg goes as far as decompressing the package without checking any signature. Perhaps it does GPG verification on the uncompressed data and everything is safe after all?

I agree with your main point: get everything downloaded, check the signatures before going any further, and only then start the last-minute validation of dependencies and install the packages. This also protects you from possible buffer overruns in zlib, or in other packaging software, that could be exploited by a malicious .deb file.

Validity checking - should be mandatory?
https://lwn.net/Articles/298451/
Posted by kripkenstein on Mon, 15 Sep 2008 13:00:02 +0000:

It seems to me that proper validity checking should be mandatory when installing packages on any system. In my experience, how this is handled on Ubuntu (and, I guess, Debian) leaves something to be desired; or perhaps I just don't understand the rationale.

I became aware of this while running a machine with some sort of hardware problem that caused large downloaded files to be corrupt occasionally (rarely, but often enough to be a problem). What then happens with large updates in "apt-get upgrade" is that packages are installed until the corrupt package causes dpkg to fail, halting the process with an error about decompression of the file failing. But the previous packages were already installed, sometimes leaving the system in an unusable state.

It seems to me that the simplest approach would be to download all the files, then verify them (MD5 checksums, key signatures, etc.), and abort the whole process with an appropriate error message if even one fails. A more complicated approach would install some packages even if others are corrupt, but only where there is no dependency between them. However, as I said before, perhaps I don't understand the rationale for the current method?
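Something close to the first approach kripkenstein describes can be approximated with stock tools. A sketch, not a claim about how apt actually sequences things internally:

    # Fetch everything first, installing nothing:
    apt-get --download-only upgrade

    # Check that every cached archive at least unpacks cleanly; this
    # would catch a file corrupted on disk after download:
    for f in /var/cache/apt/archives/*.deb; do
        dpkg-deb --fsys-tarfile "$f" >/dev/null || echo "corrupt: $f"
    done

    # Only if nothing was reported, install from the (now complete) cache:
    apt-get upgrade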
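The check SEMW's Tiger module performs can also be reproduced by hand from the same data; the package name below is just an example:

    # dpkg records the checksums each package was shipped with:
    cat /var/lib/dpkg/info/coreutils.md5sums

    # The paths inside are relative to /, so compare from there:
    cd / && md5sum --check --quiet /var/lib/dpkg/info/coreutils.md5sums

    # (As SEMW notes, an attacker with root could simply rewrite
    # these checksum files, so this is no defence against rootkits.)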
Universal signing of source code and source packages
https://lwn.net/Articles/298448/
Posted by AndyBurns on Mon, 15 Sep 2008 12:33:03 +0000:

I doubt that building the same package twice, even on the same machine, would generate exactly the same binary (timestamps would be included via __TIME__ macros), let alone building with different compilers/linkers or with different libraries installed.

Universal signing of source code and source packages
https://lwn.net/Articles/298435/
Posted by epa on Mon, 15 Sep 2008 12:06:27 +0000:

How long will it be until it becomes de rigueur to check signatures on all source code before building it? It's still very common to download a tarball with no signature and blithely type "make". But with version control systems such as git and hg, which compute a secure hash of all content, you won't be downloading foo-1.5.2.tar.gz but instead revision 5891b5b522d5df086d0ff0b110fbd9d21bb4fc7163af34d08286a2e846f6be03.

Even when linking to a source tarball, you can use the convention of putting a checksum in the fragment part of the URI, as in foo-1.5.2.tar.gz#sha256=71573b922a87abc3fd1a957f2cfa09d9e16998567dd878a85e12166112751806. It would be a good idea for all user agents, such as Firefox and wget, to check this, and to have an option to add the #sha256= gunk to the URI automatically when you copy and paste it.

A collaborative checksum system, where "make" automatically uploads checksums to a central server saying "I built source with checksum abc into an object file with checksum xyz", might also catch potential trojaning, or at least spot build errors. (If the source package is designed for Fooboo Linux 5.5, then building it on that system should generate an exact binary package, and it would be interesting to see in which cases it does not.)
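The fragment convention epa proposes is easy to act on by hand even without user-agent support. A sketch, where the URL is made up and the digest is the example one from the comment above:

    # Hypothetical annotated link:
    URL='http://example.org/foo-1.5.2.tar.gz#sha256=71573b922a87abc3fd1a957f2cfa09d9e16998567dd878a85e12166112751806'

    # The fragment is never sent to the server, so strip it before fetching:
    file=${URL%%#*}
    want=${URL##*#sha256=}

    wget "$file"
    got=$(sha256sum "${file##*/}" | cut -d' ' -f1)

    [ "$got" = "$want" ] && echo "checksum OK" || echo "CHECKSUM MISMATCH"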
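AndyBurns's doubt about bit-identical rebuilds is also easy to confirm experimentally; a minimal sketch, assuming gcc is installed:

    # A trivial program whose binary embeds the build time:
    cat > stamp.c <<'EOF'
    #include <stdio.h>
    int main(void) { printf("built at %s\n", __TIME__); return 0; }
    EOF

    gcc -o build1 stamp.c
    sleep 2          # ensure the seconds field of __TIME__ changes
    gcc -o build2 stamp.c

    # Same source, same compiler, same machine, yet:
    cmp build1 build2 || echo "the two binaries differ"

This is essentially the problem DSEE sidestepped by confining the timestamp to a single designated header field, as jreiser describes above.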