
OpenPGP certificate flooding

By Jake Edge
July 2, 2019

A problem with the way that OpenPGP public-key certificates are handled by key servers and applications is wreaking some havoc, but not just for those who own the certificates (and keys)—anyone who has those keys on their keyring and does regular updates will be affected. It is effectively a denial of service attack, but one that propagates differently than most others. The mechanism of this "certificate flooding" is one that is normally used to add attestations to the key owner's identity (also known as "signing the key"), but because of the way most key servers work, it can be used to fill a certificate with "spam"—with far-reaching effects.

The problems have been known for many years, but they were graphically illustrated by attacks on the keys of two well-known members of the OpenPGP community, Daniel Kahn Gillmor ("dkg") and Robert J. Hansen ("rjh"), in late June. Gillmor first reported the attack on his blog. It turned out that someone had added multiple bogus certifications (or attestations) to his public key in the SKS key server pool; an additional 55,000 certifications were added, bloating his key to 17MB in size. Hansen's key got spammed even worse, with nearly 150,000 certifications—the maximum number that the OpenPGP protocol will support.
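Some back-of-the-envelope arithmetic puts the per-certification cost in perspective. The 17MB and 55,000 figures come from Gillmor's report; the average below is an inference, since real certification packets vary in size:

```python
# Rough arithmetic on the reported flooding of Gillmor's key (the average is
# an estimate; actual OpenPGP signature packets are not all the same size).
total_bytes = 17 * 1024 * 1024   # key bloated to roughly 17MB
extra_certs = 55_000             # bogus certifications added

avg_cert_size = total_bytes / extra_certs
print(round(avg_cert_size))      # a few hundred bytes per certification
```

A few hundred bytes per certification is tiny on its own; the damage comes entirely from the volume an attacker can attach.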

The idea behind these certifications is to support the "web of trust". If user Alice believes that a particular key for user Bob is valid (because, for example, they sat down over beers and verified that), Alice can so attest by adding a certification to Bob's key. Now if other users who trust Alice come across Bob's key, they can be reasonably sure that the key is Bob's because Alice (cryptographically) said so. That is the essence of the web of trust, though in practice, it is often not really used to do that kind of verification outside of highly technical communities. In addition, anyone can add a certification, whether they know the identity of the key holder or not.
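The basic validity rule can be modeled as a conceptual sketch (toy data and names, not real OpenPGP semantics):

```python
# Toy model of web-of-trust validity: a key is considered valid if someone
# we fully trust has certified it. Real OpenPGP adds trust levels, depths,
# and signature verification on top of this idea.
certifications = {            # signer -> set of keys they have certified
    "Alice": {"Bob"},
}
fully_trusted = {"Alice"}     # introducers whose certifications we accept

def is_valid(key):
    return any(key in certifications.get(t, set()) for t in fully_trusted)

print(is_valid("Bob"))        # Alice certified Bob, and we trust Alice
print(is_valid("Mallory"))    # nobody we trust certified Mallory
```

Note that nothing in this model stops "Mallory" from *issuing* certifications; they simply carry no weight unless someone trusts her, which is exactly why key servers accepting arbitrary certifications became a spam vector.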

The problem with the keys for Gillmor and Hansen that are stored in the SKS key-server network is that GNU Privacy Guard (GnuPG or GPG), the most widely used OpenPGP implementation, cannot even import the certificates into its newer keybox (.kbx) format due to the number of certifications they have. But using the older format (.gpg) leads to performance problems for any operation that uses the key. That causes failures in the Enigmail Thunderbird add-on, makes Git take minutes to PGP-sign a tag, makes Monkeysphere authentication consume an enormous amount of CPU, and probably breaks other things as well, Gillmor said.

All of that is a pain for him, but it is worse than that. Anyone who has his key on their keyring will run into problems if they practice good "key hygiene" by periodically updating the keys on their keyring from the SKS servers. That will pick up key revocations, new subkeys, and the like, so their keyring will be up to date. But if they have Gillmor's or Hansen's keys on their keyrings and use the older format, they will hit strange and hard-to-diagnose problems. If they use the newer format, they won't get updates for those keys, which may lead to other problems.

As Hansen details in his post on the subject, the problem can be traced to a decision made in the early 1990s about how key servers should function. There was concern that repressive governments would try to "coerce" key-server operators into substituting keys for ones controlled by the government. In order to combat that, key servers would not delete any information, so certificates and any information about them (e.g. certifications) would live forever. Multiple key servers were run in widely disparate locations and information would be synchronized between them regularly. If a key server did remove or modify a certificate, that would be recognized and the previous state would be restored.
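The append-only, union-style synchronization can be sketched as follows (a hypothetical simplification; the real SKS reconciliation algorithm is considerably more sophisticated):

```python
# Toy model of SKS-style reconciliation: servers exchange differences and
# each keeps the union, so locally deleted data reappears after the next sync.
def reconcile(a, b):
    union = a | b
    return set(union), set(union)

server_a = {"cert1", "spam1"}
server_b = {"cert1", "spam1"}    # already synchronized with server_a

server_a.discard("spam1")        # operator A removes the spam locally...
server_a, server_b = reconcile(server_a, server_b)
print("spam1" in server_a)       # ...and the next sync puts it right back
```

This is the property that made the network robust against coerced deletion, and it is the same property that makes certificate spam effectively permanent.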

That well-intentioned choice was fine for the time, but that time has passed. Unfortunately, according to Hansen, the SKS software is effectively unmaintained, in part because it was written in an idiosyncratic dialect of OCaml, but also because it was created as a proof of concept for a Ph.D. thesis:

[...] for software that's deployed in the field it makes maintenance quite difficult. Not only do we need to be bright enough to understand an algorithm that's literally someone's Ph.D thesis, but we need expertise in obscure programming languages and strange programming customs.

He noted that the idea of the immutability of certificate information is wired deeply into the guts of SKS. Changing that would be a much bigger job than simply fixing a bug or adding a small feature. He recommended that high-risk users not use the SKS key server network any longer. Both he and Gillmor pointed out that the new, experimental keys.openpgp.org key server is resistant to the certificate flooding attack. The keys.openpgp.org FAQ explains:

A "third party signature" is a signature on a key that was made by some other key. Most commonly, those are the signatures produced when "signing someone's key", which are the basis for the "Web of Trust". For a number of reasons, those signatures are not currently distributed via keys.openpgp.org.

The killer reason is spam. Third party signatures allow attaching arbitrary data to anyone's key, and nothing stops a malicious user from attaching so many megabytes of bloat to a key that it becomes practically unusable. Even worse, they could attach offensive or illegal content.

The keys.openpgp.org server is also set up to separate the identity and non-identity information of certificates. It will not distribute identity information (i.e. "user IDs" that include a name and email address) unless the owner verifies the email address. Meanwhile, the non-identity information (the key material and metadata) will be stored and distributed freely, though without the certifications. GnuPG and other OpenPGP software can refresh the key material regularly from the server for keys they already have on their keyrings, even if the owner has not consented to the distribution of the key's user ID information.
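That policy can be expressed as a rough sketch (the structure and names here are illustrative only, not Hagrid's actual code or API):

```python
# Hypothetical sketch of the keys.openpgp.org distribution policy described
# above: key material is always served, user IDs only after email
# verification, and third-party certifications never.
def published_view(cert, email_verified):
    view = {"key_material": cert["key_material"]}    # always distributed
    if email_verified:
        view["user_ids"] = cert["user_ids"]          # gated on verification
    # cert["third_party_sigs"] is dropped entirely: no flooding vector
    return view

cert = {"key_material": "pubkey...",
        "user_ids": ["Alice <a@example.org>"],
        "third_party_sigs": ["spam"] * 55_000}
print(published_view(cert, email_verified=False))    # key material only
```

Because the published view never includes third-party signatures, an attacker's flood never reaches anyone refreshing keys from the server.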

The News entry for keys.openpgp.org mentions some of the reasons behind starting a new key server. It noted that the SKS key server suffered from a number of problems with abuse, performance, and privacy (including GDPR compliance). Beyond that, SKS is not really being developed any longer as Hansen also pointed out. So keys.openpgp.org is based on a completely new key server, Hagrid, written in Rust using the Sequoia OpenPGP library. The plan is to eventually add more servers to create a pool, but it will not be an open federation model like SKS due to privacy and reliability concerns.

Gillmor's first post has some pointers on other ways to handle certificate flooding, including a link to an internet draft he created back in April for "Abuse-Resistant OpenPGP Keystores". His followup blog post reiterates many of those options as part of an effort to look at the impact of certificate flooding on the community. Hansen was rather more blunt in his post and in a followup that described some of the fallout he sees from the attack. He is particularly angry that advice he gave to Venezuelan activists and others to check his signature on documents may now lead them to effectively break their GnuPG installation.

Both Gillmor and Hansen are obviously distressed, personally, over the fact that their keys are affected by all of this. OpenPGP is already hard enough to use that adding unexpected hurdles impacts other people in ugly ways. As Gillmor put it:

But from several conversations i've had over the last 24 hours, i know personally at least a half-dozen different people who i personally know have lost hours of work, being stymied by the failing tools, some of that time spent confused and anxious and frustrated. Some of them thought they might have lost access to their encrypted e-mail messages entirely. Others were struggling to wrestle a suddenly non-responsive machine back into order. These are all good people doing other interesting work that I want to succeed, and I can't give them those hours back, or relieve them of that stress retroactively.

Other flooded keys have been found; in an LWN comment, Gillmor said that Tor project keys have been spammed. More keys undoubtedly will be flooded and the nature of the SKS key servers makes them a permanent part of the key store. Illegal content could perhaps be attached as certifications; those would also be "impossible" to remove. It is a little hard to see the SKS network surviving this incident in its current form—that may be for the best, in truth.

Daniel Lange has some suggestions on how to easily clean up these spammed keys, though it takes a huge amount of CPU time (he reports 1h45m for Hansen's key). That is rather distressing in its own right; as Filippo Valsorda put it:

Someone added a few thousand entries to a list that lets anyone append to it.

GnuPG, software supposed to defeat state actors, suddenly takes minutes to process entries.

How big is that list you ask? 17 MiB. Not GiB, 17 MiB. Like a large picture.

Clearly GnuPG needs a fix for that; Gillmor filed a bug to that effect. He also filed bugs in other components that failed, which he links from his post.

But apparently the problems with the SKS key server have been known for a long time. It is a bit reminiscent of the state of OpenSSL development pre-Heartbleed; SKS was largely unmaintained, chugging along in the background but not really having much attention paid to it. Those "in the know" were aware of its flaws, but did not have the resources to fix them. As with Heartbleed, it makes one wonder what other projects are out there, seemingly humming along, when, in reality, that hum may be the quiet ticking of a time bomb.


Index entries for this article
Security: Encryption/Key management
Security: Vulnerabilities/Denial of service



I'm having difficulty understanding

Posted Jul 2, 2019 20:19 UTC (Tue) by ms-tg (subscriber, #89231) [Link] (15 responses)

I'm having difficulty understanding how both of these things can be true at the same time:

1. Multiple oppressive state actors have been frustrated over the past 10 years by dissidents using GnuPG to encrypt communications

2. Glaring holes existed in the GnuPG ecosystem, well known for the past 10 years, such that the slightest poking can take down individual account setups, or even the entire key server network.

Can anyone help me see how both of these things can be true? One guess I'm considering is that the SKS key servers and the web-of-trust attestation model are actually very rarely used. Therefore it is possible that dissidents under oppressive regimes have, in fact, been communicating successfully using GnuPG keys they exchanged via other means, and never contacting key servers?

I'm having difficulty understanding

Posted Jul 2, 2019 20:34 UTC (Tue) by AngryChris (guest, #74783) [Link]

1. Governments complain that they can't read the encrypted communications.
2. The problem described in the article does not help with reading the communications; it is only good for griefing people.

That's how both statements are true at the same time.

1. Encrypted communications still can't be intercepted.
2. People can be griefed/trolled/etc.

I'm having difficulty understanding

Posted Jul 2, 2019 21:17 UTC (Tue) by vadim (subscriber, #35271) [Link] (4 responses)

Yup. The WoT model is nigh unusable.

First, let's say you're a developer for CoolDistro. At some point you meet ExperiencedDev, who after a round of introductions signs your key. You sign theirs. You get the keyring for CoolDistro, and since ExperiencedDev signed pretty much everyone's key, you're pretty much set. No keyservers in sight.

But let's say you're a random guy, you went to FOSDEM and signed more than a hundred keys. You're well connected now. And you want to verify the key on the Tor browser. What do you do?

1. You get the signature, you try to verify it, and gpg tells you: "nope, this isn't trusted!". You don't have the signing key. Damn.
2. Okay, there's gpg --recv-keys, right? Nope, there's no default keyserver.
3. You google for, and configure gpg to use a damn keyserver. You get the key. Still no dice. The key isn't trusted, because you didn't sign that one.
4. You figure that somebody you met, at some point, must have signed it. Time to make sure you have every key you signed, and that you refreshed them from the keyserver. Now it should work. But it still doesn't.
5a. GPG's trust model works well for Alice -> Bob -> Carol chains. But it can actually extend further. What if you want to check an Alice -> Bob -> Carol -> Dave chain? Well, you're stuck. You might know who you signed, but if you don't know Carol, how do you find out you need her key? Here you can resort to the desperate measure of recursively downloading the key of everyone you signed, plus every key *they* signed, hoping it might help. Happy scripting! And dealing with a key database of several thousand people, the vast majority of whom you've never met.
5b. You might actually know about https://pgp.cs.uu.nl/, which some random guy runs and which might vanish at any moment, and which doesn't cover every key in existence. But it helps a lot.
7. What? It still doesn't work!?. GPG isn't happy with just signatures, you need to tell it how much you trust that key. Do gpg --update-trustdb and fill in the missing info. Things get fuzzy here. How much do you trust guy #53 of 120 you met at FOSDEM? Do you remember how well you checked his ID? Uhh... You get that done, and finally, that WORKS.
8. Time to celebrate with a drink of your choice.

Why, nice and easy, isn't it? It'll only take meeting a lot of people face to face, laboriously signing a hundred keys, then spending a day or two figuring out the above, and you can finally make some use of this web of trust. If you're technically minded enough to understand all that stuff, and have the patience for it.

I'd be willing to bet that the number of people who've done the above isn't very large. I've done it, and it actually worked, but boy is it annoying. You need to be a pretty special kind of person to put up with that.
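The chain-discovery problem described in the steps above is essentially graph search over signatures you have already fetched (a toy model with made-up data, not GnuPG's actual trust computation):

```python
# Toy model of finding a certification path: edges are "who signed whom",
# and in practice you can only search edges whose keys you've downloaded.
from collections import deque

signed = {   # signer -> keys they signed
    "You": ["Bob"], "Bob": ["Carol"], "Carol": ["Dave"],
}

def certification_path(start, target):
    queue, seen = deque([[start]]), {start}
    while queue:
        path = queue.popleft()
        for nxt in signed.get(path[-1], []):
            if nxt == target:
                return path + [nxt]
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None   # no chain found with the keys fetched so far

print(certification_path("You", "Dave"))
```

The search itself is trivial; the pain described above is that the `signed` map is never available up front and has to be assembled by fetching keys one at a time.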

PS: While writing this comment I tried to get the Tor Browser's current key to make sure I wasn't missing anything. It's got "100304 new signatures" on it. Wheee.

Trust

Posted Jul 3, 2019 8:17 UTC (Wed) by tialaramex (subscriber, #21167) [Link] (3 responses)

That step with "trust" you got wrong, and I'm bothering to point this out because it's illustrative of why the Web of Trust is flawed regardless of whether you're using carefully engineered software or some scripts thrown together in the 1990s by enthusiasts.

"How much do you trust guy #53 of 120 you met at FOSDEM? Do you remember how well you checked his ID?"

Nope. The system is asking how much you _trust_ them, not whether you're sure they are who they say they are.

I am 100% certain my mother is my mother, but I wouldn't trust her as far as I can throw her. And this is where the WoT breaks down. Your trust metric must reflect your confidence that these people will do their part correctly in the WoT, but even conscientious users often don't understand how to do their part correctly, so realistically almost everybody's "trust" indication for almost everybody should be zero. At which point it's not a "web" it's just a bunch of unconnected points.

This is why things like Signal don't bother trying to invent a technologically complex way to "verify" other participants. Technology is used to make the only provided way _easy_ but not to try to conjure trust where it doesn't really exist. You make your own decisions on how to label other participants in a direct conversation and whether to consider you've "verified" they are who you thought they are. It's exactly as happy with you deciding you've verified "Deep Throat (gov insider?)" and not "Sally Anne Jenkins of Ohio" as vice versa. It doesn't mean anything to me that Sally claims to have verified Deep Throat, or that Deep Throat claims to have verified Sally.

Trust

Posted Jul 3, 2019 11:04 UTC (Wed) by vadim (subscriber, #35271) [Link] (2 responses)

Good point, that wasn't correct.

The problem there though is that there's no way to avoid trust in the situation. So let's get back to the Tor browser.

Option A: Personally reviewing the entire source code. Not really practical.

Option B: Personally knowing the developers, and having a personal deep familiarity with their work. Available to a few people at most.

Option C: Trusting some random organization to certify the signing key, because your browser trusts them, which you trust mostly because you hope you didn't obtain a compromised version.

Option D: Trusting the WoT to certify the signing key.

Signal's model isn't going to work here. I've been at a talk by Roger Dingledine at FOSDEM, which filled most of Janson (the big auditorium). There's no way the poor guy is going to sit there signing the keys of all those people: https://external-preview.redd.it/Tcci3W9pcAD5FuRyCiMhl9vP...

And if he did, he'd still only reach a minuscule amount of people. You absolutely need an intermediary.

As far as I know it comes down to either CAs or the WoT in the end. CAs are very vulnerable to government interference. And the WoT only works so long as everyone is being very careful, which many people aren't.

I think in the end you have to compromise and to make some assumptions. Either hope that efforts like certificate transparency are enough to patch up the deficiencies of CAs, or hope your usage of the WoT doesn't have any obvious issues with it.

Trust

Posted Jul 3, 2019 13:10 UTC (Wed) by Kamiccolo (subscriber, #95159) [Link] (1 responses)

Your reddit link:
Error 403 Forbidden

Trust

Posted Jul 3, 2019 14:03 UTC (Wed) by vadim (subscriber, #35271) [Link]

Ah, not important. Just a picture of a lot of people.

I'm having difficulty understanding

Posted Jul 2, 2019 23:58 UTC (Tue) by thestinger (guest, #91827) [Link] (6 responses)

> Glaring holes existed in the GnuPG ecosystem, well known for the past 10 years, such that the slightest poking can take down individual account setups, or even the entire key server network.

It should be noted that these trivial denial of service vectors also exist when importing keys manually. GPG exposes a bunch of attack surface when importing a key, and it's not at all hardened. There are assorted trivial ways to permanently brick the keyring including null pointer dereferences. Of course, denial of service isn't the worst that could happen, but that's less trivial to do than leaving out a field and causing a null pointer dereference. The keyservers are aggravating this issue, but you do need to be wary about importing any untrusted public key with GPG. That's a major issue. It shouldn't just be for communicating with people you strongly trust, since then you can't use encryption as a default.

> Can anyone help me see how both of these things can be true? One guess I'm considering is that actually the SKS key servers and the web-of-trust attestation model are actually very rarely used. Therefore it is possible that dissidents under oppressive regimes have, in fact, been communicating successfully using GnuPG keys they exchanged via other means, and never contacting key servers?

I think you're wrong to assume that there's any substantial usage of GPG by dissidents. Modern end-to-end encrypted messaging apps have forward secrecy (crucial) and are far more usable.

The GPG keyring and web of trust are a failed approach, rarely if ever what people actually want and yet it essentially forces you into using it. It requires you to import keys into a keyring. It's not directly usable for even simple cases like verifying the signature of a file with a specific key. It can do it, but you need to create a dedicated keyring unless you're going to check the output and make sure the right key was used. This extends to the other use cases for it. It's a highly problematic design with poor usability.

The approach that's actually used in practice is verifying keys yourself, not relying on other people to do it, and without the complex keyring / web of trust getting in the way.

I'm having difficulty understanding

Posted Jul 3, 2019 8:32 UTC (Wed) by dd9jn (✭ supporter ✭, #4459) [Link] (2 responses)

Please be so kind and file a bug report or tell us here if you happen to know a case where an import or key access leads to a NULL-ptr deref or any other way to "brick your keyring". To mitigate DoS we have several limits implemented, like max. size of a user ID or an overall keyblock size of 5MiB (which used to allow the import of the largest actively used keyblock at that time).

DoS is a common problem with all PKIs, be it OpenPGP (we have been lucky in the past) or X.509 (for example try to use the cacert CRL). There is unfortunately not much we can do about it unless you want to turn OpenPGP into a centralized system with gatekeeper who cares about anti-spam measures.

I'm having difficulty understanding

Posted Jul 4, 2019 22:31 UTC (Thu) by marcH (subscriber, #57642) [Link]

> Please be so kind and file a bug report or tell us here if you happen to know a case where an import or key access leads to a NULL-ptr deref or any other way to "brick your keyring".

If you care about security maybe you should start looking at alternatives to a 50-year-old language with built-in memory corruption features. It doesn't even have to be OCaml.

I'm having difficulty understanding

Posted Jul 10, 2019 2:20 UTC (Wed) by riking (guest, #95706) [Link]

> not much we can do about it

Have you considered running a fuzzer and fixing the bugs the fuzzer finds? That's how the mentioned nullptr bug was found.

I'm having difficulty understanding

Posted Jul 3, 2019 18:01 UTC (Wed) by NYKevin (subscriber, #129325) [Link] (1 responses)

If it is dereferencing NULL pointers, might it also be dereferencing non-NULL but invalid pointers? If so, there's probably an RCE vuln in there somewhere... I wonder how long it would take a determined nation-state attacker to backdoor everyone's boxes with booby-trapped keys?

I'm having difficulty understanding

Posted Jul 3, 2019 18:32 UTC (Wed) by dd9jn (✭ supporter ✭, #4459) [Link]

Derefing a NULL-ptr happens far more often than derefing an invalid pointer. This is the reason why practically all platforms do not map the page with address 0x0. For example

if (!a)
die ("something is wrong with %s\n", a->name);

is an obvious bug in the diagnostic but no real harm is done. Needs to be fixed of course.

I'm having difficulty understanding

Posted Jul 4, 2019 8:26 UTC (Thu) by madhatter (subscriber, #4665) [Link]

While we're dogpiling on how broken the web of trust is, note that Werner Koch is on record in these pages saying the same thing, nearly two years ago:

The problem is systemic: the web of trust, he feels, is inherently broken. It is only explicable to geeks, and not to all of them, it publishes a global social graph, because signatures on keys imply physical meetings on known dates, and it doesn't scale. His preference for general public key handling is Trust On First Use (TOFU).

Disclaimer: I wrote the linked article.

I'm having difficulty understanding

Posted Jul 3, 2019 0:06 UTC (Wed) by KaiRo (subscriber, #1987) [Link]

1. The state actors have not been nearly as frustrated as we thought and either found other ways to get their info or are less interested than we thought in targets that use PGP/GPG.
2. State actors are less interested in bringing down the system than decrypting data of specific targets - which won't be helped by DoS attacks but will be helped by storing the data and waiting for quantum computers to break public/private key encryption itself.

Just a guess.

I'm having difficulty understanding

Posted Jul 3, 2019 3:34 UTC (Wed) by mjg59 (subscriber, #23239) [Link]

Either:

1) PGP isn't a meaningful problem in terms of governments' abilities to monitor what dissidents are doing, or:
2) Dissidents aren't being put in danger by the keyservers being poisoned

(Or, of course, this attack /is/ being carried out by a government)

OpenPGP certificate flooding

Posted Jul 3, 2019 8:40 UTC (Wed) by dd9jn (✭ supporter ✭, #4459) [Link]

The more relevant bug report is https://dev.gnupg.org/T4591 .

Checking a 100k of signatures is obviously a performance problem and we can't do much about it. Unless we ignore key signatures and the WoT. I am all in favor of this but there is still a large crowd who favors the WoT and the (formerly) easy access via keyservers.

OpenPGP certificate flooding

Posted Jul 3, 2019 11:27 UTC (Wed) by pabs (subscriber, #43278) [Link] (5 responses)

For supporting the WoT while blocking spam, could the keyservers not simply require that anyone adding new signatures to a key prove that they own the corresponding private key (or even a subkey)? One bonus of this method would be that bogus signatures from people you have never exchanged signatures with, but who naively signed your key anyway, get blocked. I can only think of one downside: it might be slightly more annoying for folks who have offline master keys. Seems worth the cost to me.

OpenPGP certificate flooding

Posted Jul 3, 2019 20:10 UTC (Wed) by ttelford (guest, #44176) [Link] (3 responses)

What does owning the private key to the new signature prove?

Yarrow, Fortuna, or Haveged produce arbitrary amounts of "entropy" in negligible amounts of time; it's trivial to create a new bogus key pair.

You just generate a bogus key, sign/spam, upload, toss the bogus key, repeat.

We wind up with the same spam problem with extra steps -- which our machine slaves won't even blink an eye at.

OpenPGP certificate flooding

Posted Jul 3, 2019 20:20 UTC (Wed) by vadim (subscriber, #35271) [Link] (2 responses)

I think the idea is proving you own the private key to the key being signed. So for instance, only the owner of the Tor key would be able to send updated versions of it to keyservers.

OpenPGP certificate flooding

Posted Jul 4, 2019 0:18 UTC (Thu) by pabs (subscriber, #43278) [Link] (1 responses)

Right, it would be pretty pointless for anyone other than the holder of the key being signed to be able to upload signatures.

For context; the proper way to get signatures on your OpenPGP key is that signers use caff or similar to send email containing signatures to the UIDs on your key to verify that the key holder also owns the email addresses. On receiving the emails the key holder imports the signatures and forwards their key to the keyserver network.

So my proposal fits into the proper workflow for obtaining and distributing signatures (that most communities use) and as a bonus eliminates both spam signatures and improperly distributed signatures that haven't verified UID control or haven't even verified fingerprints. Of course the signer and key holder could work around this using other, more manual transports, but hopefully those would be deprecated in all the tools surrounding signing.

OpenPGP certificate flooding

Posted Jul 8, 2019 16:26 UTC (Mon) by ttelford (guest, #44176) [Link]

Now it makes sense to me.

My naive thought is that it would be along the lines of:
1. Alice uploads her public key
2. Bob signs Alice's public key
3. For Bob's signature to be valid, Alice has to sign (or have already signed) Bob's key in her local keychain
4. Alice uploads the new (signed) public key, and Bob gets a copy of his public key signed by Alice.
5. Bob receives his public key (signed by Alice), and can (in turn) upload his public key (which is signed by Alice).

Though I'm sure there's a better idea than that...
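The flow above amounts to a keyserver that only accepts third-party signatures submitted by the holder of the key being signed. A minimal sketch (hypothetical keyserver logic, not any real implementation; the submitter check stands in for a real cryptographic proof):

```python
# Toy model of a holder-gated keyserver: a third-party signature on a key is
# accepted only when the holder of that key submits it. In a real system,
# "submitted_by" would be a cryptographic proof of the target's private key.
def accept_signature(keyserver, target_key, signature, submitted_by):
    if submitted_by != target_key:
        return False          # reject: uploader doesn't control the target key
    keyserver.setdefault(target_key, []).append(signature)
    return True

server = {}
print(accept_signature(server, "alice", "sig-by-bob", submitted_by="alice"))
print(accept_signature(server, "alice", "spam-sig", submitted_by="mallory"))
print(server)   # only the holder-submitted signature was stored
```

With this gate in place, flooding would require compromising the target's own key; a spammer's freshly generated throwaway keys could only flood themselves.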

OpenPGP certificate flooding

Posted Jul 4, 2019 1:35 UTC (Thu) by dkg (subscriber, #55359) [Link]

I agree with pabs here, this is the only sensible way to permit distribution of third-party certifications, but the devil is in the details.

Several months ago, I outlined a way to do that in an attempt to spur public discussion. It's not set in stone, and indeed, there is a proposal to amend it. It will take a bit more work to get the specification right, but the real work will be in the tooling to make it possible for normal humans to do exactly the right multi-party, serialized dance necessary to make something that an abuse-resistant keystore can feel confident in redistributing.

This work is not just crypto or RFC 4880 packet parsing/generating work -- that's the easy bit. The hard stuff is thinking about user experience. What is the smoothest way to present these options to the user so that they know what they're doing, without having to know all the gory details?

I don't have the bandwidth to develop that tooling myself right now. But if anyone wants to work on building it out and thinking about the user experience for that process, i'd be happy to act as a sounding board/tester/bug reporter.



Copyright © 2019, Eklektix, Inc.
This article may be redistributed under the terms of the Creative Commons CC BY-SA 4.0 license
Comments and public postings are copyrighted by their creators.
Linux is a registered trademark of Linus Torvalds