Bernstein's Blog
Posted Dec 9, 2025 22:36 UTC (Tue) by chris_se (subscriber, #99706)
In reply to: Bernstein's Blog by muase
Parent article: Disagreements over post-quantum encryption for TLS
> > I cannot fathom the reasoning why anyone would want to standardize a pure PQC algorithm at this point in time. There has been _extensive_ cryptoanalysis on both RSA and EC, and we can be quite confident that they will not be broken by classical computers during any of our lifetimes.
>
> First, because this is true for every "new" algorithm – if we go for this logic, we would still use mandatory RSA+ECC combinations for every handshake.
ECC isn't new; it's just a couple of years younger than RSA. But look at adoption: NIST recommended some ECC curves back in 1999, yet adoption of ECC only really took off in the 2010s (yes, the first TLS standard with ECC appeared in 2006, but it remained extremely niche before ~2010), and OpenSSH only gained ECC support in 2014, for example.
(And most (all?) CAs still use RSA for their signatures today, by the way. LWN's own certificate, for example, is signed with RSA.)
> Or even better, RSA+P256+Curve25519. Or SHA2+SHA3+Blake2. Or AES-GCM+ChachaPoly(+HMAC for additional security).
Your examples don't really fit this discussion, in my eyes. The reason to prefer ECC over RSA is primarily performance and key size. Sure, some newer ECC constructions are harder to mess up in an implementation (getting RSA right, especially w.r.t. padding, is highly non-trivial), but fundamentally RSA and ECC offer security guarantees on the same level (albeit with vastly different key sizes). A hybrid of RSA + ECC therefore doesn't help you: you lose the benefits of ECC for no real gain. With RSA vs. ECC it was always either/or: either you were conservative and used RSA, or you trusted ECC enough to adopt it in its early days.
PQC algorithms are fundamentally different: ideally we want to switch to a scheme that _has_ to include a PQC algorithm as soon as possible, so that traffic recorded today cannot be decrypted by a future quantum computer. Classical-only schemes should disappear as fast as possible. In that scenario a hybrid scheme makes much more sense.
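To illustrate why a hybrid is attractive: the traffic key is derived from both shared secrets, so it stays secure as long as either component holds. A minimal sketch of such a combiner (the function names, labels, and stand-in secrets are mine, not from any standard; real designs such as TLS's X25519MLKEM768 feed both secrets into the handshake key schedule):

```python
import hashlib
import hmac
import os

def hkdf_sha256(salt: bytes, ikm: bytes, info: bytes, length: int = 32) -> bytes:
    # HKDF (RFC 5869) with SHA-256: extract, then expand.
    prk = hmac.new(salt, ikm, hashlib.sha256).digest()
    okm, block, counter = b"", b"", 1
    while len(okm) < length:
        block = hmac.new(prk, block + info + bytes([counter]), hashlib.sha256).digest()
        okm += block
        counter += 1
    return okm[:length]

def hybrid_combine(ecdh_secret: bytes, pqc_secret: bytes, transcript: bytes) -> bytes:
    # Concatenating both shared secrets means the derived key remains
    # secret as long as EITHER component algorithm is unbroken.
    return hkdf_sha256(salt=b"hybrid-kem", ikm=ecdh_secret + pqc_secret, info=transcript)

# Stand-ins for real X25519 / ML-KEM outputs (hypothetical values):
ecdh_ss = os.urandom(32)  # would come from an X25519 exchange
pqc_ss = os.urandom(32)   # would come from an ML-KEM decapsulation
key = hybrid_combine(ecdh_ss, pqc_ss, b"handshake transcript hash")
```

An attacker has to break both the elliptic-curve problem and the lattice problem to recover the key, which is exactly the insurance the hybrid buys.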
(Btw., even with ECC we learned a lot after its initial introduction. The first widely deployed ECC signature scheme was ECDSA, which has the same fatal private-key leak on nonce reuse as DSA if the implementation is not done properly; only later did people propose better schemes that avoid the DSA-style linear combination.)
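That DSA-style leak can be demonstrated with plain modular arithmetic: if two signatures reuse the same nonce k, anyone can solve for k and then for the private key. A toy sketch (the values of d, k, and r are made up; in real ECDSA, r is the x-coordinate of k·G, which doesn't change the algebra):

```python
# Toy demonstration over the group order only: ECDSA signing is scalar
# arithmetic mod n, so nonce reuse can be shown without curve operations.
n = 0xFFFFFFFF00000000FFFFFFFFFFFFFFFFBCE6FAADA7179E84F3B9CAC2FC632551  # P-256 order
d = 0x1234567890ABCDEF  # "private key" (toy value)
k = 0xCAFEBABE          # the reused nonce
r = pow(7, 5, n)        # stand-in for the x-coordinate of k*G

def sign(h):
    # s = k^-1 * (h + r*d) mod n  -- the DSA/ECDSA linear combination
    return (h + r * d) * pow(k, -1, n) % n

h1, h2 = 0x1111, 0x2222
s1, s2 = sign(h1), sign(h2)

# An attacker seeing two signatures with the same r:
#   s1 - s2 = k^-1 * (h1 - h2), so k = (h1 - h2) / (s1 - s2)
k_rec = (h1 - h2) * pow(s1 - s2, -1, n) % n
#   s1 * k = h1 + r*d, so d = (s1*k - h1) / r
d_rec = (s1 * k_rec - h1) * pow(r, -1, n) % n
assert (k_rec, d_rec) == (k, d)  # full private-key recovery
```

Schemes like EdDSA sidestep this by deriving the nonce deterministically from the key and the message.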
> I'm exaggerating here a bit for the sake of the argument; but IMO there's some truth in it, as with that logic, you will never evolve. And evolution and "blessing" of new algorithms is important here, or we'll end up with another SHA3 nobody uses – and nobody will analyze anymore in 5 or 10 years.
Your example contradicts you here: SHA3 has been blessed officially, and still has a very low adoption rate. (Though that is mostly inertia, in my opinion: no new attacks on SHA2 have been discovered, so there's no pressure to switch. Even worse, git is still in the process of migrating away from SHA1, and I see plenty of websites that still provide MD5 hashes of downloads. *shudder*)
> > While there has been a lot of cryptoanalysis done on PQC algorithms especially in the last years, they have not yet been vetted even remotely as well as classical algorithms
> But when has there been enough cryptanalysis for normal use? NTRU has been around since 1996 – that makes almost 30 years for lattice-based cryptography by now. LWE and LWE-class problems have been known since 2005 – that makes 20 years. McEliece has been around since 1978 – that's close to 50(!) years by now, and seven years older than the earliest ECC suggestions.
I think this is misleading. Because the keys and messages these algorithms require are so large, they saw far less practical interest, and hence were not analyzed remotely as closely during that time as e.g. ECC. That has obviously changed since, but age alone is not a good indicator: most cryptanalysis of PQC algorithms has happened in the last 10 years.
> That's one of the reasons why others are even talking about FUD here – ML-KEM has seen _extensive_ cryptanalysis over a long time, and the underlying mathematical problem is old enough to be considered well-known by today's standards. Additionally, due to the nature of the contest, those 7 years have resulted in a really immense amount of concentrated cryptanalysis and mathematical research, all done with today's knowledge and quality standards.
And it's great that so much cryptanalysis has happened on the various PQC algorithms, and I'm very much in favor of pushing them forward. But RSA (and, to a lesser degree, ECC) is _much_ simpler than any PQC algorithm I've seen, so I don't think one should be comfortable relying on PQC alone _at this point in time_.
I do want those algorithms to become parts of standards (so that people are still motivated to analyze them going forward), and in 10 years time I could easily see the standards bodies revisiting this. (Or once a quantum computer exists that can break ECC, if that happens earlier.)
But we already have high-performance, robust, side-channel-free ECC implementations available _right now_; ECC keys are small and the computations are fast. Compared to the key sizes, computation requirements, and message sizes of any PQC algorithm, the ECC overhead is trivial. (A hybrid scheme with RSA would not be quite as clear-cut as one with ECC.)
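To put rough numbers on "trivial": per FIPS 203, an ML-KEM-768 encapsulation key is 1184 bytes and a ciphertext 1088 bytes, while X25519 public shares are 32 bytes each. A quick back-of-the-envelope comparison (the dictionary layout is mine; the byte counts are from the respective specs):

```python
# On-the-wire bytes for one key-exchange round trip:
# client sends its share, server sends its reply.
sizes = {
    "X25519":     {"client_share": 32,   "server_share": 32},    # RFC 7748
    "ML-KEM-768": {"client_share": 1184, "server_share": 1088},  # FIPS 203
}
for name, s in sizes.items():
    total = s["client_share"] + s["server_share"]
    print(f"{name}: {total} bytes per handshake")

# Adding X25519 on top of an ML-KEM-768 exchange costs 64 extra bytes,
# roughly a 3% increase over the PQC traffic alone.
```

So dropping the classical component from a hybrid saves almost nothing on the wire, which is why the cost argument for pure ML-KEM is weak.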
