Bernstein's Blog
Posted Dec 9, 2025 21:31 UTC (Tue) by muase (subscriber, #178466)In reply to: Bernstein's Blog by chris_se
Parent article: Disagreements over post-quantum encryption for TLS
However, I'm not too big of a fan of this argument:
> I cannot fathom the reasoning why anyone would want to standardize a pure PQC algorithm at this point in time. There has been _extensive_ cryptoanalysis on both RSA and EC, and we can be quite confident that they will not be broken by classical computers during any of our lifetimes.
First, because this is true for every "new" algorithm – if we follow this logic, we would still mandate RSA+ECC combinations for every handshake. Or even better, RSA+P256+Curve25519. Or SHA2+SHA3+Blake2. Or AES-GCM+ChaCha20-Poly1305 (+HMAC for additional security).
I'm exaggerating a bit for the sake of argument, but IMO there's some truth to it: with that logic, you will never evolve. And evolving and "blessing" new algorithms is important here, or we'll end up with another SHA3 that nobody uses – and that nobody will analyze anymore in 5 or 10 years.
> While there has been a lot of cryptoanalysis done on PQC algorithms especially in the last years, they have not yet been vetted even remotely as well as classical algorithms
But when has there been enough cryptanalysis for normal use? NTRU has been around since 1996 – that makes almost 30 years for lattice-based cryptography by now. LWE and LWE-class problems have been known since 2005, which makes 20 years. McEliece has been around since 1978 – that's close to 50(!) years by now, and it is seven years older than the earliest ECC proposals.
None of the PQC winners are new – the PQC contest started in 2017, which is when the first iteration of Kyber (now ML-KEM) was submitted, i.e. 8 years ago. And it is important to point out that even back then, Kyber wasn't a new idea, but a fairly direct evolution of 2005's LWE.
> Look at the amount of PQC algorithm candidates proposed by extremely knowledgeable and intelligent people that have since been broken to such an extent that they aren't an obstacle to even a classical computer
Yes, but that leaves us with two data points. First, _new_ mathematical problems are risky – that's true, but it does not apply to lattices/LWE. Second, there has been a ton of cryptanalysis – enough to find the problems with SIKE (probably more than for the entire TLS 1.3 construction).
That's one of the reasons why others are even talking about FUD here – ML-KEM has seen _extensive_ cryptanalysis over a long time, and the underlying mathematical problem is old enough to be considered well-known by today's standards. Additionally, due to the nature of the contest, those 8 years have resulted in a truly immense amount of concentrated cryptanalysis and mathematical research, all done with today's knowledge and quality standards.
Does that guarantee that there isn't an undiscovered mathematical weakness? Definitely not. But it has seen enough analysis that we can consider one reasonably unlikely – so unlikely that, with today's knowledge, we can consider it secure for normal use. Like I said, a combined scheme is a reasonable default; but there's really no good objective/scientific reason to aggressively object to a fully optional, non-combined mode like what happened here.
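To make the "combined scheme" idea concrete: the usual construction feeds the classical (e.g. X25519) shared secret and the PQC (e.g. ML-KEM) shared secret together through a KDF, so the output stays unpredictable as long as *either* component is unbroken. This is a minimal illustrative sketch, not the actual TLS construction; the labels and parameters here are made up for the example.

```python
# Hedged sketch of a hybrid KEM combiner: concatenate both shared
# secrets and run them through HKDF (RFC 5869 style, SHA-256).
# An attacker must recover BOTH input secrets to predict the output.
import hashlib
import hmac


def hkdf_extract(salt: bytes, ikm: bytes) -> bytes:
    # HKDF-Extract: PRK = HMAC-Hash(salt, input keying material)
    return hmac.new(salt, ikm, hashlib.sha256).digest()


def hkdf_expand(prk: bytes, info: bytes, length: int) -> bytes:
    # HKDF-Expand: iterate HMAC blocks until `length` bytes are produced
    out, block, counter = b"", b"", 1
    while len(out) < length:
        block = hmac.new(prk, block + info + bytes([counter]),
                         hashlib.sha256).digest()
        out += block
        counter += 1
    return out[:length]


def combine_shared_secrets(ecdh_secret: bytes, pq_secret: bytes) -> bytes:
    # Hypothetical labels – real protocols pin down salt/info strings
    # in their specification.
    prk = hkdf_extract(salt=b"hybrid-kdf-example",
                       ikm=ecdh_secret + pq_secret)
    return hkdf_expand(prk, info=b"example handshake secret", length=32)
```

The design point is simply that the KDF binds both inputs: breaking the lattice assumption alone (or the elliptic-curve assumption alone) leaves the attacker with an unknown input to the HMAC chain.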
