
Bernstein's Blog

Posted Dec 9, 2025 21:31 UTC (Tue) by muase (subscriber, #178466)
In reply to: Bernstein's Blog by chris_se
Parent article: Disagreements over post-quantum encryption for TLS

I agree that there's nothing wrong with a combined scheme; IMO it's a very reasonable default.

However, I'm not too big of a fan of this argument:
> I cannot fathom the reasoning why anyone would want to standardize a pure PQC algorithm at this point in time. There has been _extensive_ cryptanalysis on both RSA and EC, and we can be quite confident that they will not be broken by classical computers during any of our lifetimes.

First, because this is true for every "new" algorithm – if we go for this logic, we would still use mandatory RSA+ECC combinations for every handshake. Or even better, RSA+P256+Curve25519. Or SHA2+SHA3+Blake2. Or AES-GCM+ChachaPoly(+HMAC for additional security).

I'm exaggerating here a bit for the sake of the argument; but IMO there's some truth in it, as with that logic, you will never evolve. And evolution and "blessing" of new algorithms is important here, or we'll end up with another SHA3 nobody uses – and nobody will analyze anymore in 5 or 10 years.

> While there has been a lot of cryptanalysis done on PQC algorithms, especially in the last years, they have not yet been vetted even remotely as well as classical algorithms

But when has enough cryptanalysis been done for normal use? NTRU has been around since 1996 – that makes almost 30 years for lattice-based cryptography by now. LWE and LWE-class problems have been known since 2005, which makes 20 years. McEliece has been around since 1978 – that's close to 50(!) years by now, and seven years older than the earliest ECC suggestions.

The PQC winners are not new either – the PQC contest started in 2017, which is when the first iteration of Kyber (now ML-KEM) was submitted, i.e. 8 years ago. And it is important to point out that even back then, Kyber wasn't a new idea, but a rather straight evolution of 2005's LWE.

> Look at the amount of PQC algorithm candidates proposed by extremely knowledgeable and intelligent people that have since been broken to such an extent that they aren't an obstacle to even a classical computer

Yes, but that leaves us with two data points. First, _new_ mathematical problems are risky – that's true, but it does not apply to lattices/LWE. Second, there has been a ton of cryptanalysis – enough to find the problems with SIKE (probably more than for the entire TLS 1.3 construction).

That's one of the reasons why others are even talking about FUD here – ML-KEM has seen _extensive_ cryptanalysis over a long time, and the underlying mathematical problem is old enough to be considered well-known by today's standards. Additionally, due to the nature of the contest, those 7 years have resulted in a really immense amount of concentrated cryptanalysis and mathematical research, all done with today's knowledge and quality standards.

Does that guarantee that there isn't an undiscovered mathematical weakness? Definitely not. But it has seen enough analysis that we can consider it reasonably unlikely; so unlikely that – with today's knowledge – we can consider it secure for normal use. Like I said, a combined scheme is a reasonable default; but there's really no good objective/scientific reason to aggressively object to a fully optional, non-combined mode the way it happened here.



Bernstein's Blog

Posted Dec 9, 2025 22:36 UTC (Tue) by chris_se (subscriber, #99706)

> However, I'm not too big of a fan of this argument:
> > I cannot fathom the reasoning why anyone would want to standardize a pure PQC algorithm at this point in time. There has been _extensive_ cryptanalysis on both RSA and EC, and we can be quite confident that they will not be broken by classical computers during any of our lifetimes.
>
> First, because this is true for every "new" algorithm – if we go for this logic, we would still use mandatory RSA+ECC combinations for every handshake.

ECC isn't new; it's just a couple of years younger than RSA. But look at adoption: NIST recommended some ECC curves back in 1999, yet adoption of ECC only really started in the 2010s (yes, the first TLS standard with ECC was already published in 2006, but it was extremely niche before ~2010), and OpenSSH only started supporting ECC in 2014, for example.

(And most (all?) CAs still use RSA for their signatures nowadays, btw. For example, LWN's own certificate is signed with RSA.)

> Or even better, RSA+P256+Curve25519. Or SHA2+SHA3+Blake2. Or AES-GCM+ChachaPoly(+HMAC for additional security).

Your examples don't really make that much sense for this discussion in my eyes. The reason one wants to use ECC over RSA is primarily performance and key size. Sure, there are some benefits of newer ECC constructions that make it harder to mess up an implementation (doing RSA right, especially w.r.t. padding, is highly non-trivial), but fundamentally the security guarantees of RSA and ECC are on the same level (albeit with vastly different key sizes). Therefore a hybrid RSA + ECC doesn't help you, because you lose the benefits of ECC for no real gain. So with RSA or ECC it was always either-or: either you were conservative and used RSA, or you trusted ECC enough to be willing to adopt it in the early days.

PQC algorithms are fundamentally different: ideally we want to switch to a scheme that _has_ to include a PQC algorithm as soon as possible, for forward secrecy. Classical-only schemes should disappear as fast as possible. And in that scenario having a hybrid scheme is much more sensible.
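The core idea of such a hybrid is simple: derive the session keys from both shared secrets, so the result stays secure as long as either component holds. A minimal sketch of that combiner step (the real TLS hybrid design concatenates the shared secrets and feeds them into the TLS 1.3 key schedule; the salt label below is made up for illustration):

```python
import hashlib
import hmac

def combine_shared_secrets(classical_ss: bytes, pq_ss: bytes) -> bytes:
    """Mix both key-exchange outputs into one session secret.

    An attacker must recover BOTH inputs to predict the output: as long
    as either the classical or the PQ exchange is unbroken, the combined
    secret stays unpredictable.
    """
    # HKDF-Extract-style step (RFC 5869): concatenate, then mix via HMAC.
    ikm = classical_ss + pq_ss
    salt = b"hybrid-kex-demo"  # hypothetical label, not a TLS constant
    return hmac.new(salt, ikm, hashlib.sha256).digest()

ecdh_ss = b"\x01" * 32   # stand-in for an X25519 shared secret
mlkem_ss = b"\x02" * 32  # stand-in for an ML-KEM-768 shared secret

k = combine_shared_secrets(ecdh_ss, mlkem_ss)
# Changing either input completely changes the derived key.
assert k != combine_shared_secrets(b"\x03" * 32, mlkem_ss)
assert k != combine_shared_secrets(ecdh_ss, b"\x03" * 32)
```

Because the two secrets are mixed through a PRF rather than, say, XORed, breaking one component gives the attacker no structural handle on the output.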

(Btw. even with ECC we learned a lot after it was initially introduced. The first ECC signature scheme was ECDSA, which has the same potentially fatal private-key leak as DSA if the implementation doesn't handle nonces properly; only later did people propose better schemes that don't use a DSA-like linear combination.)
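That DSA-style leak is easy to demonstrate with plain modular arithmetic: if the same nonce k signs two messages, the private key falls out of the two s-values. A toy sketch over the group order only (n, d, k, and r below are made-up demo values; in real ECDSA r would be the x-coordinate of k·G and thus identical for both signatures, which is exactly what betrays the reuse):

```python
n = 2**61 - 1          # toy prime standing in for the curve's group order
d = 123456789          # "private key"
k = 987654321          # nonce, fatally reused for both signatures
r = 1122334455         # models the x-coordinate of k*G (same for both)

def sign(h: int) -> int:
    # DSA/ECDSA s-component: s = k^-1 * (h + r*d) mod n
    return (pow(k, -1, n) * (h + r * d)) % n

h1, h2 = 31337, 42424242          # two different message hashes
s1, s2 = sign(h1), sign(h2)

# The attacker sees (h1, r, s1) and (h2, r, s2).
# Subtracting the equations eliminates d:  s1 - s2 = k^-1 * (h1 - h2)
k_rec = ((h1 - h2) * pow(s1 - s2, -1, n)) % n
# Back-substituting into  s1 = k^-1 * (h1 + r*d)  recovers the key:
d_rec = ((s1 * k_rec - h1) * pow(r, -1, n)) % n

assert k_rec == k and d_rec == d   # full private key recovered
```

This is why deterministic nonces (RFC 6979) and schemes like EdDSA, which derive the nonce from the message and key, were such an improvement.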

> I'm exaggerating here a bit for the sake of the argument; but IMO there's some truth in it, as with that logic, you will never evolve. And evolution and "blessing" of new algorithms is important here, or we'll end up with another SHA3 nobody uses – and nobody will analyze anymore in 5 or 10 years.

Your example contradicts you here: SHA3 has been blessed officially – and still has a very low adoption rate. (Though that is mostly inertia in my opinion: no new attacks on SHA2 have been discovered, so there's no pressure to switch. Even worse, git is still in the process of migrating away from SHA1, and I see plenty of websites that still provide MD5 hashes of downloads. *shudder*)

> > While there has been a lot of cryptanalysis done on PQC algorithms, especially in the last years, they have not yet been vetted even remotely as well as classical algorithms
> But when has enough cryptanalysis been done for normal use? NTRU has been around since 1996 – that makes almost 30 years for lattice-based cryptography by now. LWE and LWE-class problems have been known since 2005, which makes 20 years. McEliece has been around since 1978 – that's close to 50(!) years by now, and seven years older than the earliest ECC suggestions.

I think this is misleading. Because the number of bytes required by these proposed algorithms (both for the keys themselves and in transit) is so large, they were not analyzed even remotely as closely in that time as e.g. ECC. This has obviously changed since then, but age alone is not a pure indicator. Most cryptanalysis on PQC algorithms has happened in the last 10 years.

> That's one of the reasons why others are even talking about FUD here – ML-KEM has seen _extensive_ cryptanalysis over a long time, and the underlying mathematical problem is old enough to be considered well-known by today's standards. Additionally, due to the nature of the contest, those 7 years have resulted in a really immense amount of concentrated cryptanalysis and mathematical research, all done with today's knowledge and quality standards.

And it's great that so much cryptanalysis has happened for the various PQC algorithms, and I'm very much in favor of pushing these algorithms forward. But RSA (and, though less so, ECC) is _much_ simpler than any PQC algorithm I've seen, so I don't think one should be comfortable _at this point in time_.

I do want those algorithms to become part of standards (so that people are still motivated to analyze them going forward), and in 10 years' time I could easily see the standards bodies revisiting this. (Or once a quantum computer exists that can break ECC, if that happens earlier.)

But we already have high-performance, robust, side-channel-free ECC implementations available _right now_; ECC keys are small and computations are fast. Compared to the key sizes, computation requirements, and message sizes of any PQC algorithm, the ECC overhead is trivial. (I think a hybrid scheme with RSA would not be quite as clear-cut as a hybrid scheme with ECC.)

Bernstein's Blog

Posted Dec 9, 2025 23:42 UTC (Tue) by muase (subscriber, #178466)

> ECC isn't new, it's just a couple of years younger than RSA

I know; that was not so well worded on my side. My argument was: if we had applied the "you cannot trust these young algorithms" logic consistently – making hybrid _mandatory_ without alternative – we would have ended up with all kinds of combined schemes that'd still pop up as zombies everywhere.

> Your examples don't really make that much sense for this discussion in my eyes [...]

I see your point, but ECC was also about improving security. For a long time, the main benefit of ECC was that you could easily upgrade to a security level that was impractical to achieve with RSA or DH (the speed race came later); and a hybrid scheme of RSA-1024 + P-256 would have been a significant security improvement over just RSA-1024, while still providing the fallback in case ECC had been broken.

But the entire area was very different back then, so maybe you're right and it's not a good example^^

> This has obviously changed since then, but the age alone is not a pure indicator. Most cryptanalysis on PQC algorithms has happened in the last 10 years.

Yes and no. Yes, time is not a pure indicator, but that works in both directions: cryptanalysis has gotten so much better (in methods and quality) that the eras are not really comparable. Within the last few years we learned more about the PQC algorithms than we learned about older algorithms in decades, simply because the field has evolved incredibly fast (and because of the internet, knowledge accumulation, and much better tooling, formal models, proofs, etc.). It's safe to say that we know much more about ML-KEM now than we did about AES or ECC when they became adopted.

I mainly included the argument because everyone is always talking about "young" and super-new algorithms; I wanted to push back on that a bit. The algorithms and the math are not as young as many people think, and also not as green with regard to cryptanalysis as many people seem to believe.

> PQC algorithms are fundamentally different: ideally we want to switch to a scheme that _has_ to include a PQC algorithm as soon as possible, for forward secrecy. Classical-only schemes should disappear as fast as possible. And in that scenario having a hybrid scheme is much more sensible.

Here we are in full agreement I think; like I said, IMO a combined scheme is a very reasonable default. I'm totally not opposing a combined scheme, I just don't think it makes sense to oppose an additional and _optional_ PQ-only ciphersuite either.

It's simply not an either-or; and – here we might be in disagreement? – I think ML-KEM is definitely mature enough to deserve its own dedicated cipher suite. Let's call it "experimental" or "special interest" – but I think we should define it before others come along with proprietary schemes, extensions, or custom incompatible suites. Nobody needs that^^

Bernstein's Blog

Posted Dec 10, 2025 9:47 UTC (Wed) by chris_se (subscriber, #99706)

> > ECC isn't new, it's just a couple of years younger than RSA
>
> I know; that was not so well worded on my side. My argument was: if we had applied the "you cannot trust these young algorithms" logic consistently – making hybrid _mandatory_ without alternative – we would have ended up with all kinds of combined schemes that'd still pop up as zombies everywhere.

I think things were a lot different back then. The main difference is that we've learned that the "the more the merrier" approach to standardizing cipher suites is actually detrimental to security.

But I also disagree with you here, I personally wouldn't have thought it a bad idea if the first introduction of ECC in standards in ~2005 had been a hybrid scheme. In ~2015, different story.

> I see your point, but ECC was also about improving security. For a long time, the main benefit of ECC was that you could easily upgrade to a security level that was impractical to achieve with RSA or DH (the speed race came later);

"impractical" == too slow you mean? Still counts as performance in my eyes.

> It's safe to say that we know much more about ML-KEM now than we did about AES or ECC when both became adopted.

That still doesn't touch my argument that all PQC schemes I've looked at so far are more complicated in their construction than RSA and even ECC.

> It's simply not an either-or; and – here we might be in disagreement? – I think ML-KEM is definitely mature enough to deserve its own dedicated cipher suite. Let's call it "experimental" or "special interest" – but I think we should define it, before others come along with proprietary schemes and extensions or custom incompatible suites etc. Nobody needs that^^

As I've said in another part of this thread: I'm not 100% opposed to standardizing ML-KEM alone to avoid proprietary messes, if it's clearly marked as "do not use unless you really really know what you are doing". But I don't think it's been analyzed enough (due to its complexity) that I'd be comfortable in making this just an optional thing with even just a warning - I'd want this to be actively discouraged and the standard should indicate that it must be disabled by default unless configured otherwise. Give it another 10 years, and more experience in the field with it (especially when it also comes to side channels), and assuming nobody's broken it by then, I'd be happy to go ML-KEM only.

Bernstein's Blog

Posted Dec 11, 2025 0:36 UTC (Thu) by rgmoore (✭ supporter ✭, #75)

> I'd want this to be actively discouraged and the standard should indicate that it must be disabled by default unless configured otherwise.

It also needs some kind of protection against downgrade attacks. If you're going to officially discourage its use, there had better not be a way for an attacker to force people to use it in place of more trustworthy algorithms.

Bernstein's Blog

Posted Dec 11, 2025 12:20 UTC (Thu) by kleptog (subscriber, #1183)

> But I don't think it's been analyzed enough (due to its complexity)

I think the perceived complexity is related to unfamiliarity. ECC relies on fancy properties of groups, which we know are not PQ-safe but are well known in the crypto community. ML-KEM relies on some linear algebra and probability theory, which to me sounds a lot less magic than ECC. Linear algebra and probability theory are some of the most studied areas of mathematics, due to their ubiquitous use everywhere.

> that I'd be comfortable in making this just an optional thing

But whose comfort should we be listening to?

FWIW, I think standardizing a pure-PQ algorithm is a good idea, because then we can move on to the next phase, namely algorithm implementations. Even though we've been using ECC for ages, making a side-channel-free implementation is still hard, and we need to get the ball rolling on that now, not wait another ten years. It'll probably be at least ten years before any implementation is sufficiently available that people can even think about using it for public sites.

Bernstein's Blog

Posted Dec 11, 2025 12:46 UTC (Thu) by brunowolff (guest, #71160)

Lattices could use some more study. SIKE went from people thinking it was fine to completely broken not too long ago. There isn't a reason to take that risk now.

Implementors don't need a PQ-only version of ML-KEM in the TLS standard to start implementing ML-KEM; there are already implementations. Also, timing attacks are taken a lot more seriously now than when AES came about. People are already doing constant-time implementations and checking them. In fact there was a screw-up, KyberSlash, where a division used secret data; it was found and corrected not too long ago. There are libraries to help people get this right on different hardware and compilers.
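The KyberSlash class of bugs is instructive: ML-KEM's compression step rounds a secret coefficient x to round(2^d·x/q) mod 2^d, and writing that with a `/` lets the compiler emit a division instruction whose latency can depend on the secret operand on some CPUs. The usual fix replaces the division with a multiply-and-shift by a precomputed reciprocal. A sketch of the arithmetic (the 2^40 precision here is an illustrative choice, not what any particular library uses; the exhaustive loop checks the two forms agree):

```python
Q = 3329                       # ML-KEM modulus
SHIFT = 40
M = -(-2**SHIFT // Q)          # ceil(2^SHIFT / Q), precomputed reciprocal

def compress_divide(x: int, d: int) -> int:
    # Textbook compression: round(2^d * x / q) mod 2^d.
    # In C, the '/' here may compile to a variable-time division
    # on the secret value x -- the KyberSlash issue.
    return (((x << d) + Q // 2) // Q) % (1 << d)

def compress_const_time(x: int, d: int) -> int:
    # Same result via multiply-and-shift: no division instruction,
    # so the timing no longer depends on the secret coefficient.
    n = (x << d) + Q // 2
    return ((n * M) >> SHIFT) % (1 << d)

# The two agree for every coefficient and the compression widths
# used by the ML-KEM parameter sets.
for d in (1, 4, 5, 10, 11):
    for x in range(Q):
        assert compress_divide(x, d) == compress_const_time(x, d)
```

The same caveat about compilers applies in reverse: writing the multiply-and-shift in C is only half the battle, since one still has to verify that the generated machine code is branch- and division-free, which is exactly what the checking tools mentioned above are for.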


Copyright © 2026, Eklektix, Inc.
Comments and public postings are copyrighted by their creators.
Linux is a registered trademark of Linus Torvalds