
Disagreements over post-quantum encryption for TLS

By Daroc Alden
December 8, 2025

The Internet Engineering Task Force (IETF) is the standards body responsible for the TLS encryption standard — which your browser is using right now to allow you to read LWN.net. As part of its work to keep TLS secure, the IETF has been entertaining proposals to adopt "post-quantum" cryptography (that is, cryptography that is not known to be easily broken by a quantum computer) for TLS version 1.3. Discussion of the proposal has exposed a large disagreement between participants who worried about weakened security and others who worried about weakened marketability.

What is post-quantum cryptography?

In 1994, Peter Shor developed Shor's algorithm, which can use a quantum computer to factor large numbers asymptotically faster (i.e. faster by a proportion that grows as the size of the input does) than a classical computer can. This was a huge blow to the theoretical security of the then-common RSA public-key encryption algorithm, which depends on the factoring of numbers being hard in order to guarantee security. Later work extended Shor's algorithm to apply to other key-exchange algorithms, such as elliptic-curve Diffie-Hellman, the most common key-exchange algorithm on the modern internet. There are doubts that any attack using a quantum computer could actually be made practical — but given that the field of cryptography moves slowly, it could still be worth getting ahead of the curve.

Quantum computing is sometimes explained as trying all possible answers to a problem at once, but that is incorrect. If that were the case, quantum computers could trivially break any possible encryption algorithm. Instead, quantum computers work by applying a limited set of transformations to a quantum state that can be thought of as a high-dimensional unit-length vector. The beauty of Shor's algorithm is that it shows how to use these extremely limited operations to reliably factor numbers.

The study of post-quantum cryptography is about finding an encryption mechanism that none of the generalizations of Shor's algorithm or related quantum algorithms apply to: finding encryption techniques where there is no known way for a quantum computer to break them meaningfully faster than a classical computer can. While attackers may not be breaking encryption with quantum computers today, the worry is that they could use a "store now, decrypt later" attack to break today's cryptography with the theoretically much more capable quantum computers of tomorrow.

For TLS, the question is specifically how to make a post-quantum key-exchange mechanism. When a TLS connection is established, the server and client use public-key cryptography to agree on a shared encryption key without leaking that key to any eavesdroppers. Then they can use that shared key with (much less computationally expensive) symmetric encryption to secure the rest of the connection. Current symmetric encryption schemes are almost certainly not vulnerable to attack by quantum computers because of their radically different design, so the only part of TLS's security that needs to upgrade to avoid attacks from a quantum computer is the key-exchange mechanism.
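The key-agreement step described above can be illustrated with a toy Diffie-Hellman exchange. This is a sketch for intuition only: real TLS 1.3 uses elliptic-curve groups (such as X25519) and the HKDF-based key schedule, and the tiny prime below offers no security whatsoever.

```python
# Toy Diffie-Hellman key agreement, illustrating how a TLS-style
# handshake derives a shared symmetric key without sending it over
# the wire. Illustrative only: the group is far too small to be safe.
import hashlib
import secrets

P = 0xFFFFFFFB  # a small prime (insecure; real deployments use
                # 2048-bit-plus groups or elliptic curves)
G = 5           # generator

def keypair():
    priv = secrets.randbelow(P - 2) + 1
    pub = pow(G, priv, P)
    return priv, pub

# Each side generates a key pair and transmits only the public half.
a_priv, a_pub = keypair()
b_priv, b_pub = keypair()

# Both sides arrive at the same shared secret from the peer's public
# value; an eavesdropper sees only a_pub and b_pub.
a_shared = pow(b_pub, a_priv, P)
b_shared = pow(a_pub, b_priv, P)
assert a_shared == b_shared

# The shared secret is then hashed into a symmetric key that protects
# the rest of the connection.
key = hashlib.sha256(a_shared.to_bytes(8, "big")).digest()
```

Shor's algorithm attacks exactly this step: recovering `a_priv` from `a_pub` is a discrete-logarithm problem, easy for a quantum computer but (at realistic sizes) infeasible classically.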

Belt and suspenders

The problem, of course, is that trying to come up with novel, hard mathematical problems that can be used as the basis of an encryption scheme does not always work. Sometimes, cryptographers will pose a problem believing it to be sufficiently hard, and then a mathematician will come along and discover a new approach that makes attacking the problem feasible. That is exactly what happened to the SIKE protocol in 2022. Even when a cryptosystem is not completely broken, a particular implementation can still suffer from side-channel attacks or other problematic behaviors, as happened with the post-quantum encryption standard Kyber/ML-KEM multiple times from its initial draft in 2017 to the present.

That's why, when the US National Institute of Standards and Technology (NIST) standardized Kyber/ML-KEM as its recommended post-quantum key-exchange mechanism in August 2024, it provided approved ways to combine a traditional key-exchange mechanism with a post-quantum key-exchange mechanism. When these algorithms are properly combined (which is not too difficult, although cryptographic implementations always require some care), the result is a hybrid scheme that remains secure so long as either one of its components remains secure.
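The "secure as long as either component is secure" property comes from feeding both shared secrets through a key-derivation function. The sketch below is a simplification under stated assumptions: the IETF hybrid design concatenates the classical and post-quantum shared secrets inside TLS's HKDF-based key schedule, while this example stands in plain SHA-256 for that machinery, and the two input secrets are placeholder bytes rather than real X25519 or ML-KEM outputs.

```python
# Simplified hybrid key-exchange combiner: concatenate the classical
# and post-quantum shared secrets and derive the session key from
# both. The derived key stays unpredictable as long as EITHER input
# secret remains unknown to the attacker.
import hashlib

def combine(classical_ss: bytes, pq_ss: bytes, transcript: bytes) -> bytes:
    # Binding in the handshake transcript ties the key to this
    # particular session, as TLS's key schedule does.
    return hashlib.sha256(classical_ss + pq_ss + transcript).digest()

ecdh_secret = b"\x01" * 32   # placeholder for an X25519 shared secret
mlkem_secret = b"\x02" * 32  # placeholder for an ML-KEM shared secret
key = combine(ecdh_secret, mlkem_secret, b"handshake-transcript")
```

If a future quantum computer breaks the elliptic-curve component, the ML-KEM secret still randomizes the hash input; if ML-KEM's underlying math falls instead, the elliptic-curve secret does the same job. Only breaking both recovers the session key.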

The Linux Foundation's Open Quantum Safe project, which provides open-source implementations of post-quantum cryptography, fully supports this kind of hybrid scheme. The IETF's initial draft recommendation in 2023 for how to use post-quantum cryptography in TLS specifically said that TLS should use this kind of hybrid approach:

The migration to [post-quantum cryptography] is unique in the history of modern digital cryptography in that neither the traditional algorithms nor the post-quantum algorithms are fully trusted to protect data for the required data lifetimes. The traditional algorithms, such as RSA and elliptic curve, will fall to quantum cryptanalysis, while the post-quantum algorithms face uncertainty about the underlying mathematics, compliance issues (when certified implementations will be commercially available), unknown vulnerabilities, hardware and software implementations that have not had sufficient maturing time to rule out classical cryptanalytic attacks and implementation bugs.

During the transition from traditional to post-quantum algorithms, there is a desire or a requirement for protocols that use both algorithm types. The primary goal of a hybrid key exchange mechanism is to facilitate the establishment of a shared secret which remains secure as long as one of the component key exchange mechanisms remains unbroken.

But the most recent draft from September 2025, which was ultimately adopted as a working-group document, relaxes that requirement, noting:

However, Pure PQC Key Exchange may be required for specific deployments with regulatory or compliance mandates that necessitate the exclusive use of post-quantum cryptography. Examples include sectors governed by stringent cryptographic standards.

This refers to the US National Security Agency (NSA) requirements for products purchased by the US government. The requirements "will effectively deprecate the use of RSA, Diffie-Hellman (DH), and elliptic curve cryptography (ECDH and ECDSA) when mandated." The NSA has a history of publicly endorsing weak (plausibly already broken, internally) cryptography in order to make its job — monitoring internet communications — easier. If the draft were to become an internet standard, the fact that it optionally permits the use of non-hybrid post-quantum cryptography might make some people feel that such cryptography is safe, when that is not the current academic consensus.

There are other arguments for allowing non-hybrid post-quantum encryption — mostly boiling down to the implementation and performance costs of supporting a more complex scheme. But when Firefox, Chrome, and the Open Quantum Safe project all already support and use hybrid post-quantum encryption, that motivation didn't ring true for other IETF participants.

Some proponents of the change argued that supporting non-hybrid post-quantum encryption would be simpler, since a non-hybrid encryption scheme would be simpler than a hybrid one. Opponents said that was focusing on the wrong kind of simplicity; adding another method of encryption to TLS makes implementations more complex, not less. They also pointed to the cost of modern elliptic-curve cryptography as being so much smaller than the cost of post-quantum cryptography that using both would not have a major impact on the performance of TLS.

From substance to process

The disagreement came to a head when Sean Turner, one of the chairs of the IETF working group discussing the topic, declared in March 2025 that consensus had been reached and the proposal ought to move to the next phase of standardization: adoption as a working-group document. Once a draft document is adopted, it enters a phase of editing by the members of the working group to ensure that it is clearly written and technically accurate, before being sent to the Internet Engineering Steering Group (IESG) to possibly become an internet standard.

Turner's decision to adopt the draft came as a surprise to some of the participants in the discussion, such as Daniel J. Bernstein, who strongly disagreed with weakening the requirements for TLS 1.3 to allow non-hybrid key-exchange mechanisms and had repeatedly said as much. The IETF operates on a consensus model where, in theory, objections raised on the mailing list need to be responded to and either refuted or used to improve the standard under discussion.

In practice, the other 23 participants in the discussion acknowledged the concerns of the six people who objected to the inclusion of non-hybrid post-quantum key-exchange mechanisms in the standard. The group that wanted to see the draft accepted just disagreed that it was an important weakening in the face of regulatory and maintenance concerns, and wanted to adopt the standard as written anyway.

From there, the discussion turned on the question of whether the working-group charter allowed for adopting a draft that reduced the security of TLS in this context. That question never reached a consensus either. After repeated appeals from Bernstein over the next several months, the IESG, which handles the IETF's internal policies and procedures, asked Paul Wouters and Deb Cooley, the IETF's area directors responsible for the TLS working group, whether Turner's declaration of consensus had been made correctly.

Wouters declared that Turner had made the right call, based on the state of the discussion at the time. He pointed out that while the draft permits TLS to use non-hybrid post-quantum key-exchange algorithms, it doesn't recommend them: the recommendation remains to use the hybrid versions where possible. He also noted that the many voices calling for adoption indicated that there was a market segment being served by the ability to use non-hybrid algorithms.

A few days after Wouters's response, on November 5, Turner called for last objections to adopting the draft as a working-group document. Employees of the NSA, the United Kingdom's Government Communications Headquarters (GCHQ), and Canada's Communications Security Establishment (CSE) all wrote in with their support, as did employees of several companies working on US military contracts. Quynh Dang, an employee of NIST, also supported publication as a working-group document, while saying that this support did not represent NIST's position in the matter. Among others, Stephen Farrell disagreed, calling for the standard to at least add language addressing the fact that security experts in the working group thought that the hybrid approach was more secure: "Absent that, I think producing an RFC based on this draft provides a misleading signal to the community."

As it stands now, the working group has adopted the draft that allows for non-hybrid post-quantum key-exchange mechanisms to be used in TLS. According to the IETF process, the draft will now be edited by the working-group members for clarity and technical accuracy, before being presented to the IESG for approval as an internet standard. At that point, companies wishing to sell their devices and applications to the US government will certainly enable the use of these less-secure mechanisms — and be able to truthfully advertise their products as meeting NIST, NSA, and IETF standards for security.

[ Thanks to Thomas Dalichow for bringing this topic to our attention. ]




Bernstein's Blog

Posted Dec 8, 2025 19:24 UTC (Mon) by hDF (subscriber, #121224) [Link] (44 responses)

great extensive writeup of this on djb's blog at https://blog.cr.yp.to/

Bernstein's Blog

Posted Dec 9, 2025 11:43 UTC (Tue) by muase (subscriber, #178466) [Link] (43 responses)

While it's never wrong to hear the opinions of Bernstein when it comes to cryptography, I'd take his stances regarding everything related to PQC standardization with a grain of salt.

In the mailing lists, there were several occasions where he tried to derail the discussion with weird arguments, going so far that other colleagues accused him of spreading FUD. There were multiple occasions where he repeatedly failed to explain his positions in a sound way (aka they simply didn't make much sense), and instead of clarifying he tried to derail the discussion instead. Quite a bit of his PQC criticism is more of a minority opinion, and not necessarily consensus/status quo in the cryptographic community.

All in all, during PQC standardization Bernstein repeatedly did not argue in good faith – that doesn't necessarily mean that he's wrong; but readers should be aware of that.

Links:
https://groups.google.com/a/list.nist.gov/g/pqc-forum/c/S...
https://groups.google.com/a/list.nist.gov/g/pqc-forum/c/G...
https://groups.google.com/a/list.nist.gov/g/pqc-forum/c/C...
https://groups.google.com/a/list.nist.gov/g/pqc-forum/c/W...

There are others; but those are the most-prominent ones I remember that caused the WTF-moments here.

Bernstein's Blog

Posted Dec 9, 2025 14:31 UTC (Tue) by chris_se (subscriber, #99706) [Link] (41 responses)

I don't know enough about the mathematical details of cryptography to know which side I'd be on in the discussions you linked. From reading parts of the threads, what is clear to me is that djb espouses a very "me vs NIST" mentality here - but I don't know even remotely enough about the history of these discussions to say whether he has a good reason for that or is just being needlessly contrarian.

What I can say though is the following: on the substance of hybrid vs. pure I'm fully on djb's side here. I cannot fathom the reasoning why anyone would want to standardize a pure PQC algorithm at this point in time. There has been _extensive_ cryptanalysis on both RSA and EC, and we can be quite confident that they will not be broken by classical computers during any of our lifetimes. While a lot of cryptanalysis has been done on PQC algorithms, especially in the last few years, they have not yet been vetted even remotely as well as classical algorithms. Look at the number of PQC algorithm candidates proposed by extremely knowledgeable and intelligent people that have since been broken to such an extent that they aren't an obstacle to even a classical computer. This gives me a lot of pause in relying _solely_ on a PQC algorithm.

I think it's great that we are already thinking about PQC, because at some point someone will build a capable enough quantum computer. But until that happens, I think it's grossly negligent to switch to a pure PQC algorithm _at this point in time_, because until then we can be sure that the classical algorithm will provide the necessary guarantees, even if the specific PQC algorithm is broken.

Bernstein's Blog

Posted Dec 9, 2025 15:23 UTC (Tue) by geofft (subscriber, #59789) [Link] (19 responses)

I think I saw an argument on the mailing list that one large company wants to use TLS with a PQ-only algorithm internal to their data centers, and as I understand it this form of "standardization" would simply give it a constant identifier for use with TLS, so they could contribute such implementations to publicly-reviewed OSS libraries and expect interoperability between suitably configured libraries. From that perspective, it can be argued that it's hard to fathom why one would want to prohibit others from using this, as that is the only technical effect of refusing to advance this standard.

I do agree, broadly, that this is an unusual choice and people who do not have multiple qualified cryptographers available to advise them should generally not make this choice, and in particular it's not the right choice for anyone in a fiduciary position over someone else's communication security (e.g. web browsers, default configs of OSS libraries), only for people who can appropriately sign up for the consequences to their own communication security. So I think the decision to hold off on advancing the document until there's clear text to this effect is a wise one.

But at the same time, I think this argument is so obvious that anyone (you, me, etc.) could make it, and djb would better serve himself by understanding this, encouraging someone else to take up the cause of making the point, and disengaging personally. For that reason I am also sympathetic to what is being called the "pro-censorship" position - the IETF, unlike a government, has no power to restrict what djb or anyone else writes on a blog, posts on social media or in a comment section like this one, emails a colleague privately, publishes in a journal, etc. If a particular technical argument is sound, surely at least one other person can understand it and convey it, so unlike e.g. political debate, it doesn't seem to me there's a reason to act like every individual person has a fundamental right to participate. And it is easier for a technical point to be seen through to resolution if it is not drowned in what has clearly turned into interpersonal conflict.

The IETF also has no power to control what people think of it and the extent to which people trust it. News sites like this one will report on their decisions. Governments that have laws or regulations about IETF standards can change their laws or regulations. If the IETF decides to be so ruthless in "censorship" that this affects the technical quality of its work, people will see that and respond—just as people did see that Dual_EC_DRBG was bonkers despite coming from NIST and almost everyone responded appropriately.

Bernstein's Blog

Posted Dec 9, 2025 18:52 UTC (Tue) by brunowolff (guest, #71160) [Link] (16 responses)

He isn't trying to prohibit stupid people from insisting on doing stupid things, he is trying to prevent people who don't know any better from doing stupid things. His argument is that people who see that a PQ only standard exists are going to think it is safe to use, because otherwise why would it be a standard.
It is way too early to be fully trusting PQ only algorithms. But because there are organizations recording all of the data now, hoping to be able to decrypt it later with PQ computers, it makes sense to use hybrids to try to do something about that, even though it might not work.

Bernstein's Blog

Posted Dec 10, 2025 0:01 UTC (Wed) by hailfinger (subscriber, #76962) [Link] (15 responses)

> He isn't trying to prohibit stupid people from insisting on doing stupid things, he is trying to prevent people who don't know any better from doing stupid things. His argument is that people who see that a PQ only standard exists are going to think it is safe to use, because otherwise why would it be a standard.

This argument postulates that some people are clever enough to look for a standard and read it, but that the same people are way too stupid to understand what's written in the standard, and that those very same people will be misled. That argument sounds surprisingly like a religion which wants to prevent people from straying from the true path of enlightenment. It also reminds me of the "think of the children!" mind trick.

Bernstein's Blog

Posted Dec 10, 2025 0:49 UTC (Wed) by Wol (subscriber, #4433) [Link] (7 responses)

You're missing the fact that advertisers rarely have a clue about what they're advertising. They will quite happily advertise "Our crypto is stronger because we use a PQ algorithm", and the PHBs will insist on it because they believe the hype.

If the standard says that pure-PQ MUST be disabled by default, then the security regulators should have enough sense to wallop such PHBs with a clue-by-four.

Seriously. Don't underestimate the stupidity of your average PHB. What's that saying? "There's no-one harder to educate than someone whose livelihood depends on their not understanding ..."

(My biggest bugbear is those people who claim to be eco-friendly because normal farming activity (usually of crop trees) "mops up carbon dioxide". No it doesn't! It's just part of the normal carbon cycle and makes absolutely NO difference to global warming!)

Cheers,
Wol

Bernstein's Blog

Posted Dec 10, 2025 9:47 UTC (Wed) by farnz (subscriber, #17727) [Link] (6 responses)

It's trivial for the algorithm to be named something like "experimental_insecure_pqc_algoname" (e.g. "experimental_insecure_NTRU_enc") in the standard, and to reserve a number for it. Then, if it's later determined to be secure, it can have the name "pqc_algoname" (e.g. "NTRU_encrypt"), with "experimental_insecure_pqc_algoname" as a deprecated alias for it.

If a PHB then says "our crypto is stronger because we use experimental_insecure_NTRU_enc", then they're likely to have regulators and other PHBs alike point out their error.

Bernstein's Blog

Posted Dec 10, 2025 16:07 UTC (Wed) by hailfinger (subscriber, #76962) [Link] (5 responses)

Are you proposing labeling an algorithm as insecure on the basis that you don't feel comfortable with it? "Insecure" has a very specific meaning, and "not old/proven enough" is not it.
Algorithm IDs and names are descriptive of what the algorithm is, not assessments of its security.

Bernstein's Blog

Posted Dec 10, 2025 16:22 UTC (Wed) by farnz (subscriber, #17727) [Link] (4 responses)

I'm asserting that it's possible for the standard to name an algorithm such that anyone using it without fully understanding the implications of that decision gets ridiculed by their peers and people they respect for doing so, even if none of them have an understanding of cryptography.

The precise name you choose to give it for now is a detail of that - but standing up and saying "we use experimental_possibly_insecure_enc_ntru1 cryptography for post-quantum security" will get you laughed at, in a way that "we use NTRUEncrypt for post-quantum security" will not. And that's enough to let the people who really understand what they're doing experiment with PQC in the open (thus getting us experience of practical gotchas as well as cryptographic faults), while stopping the clueless from using it because "obviously" pure PQC is better than hybrid PQC, right?

Bernstein's Blog

Posted Dec 10, 2025 23:19 UTC (Wed) by hailfinger (subscriber, #76962) [Link] (3 responses)

But why do you want to label an algorithm that way? Why do you want people to (quoting you) "gets ridiculed by their peers"?
This obsession with labeling some algorithm as insecure or (alternatively) not having that algorithm in a standard is really extreme.

It's structurally similar to the fight against schools teaching undesirable topics.

However, if you think that labeling algorithms as "insecure" without proof of actual insecurity is okay, then anybody may request the same labeling for RSA and any elliptic curve algorithms. You know what? That's a great idea! Let's just label all the algorithms as insecure because there is at least one person per algorithm not trusting that algorithm. Sure, that defeats the purpose of labeling in the first place. However, the debate has long since shifted from debating actual merit to forcibly preventing the opponent from entering the playing field.

Bernstein's Blog

Posted Dec 11, 2025 0:12 UTC (Thu) by brunowolff (guest, #71160) [Link]

> This obsession with labeling some algorithm as insecure or (alternatively) not having that algorithm in a standard is really extreme.

You have a point about the silly name; but not including poor choices in standards isn't extreme, it is expected behavior.

Bernstein's Blog

Posted Dec 11, 2025 10:16 UTC (Thu) by farnz (subscriber, #17727) [Link] (1 responses)

Because labelling it is a compromise position between "we should not include this algorithm because it might be insecure, and if we include it, people who don't understand cryptography will use it anyway" and "we should include this algorithm so that we can see how it works in the real world".

If everyone agreed on including it, then we wouldn't need to label it. But some people say it shouldn't be included because it's "not yet proven secure, so must be treated as insecure, but people who shouldn't use it will get attracted by the name".

If everyone agreed it should be excluded, then we wouldn't need to label it. But some people say it shouldn't be excluded because it's "not yet proven insecure, and is useful in our environment".

Labelling it is one way to compromise between the two; it addresses the attractive nuisance side, because the name makes it clear that it's not what you want and should be disabled, while still leaving it in the standard for people who want to use it despite the unknown risks.

Bernstein's Blog

Posted Jan 5, 2026 11:55 UTC (Mon) by sammythesnake (guest, #17693) [Link]

I feel like there ought to be a status dual to "deprecated" that the PQ-only option could be in, how about "probationary"? In 5 years or whatever, that status could be reviewed and either removed, or changed to deprecated (or left unchanged, if that makes more sense)

Bernstein's Blog

Posted Dec 10, 2025 1:26 UTC (Wed) by brunowolff (guest, #71160) [Link] (6 responses)

> This argument postulates that some people are clever enough to look for a standard and read it,

No it doesn't. It postulates someone knows that there is a PQ-only standard and decides that they should use it. The person deciding this isn't necessarily even the person doing the implementation. They may be influenced by others to use it, similar to how RSA Security was bribed to include Dual EC and make it the default.

Bernstein's Blog

Posted Dec 10, 2025 15:37 UTC (Wed) by hailfinger (subscriber, #76962) [Link] (5 responses)

I fail to see why that would be a problem. We have the NULL cipher in so many standards and nobody raises a stink.
Yet for PQ algorithms there is an almost religious fight against using them standalone because some people feel that those algorithms are not proven enough.

Bernstein's Blog

Posted Dec 10, 2025 17:24 UTC (Wed) by brunowolff (guest, #71160) [Link] (4 responses)

People do make a stink about having NULL ciphers in protocols. They can make downgrade attacks easier and allow for people to think encryption is being used when it isn't.

Bernstein's Blog

Posted Jan 5, 2026 11:59 UTC (Mon) by sammythesnake (guest, #17693) [Link] (3 responses)

Additionally, and importantly, a PHB is a lot less likely to misunderstand "NULL encryption" as a GoodIdea™ than "Post Quantum Cryptography". *Something* to protect against that seems only sensible to me...

Bernstein's Blog

Posted Jan 5, 2026 12:58 UTC (Mon) by Wol (subscriber, #4433) [Link] (2 responses)

Until someone decides to call it ROT-26 :-)

Cheers,
Wol

Bernstein's Blog

Posted Jan 5, 2026 15:08 UTC (Mon) by amacater (subscriber, #790) [Link] (1 responses)

ROT52 - it's the only way to be sure and with two additional encryption rounds it's bound to be more secure.

Bernstein's Blog

Posted Jan 5, 2026 16:01 UTC (Mon) by paulj (subscriber, #341) [Link]

I'd also add 2 rounds of XOR encryption, so that if one algorithm is broken you still have the protection of the other algorithm. Very unlikely 2 algorithms would be broken at once!

Bernstein's Blog

Posted Dec 9, 2025 20:06 UTC (Tue) by chris_se (subscriber, #99706) [Link]

> I think I saw an argument on the mailing list that one large company wants to use TLS with a PQ-only algorithm internal to their data centers, and as I understand it this form of "standardization" would simply give it a constant identifier for use with TLS, so they could contribute such implementations to publicly-reviewed OSS libraries and expect interoperability between suitably configured libraries. From that perspective, it can be argued that it's hard to fathom why one would want to prohibit others from using this, as that is the only technical effect of refusing to advance this standard.

If that was the sole reason they could have added some text to the standard like "The non-hybrid algorithm is optional and its use is discouraged. If implemented, it MUST be disabled by default and MUST require explicit configuration to enable it and documentation of the software regarding this algorithm MUST mention that the standard discourages its use". i.e. not just encouraging to use hybrid algorithms (which appears to be the current "consensus") but explicitly making it clear that this was added for some corner cases and nobody who doesn't know any better should use this.

Bernstein's Blog

Posted Dec 9, 2025 23:21 UTC (Tue) by mcatanzaro (subscriber, #93033) [Link]

TLS 1.2 was a mess of ciphersuites. We learned from that. TLS 1.3 has only a few options. Turns out, fewer is better.

Bernstein's Blog

Posted Dec 9, 2025 19:09 UTC (Tue) by brunowolff (guest, #71160) [Link] (13 responses)

> I cannot fathom the reasoning why anyone would want to standardize a pure PQC algorithm at this point in time.

You mean, other than the obvious one, that the NSA is trying to weaken encryption, like we know they have done more than once (DES, Dual EC, IPSEC) in the past?

Bernstein's Blog

Posted Dec 9, 2025 19:55 UTC (Tue) by chris_se (subscriber, #99706) [Link] (2 responses)

> > I cannot fathom the reasoning why anyone would want to standardize a pure PQC algorithm at this point in time.
>
> You mean, other than the obvious one, that the NSA is trying to weaken encryption, like we know they have done more than once (DES, Dual EC, IPSEC) in the past?

Ok, let me rephrase this: s/anyone/anyone with an honest interest in widely available strong cryptography/ ;-)

That said, I don't think things are quite as clear-cut in this situation (in contrast to the egregious Dual EC DRBG case [*]); the following scenarios are possible in my opinion:

1. The NSA wants the new standard to be weak and can already break the specific PQC algorithm described here.
2. The NSA wants the new standard to be weak and believes it will be able to break the specific PQC algorithm described here very soon. (They found a weakness that they think with further development will allow them to break it, but can't do it just yet.)
3. The NSA wants the new standard to be weak and is gambling on the fact that the specific PQC algorithm will be broken in the near future (but could be completely wrong).
4. The NSA has some other interest in forcing this to be standardized (e.g. because they already specified this in a secret contract and have given a contractor a lot of money and want to ensure this becomes a standard) but don't want to explicitly weaken the new standard.
5. The NSA does not want to weaken the new standard, but some middle manager with a huge ego is forcing the other employees to do this, not for any technical reason, but because they decided this at some point and now want to ram this through.

If you think (1) or (2) is likely true, then that specific PQC algorithm should be avoided regardless of whether a pure or a hybrid application is used. Personally I think (3), (4), or (5) is the case, and I find any one of them quite plausible.

[*] The Dual EC DRBG case was unique because it gave the NSA the possibility to insert a back door into the algorithm without making the algorithm intrinsically insecure if you DIDN'T know the secret parameters used to create the back door. For the NSA there was no downside to this - nobody else would be able to break this, but they would. I don't think this applies in the case of the PQC algorithms that were standardized, because after the Dual EC DRBG fiasco people in crypto competitions have been very wary of specific magic numbers and I don't believe the current standardized PQC algorithms allow for such a type of back door. And while the NSA wants to be able to break all crypto themselves, they also have a vested interest in preventing other people from breaking it, which is why e.g. the DES situation was not as clear-cut as the Dual EC DRBG case, in that they did strengthen DES against specific attacks (while limiting key sizes to make brute force easier, because back then they did have an edge on compute power, which I don't believe they have anymore).

Bernstein's Blog

Posted Dec 9, 2025 21:55 UTC (Tue) by ballombe (subscriber, #9523) [Link] (1 responses)

You forget the more likely:

6. The NSA wants the new standard to require major change in sensitive code paths so that they can exploit bugs in the implementation independently of the strength of PQC.

Bernstein's Blog

Posted Dec 9, 2025 23:17 UTC (Tue) by dvdeug (subscriber, #10998) [Link]

Which strikes me as unlikely. The NSA probably has the best cryptanalysis in the world. It has computing resources only a large government could devote to cracking encryption. Does the NSA think it is worth making encryption vulnerable to Joe Schmoe with a brain and a hundred-dollar laptop (and the Russian mob and North Korea) just so it can get in?

Bernstein's Blog

Posted Dec 10, 2025 9:46 UTC (Wed) by farnz (subscriber, #17727) [Link] (9 responses)

The NSA as an organisation is schizophrenic about encryption strength.

One part of the NSA is tasked with monitoring communications, and has a vested interest in weakening the encryption used to a level that the NSA can break, while another part is tasked with making USA entities' (including private citizens) communications secure against foreign adversaries, and has a vested interest in making sure that even if China's Ministry of State Security successfully infiltrates the NSA and matches or exceeds their capabilities for a time, encryption is still too strong for the MSS to decrypt.

And that makes tracking what the NSA is doing extremely hard from the outside; we won't know for decades (if not centuries) which part of the NSA is pushing for a pure PQC algorithm. If it's the part that wants to break encryption, then we are right to reject it; but if it's the part that wants to keep USA communications secure against foreign intelligence agencies, and they know a way to break all known hybrid algorithms that they can't disclose because the monitoring side of the agency is actively exploiting it, then we do want pure PQC.

But it's impossible to tell which is which from the outside - and we've seen examples of both (NSA insisted on a change to DES that protected against a cryptanalysis process that wasn't yet publicly known, but NSA also insisted on specific constants in Dual_EC_DRBG that weakened it if you knew how the NSA had chosen those constants).

Bernstein's Blog

Posted Dec 10, 2025 11:03 UTC (Wed) by johill (subscriber, #25196) [Link] (1 responses)

Definitely true, though I think

> they know a way to break all known hybrid algorithms

is highly unlikely. There isn't even a single hybrid construction: you have hybrid KEMs (of use e.g. for TLS), hybrid PAKEs (likely of no use for TLS), and probably others; those are the two I've recently looked into (for 802.11.)

(KEM = Key Encapsulation Mechanism, PAKE = Password Authenticated Key Exchange)

Now from a TLS perspective you could argue that hybrid KEMs are the only interesting case, but even then it seems pretty hard to imagine that effectively doing two KEMs (say ECDH and ML-KEM) in parallel and mixing the results with a strong hash would result in something weaker than either portion alone. Even if one side was broken to the point of always returning zeroes, you can't predict the output of, say, a SHA-2 hash over the combination (which might be, but is not necessarily, simple concatenation.)
There's a current TLS draft (https://datatracker.ietf.org/doc/draft-ietf-tls-ecdhe-mlkem/) that's widely deployed which simply concatenates the two secrets. Obviously that only works if the full resulting secret is always used, but TLS always uses an HMAC (HKDF-Extract) to derive other keys from it.
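The "mix the two secrets with a strong hash" idea can be sketched in a few lines. This is a hedged illustration, not the draft's exact construction: the stand-in secret values, the all-zero salt, and the concatenation order are assumptions for demonstration; real TLS feeds the concatenated secret into its HKDF-based key schedule.

```python
# Toy sketch of combining two KEM shared secrets, hybrid-style,
# via HKDF-Extract (RFC 5869). Values are placeholders, not real keys.
import hmac
import hashlib

def hkdf_extract(salt: bytes, ikm: bytes) -> bytes:
    """HKDF-Extract: PRK = HMAC-SHA256(salt, IKM)."""
    return hmac.new(salt, ikm, hashlib.sha256).digest()

# Pretend outputs of the two parallel key exchanges.
ecdh_secret = b"\x11" * 32    # stand-in for an X25519 shared secret
mlkem_secret = b"\x22" * 32   # stand-in for an ML-KEM-768 shared secret

# The hybrid input keying material is simple concatenation.
prk = hkdf_extract(salt=b"\x00" * 32, ikm=ecdh_secret + mlkem_secret)

# Even if one side were broken to the point of always returning zeroes,
# the extracted key still depends on the other secret through HMAC-SHA256.
prk_broken_ecdh = hkdf_extract(salt=b"\x00" * 32,
                               ikm=b"\x00" * 32 + mlkem_secret)
assert prk != prk_broken_ecdh
```

The point of the final assertion is the "weakest link doesn't drag down the chain" property: an attacker who fully controls one input still cannot predict the HMAC output without the other.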

PAKEs are more complicated because ML constructions apparently don't have indistinguishability properties, so an attacker can statistically check offline whether or not they guessed the right password, so you have to be really careful when designing hybrid PAKEs to not make it weakest-of-both, for example a trivial "parallel + combine-by-hash" would be weakest-of-both since an attack on each one lets an attacker discover the shared password.

Ultimately, I think it's extremely unlikely that there's a class attack against hybrid constructions to the point of making it something that's fundamentally broken/undesirable for all use cases/all kinds of hybrids.

In the FAQ (I think) NIST gives one example where a hybrid is needed for "protocol reasons" (IKEv4 I think), but I don't really understand what "protocol reasons" means while we're changing a protocol to include PQ crypto. I think if they (or adjacent folks) knew about a class attack against that, they'd recommend redesigning those protocols entirely (and possibly breaking backward compat instead, which I assume could be the only "protocol reason".)

Not that any of this gives us more information as to their motivations :)

Personally, I'm leaning towards hybrids because I really see no good reason not to and I'd think it's better to guard against all sides. Yes, in general one needs to be careful about how one does that. Also, the overhead of the hybrid is pretty small (the data exchanged for ML-KEM is at least an order of magnitude bigger than for, say, ECDHE).

What does the NSA know?

Posted Dec 10, 2025 11:53 UTC (Wed) by farnz (subscriber, #17727) [Link]

The problem is that we don't know whether there's an underlying break that applies to all the current hybrid constructions, or whether the NSA has a set of 100 known tools that between them break all the known hybrid constructions.

And because the NSA is so hugely secretive, we have no way of knowing whether or not they've got a huge set of tools that break hybrid algorithms but not pure PQC or whether they're pushing for pure PQC because they've broken the pure PQC algorithms suggested, but not the hybrids.

Indeed, it's even possible that both are true, and we're screwed either way, with one bit of the NSA pushing for pure PQC because they can break all the hybrid options and want security, and another bit pushing for pure PQC because they've broken that and want insecurity.

Bernstein's Blog

Posted Dec 10, 2025 18:15 UTC (Wed) by brunowolff (guest, #71160) [Link]

It seems pretty clear that NSA's attitude toward publicly available encryption is that they should have ways to break it or work around it (by making correct implementations harder or having access to one of the end points), even if that includes risks of other bad actors also getting access.
It seems very unlikely they have any real interest in providing private citizens with communications secure against foreign adversaries. They do have an interest in protecting businesses' communications, including that with their customers.
They didn't need differential cryptanalysis for DES when the 56-bit key size was too small.
They also messed up with Dual EC, and some other actor used that infrastructure with different constants against Juniper routers.
We learned a lot about the NSA in 2013. That may or may not happen again before several decades go by.

Bernstein's Blog

Posted Dec 13, 2025 19:26 UTC (Sat) by marcH (subscriber, #57642) [Link] (5 responses)

> The NSA as an organisation is schizophrenic about... One part of the NSA is tasked with... while another part is tasked with...

You mean: "schizophrenic" like every other organisation made of many individuals?

> and we've seen examples of both (NSA insisted on ... but NSA also insisted on ...)

A huge part of this problem is: language. For instance, even when highlighting this precise issue, you keep using a singular "NSA" instead of the more accurate "some [other] NSA people", and use the adjective "schizophrenic" like it's a single person.

Language influences the way we think and this always struck me as an impressive example.

Bernstein's Blog

Posted Dec 14, 2025 16:43 UTC (Sun) by farnz (subscriber, #17727) [Link] (4 responses)

No, I mean that even one person doing everything the NSA is tasked with would face trouble resolving the inherent contradiction in its tasks; it's not that different people in the organisation have different priorities and interests, but rather that the NSA is supposed to both ensure that American businesses and government agencies can use unbreakable encryption no matter who they're communicating with, while also ensuring that non-American entities have no access to encryption the NSA can't break no matter who they're communicating with.

Even if the NSA was a singular person, that would be an impossible pair of missions to deliver on - how do you deliver encryption that's both broken by the NSA and unbreakable by anyone simultaneously to a non-American entity communicating with an American business or government agency?

Bernstein's Blog

Posted Dec 15, 2025 14:20 UTC (Mon) by paulj (subscriber, #341) [Link] (3 responses)

Standardising encryption that the NSA is confident only the NSA can break would be one way to meet that objective. Course, achieving that confidence in the face of an existence proof of a way to break an algorithm is... a tall order - but perhaps they have methods for that (e.g., judgement calls by analysing what systems other SIGINT agencies approve of/use for their governments and militaries; human intel from sister agencies; etc.).

Bernstein's Blog

Posted Dec 15, 2025 14:33 UTC (Mon) by farnz (subscriber, #17727) [Link] (2 responses)

The objective is that nobody (not even the NSA) can break it if both endpoints are USA entities, but only the NSA can break it if one or more entities using it is non-USA.

The only way to do that is to ban exports of encryption, with associated 1st Amendment concerns, so that the unbreakable encryption is only available to US entities, and to communicate with non-US entities you must use encryption the NSA is confident only the NSA can break.

Bernstein's Blog

Posted Dec 15, 2025 14:39 UTC (Mon) by paulj (subscriber, #341) [Link] (1 responses)

I think experience already shows this approach is impossible.

Bernstein's Blog

Posted Dec 15, 2025 15:08 UTC (Mon) by farnz (subscriber, #17727) [Link]

Indeed, but it's what the NSA is required to do - prevent non-US entities from communicating with encryption the NSA can break, while ensuring that US entities have access to encryption that cannot be broken at all, not even by the NSA, but only when communicating with other US entities.

This is an impossible task, and the NSA trying to do it is why it ends up completely untrustworthy - since you never know whether you're dealing with someone who's focused on the "non-US entities cannot communicate without us breaking their encryption", or whether you're dealing with someone who's focusing on "US entities must have access to unbreakable encryption".

And you'd still have that problem if the NSA was a single person - how do you know whether they're focusing on "non-US entities must not have encryption we cannot break" or "US entities must have encryption no-one can break"?

Bernstein's Blog

Posted Dec 9, 2025 21:31 UTC (Tue) by muase (subscriber, #178466) [Link] (6 responses)

I agree that there's nothing wrong with a combined scheme; IMO it's a very reasonable default.

However, I'm not too big of a fan of this argument:
> I cannot fathom the reasoning why anyone would want to standardize a pure PQC algorithm at this point in time. There has been _extensive_ cryptanalysis on both RSA and EC, and we can be quite confident that they will not be broken by classical computers during any of our lifetimes.

First, because this is true for every "new" algorithm – if we go for this logic, we would still use mandatory RSA+ECC combinations for every handshake. Or even better, RSA+P256+Curve25519. Or SHA2+SHA3+Blake2. Or AES-GCM+ChachaPoly(+HMAC for additional security).

I'm exaggerating here a bit for the sake of the argument; but IMO there's some truth in it, as with that logic, you will never evolve. And evolution and "blessing" of new algorithms is important here, or we'll end up with another SHA3 nobody uses – and nobody will analyze anymore in 5 or 10 years.

> While there has been a lot of cryptoanalysis done on PQC algorithms especially in the last years, they have not yet been vetted even remotely as well as classical algorithms

But when has there been enough cryptanalysis for normal use? NTRU has been around since 1996; that makes almost 30 years for lattice-based cryptography by now. LWE and LWE-class problems have been known since 2005, which makes 20 years. McEliece has been around since 1978 – that's close to 50(!) years by now, and seven years older than the earliest ECC suggestions.

None of the PQC winners are new – the PQC contest started in 2017, which is when the first iteration of Kyber (now ML-KEM) was submitted, aka 8 years ago. And it is important to point out that even back then, Kyber wasn't a new idea, but a rather straight evolution of 2005's LWE.

> Look at the amount of PQC algorithm candidates proposed by extremely knowledgeable and intelligent people that have since been broken to such an extent that they aren't an obstacle to even a classical computer

Yes, but that leaves us with two data points. First, _new_ mathematical problems are risky – that's true, but does not apply for lattices/LWE. Second, there have been tons of cryptanalysis – enough to find the problems with SIKE (probably more than for the entire TLS 1.3 construction).

That's one of the reasons why others are even talking about FUD here – ML-KEM has seen _extensive_ cryptanalysis over a long time, and the underlying mathematical problem is old enough to be considered well-known by today's standards. Additionally, due to the nature of the contest, those 7 years have resulted in a really immense amount of concentrated cryptanalysis and mathematical research, all done with today's knowledge and quality standards.

Does that guarantee that there isn't an undiscovered mathematical weakness? Definitely not. But it has seen enough analysis that we can consider it reasonably unlikely; so unlikely that – with today's knowledge – we can consider it secure for normal use. Like I said, a combined scheme is a reasonable default; but there's really no good objective/scientific reason to aggressively object to a fully optional, non-combined mode as happened here.

Bernstein's Blog

Posted Dec 9, 2025 22:36 UTC (Tue) by chris_se (subscriber, #99706) [Link] (5 responses)

> However, I'm not too big of a fan of this argument:
> > I cannot fathom the reasoning why anyone would want to standardize a pure PQC algorithm at this point in time. There has been _extensive_ cryptanalysis on both RSA and EC, and we can be quite confident that they will not be broken by classical computers during any of our lifetimes.
>
> First, because this is true for every "new" algorithm – if we go for this logic, we would still use mandatory RSA+ECC combinations for every handshake.

ECC isn't new, it's just a couple of years younger than RSA. But if you look at adoption, NIST recommended some ECC curves already back in 1999, but adoption of ECC only really started in the 2010s (yes, the first TLS standard with ECC was already in 2006, but it was extremely niche before ~2010), and OpenSSH only started supporting ECC in 2014, for example.

(And most (all?) CAs are still using RSA for their signatures nowadays btw. For example, LWN's own certificate is signed with RSA.)

> Or even better, RSA+P256+Curve25519. Or SHA2+SHA3+Blake2. Or AES-GCM+ChachaPoly(+HMAC for additional security).

Your examples don't really make that much sense for this discussion in my eyes. The reason one wants to use ECC over RSA is primarily performance and key size. Sure, there are some benefits of newer ECC constructions that make it harder to mess up an implementation (doing RSA right especially w.r.t. padding is highly non-trivial), but fundamentally the security guarantees by RSA and ECC are on the same level. (Albeit with vastly different key sizes.) Therefore a hybrid RSA + ECC doesn't help you, because you lose the benefits of ECC here for no real gain. So with RSA or ECC it was always either or - either you were conservative and used RSA, or you trusted ECC enough that you'd be willing to adopt it in the early days.

PQC algorithms are fundamentally different: ideally we want to switch to a scheme that _has_ to include a PQC algorithm as soon as possible, for forward secrecy. Classical-only schemes should disappear as fast as possible. And in that scenario having a hybrid scheme is much more sensible.

(Btw. even with ECC we did learn a lot after they were initially introduced. The first ECC scheme was ECDSA, which has the same possible fatal private key leak as DSA if the implementation is not done properly, and only later were there people who proposed better schemes that don't use a DSA-like linear combination.)

> I'm exaggerating here a bit for the sake of the argument; but IMO there's some truth in it, as with that logic, you will never evolve. And evolution and "blessing" of new algorithms is important here, or we'll end up with another SHA3 nobody uses – and nobody will analyze anymore in 5 or 10 years.

Your example contradicts you here: SHA3 has been blessed officially - and still has a very low adoption rate. (Though that is mostly inertia in my opinion, because there have been no new attacks discovered on SHA2, which is why there's no pressure to switch. Even worse, git is still in the process of migrating away from SHA1, and I see plenty of websites that still provide MD5 hashes of downloads. *shudder*)

> > While there has been a lot of cryptanalysis done on PQC algorithms especially in the last years, they have not yet been vetted even remotely as well as classical algorithms
> But when has there been enough cryptanalysis for normal use? NTRU has been around since 1996; that makes almost 30 years for lattice-based cryptography by now. LWE and LWE-class problems have been known since 2005, which makes 20 years. McEliece has been around since 1978 – that's close to 50(!) years by now, and seven years older than the earliest ECC suggestions.

I think this is misleading. Due to the fact that the number of bytes required for these proposed algorithms (both for the keys themselves as well as in transit) is so large, they were not even remotely analyzed as closely in that time as e.g. ECC. This has obviously changed since then, but the age alone is not a pure indicator. Most cryptanalysis on PQC algorithms has happened in the last 10 years.

> That's one of the reasons why others are even talking about FUD here – ML-KEM has seen _extensive_ cryptanalysis over a long time, and the underlying mathematical problem is old enough to be considered well-known by today's standards. Additionally, due to the nature of the contest, those 7 years have resulted in a really immense amount of concentrated cryptanalysis and mathematical research, all done with todays knowledge and quality standards.

And it's great that so much cryptanalysis has happened for the various PQC algorithms, and I'm very much in favor of pushing these algorithms forward. But RSA (and ECC though less so) is _much_ simpler than any PQC algorithm I've seen, so I don't think one should be comfortable _at this point in time_.

I do want those algorithms to become parts of standards (so that people are still motivated to analyze them going forward), and in 10 years time I could easily see the standards bodies revisiting this. (Or once a quantum computer exists that can break ECC, if that happens earlier.)

But we already have high-performance robust secure side-channel free ECC implementations available _right now_, and ECC keys are small, and computations are fast. Compared to the key sizes, computation requirements, and message sizes of any PQC algorithms, any ECC overhead is trivial. (I think a hybrid scheme with RSA would be not quite as clear-cut as a hybrid scheme with ECC.)

Bernstein's Blog

Posted Dec 9, 2025 23:42 UTC (Tue) by muase (subscriber, #178466) [Link] (4 responses)

> ECC isn't new, it's just a couple of years younger than RSA

I know; that was not so well worded on my side. My argument was: If we had applied the "you cannot trust these young algorithms"-logic consistently, therefore we make hybrid _mandatory_ without alternative, we would have ended up with all kinds of combined schemes that'd still pop up as zombies everywhere.

> Your examples don't really make that much sense for this discussion in my eyes [...]

I see your point, but ECC was also about improving security. For a long time, the main benefit of ECC was that you could easily upgrade to a security level that was impractical to achieve with RSA or DH (the speed race came later); and a hybrid scheme RSA1024+P256 would have been a significant security improvement compared to just RSA1024, and would still have provided the fallback in case ECC had been broken.

But the entire area was very different back then, so maybe you're right and it's not a good example^^

> This has obviously changed since then, but the age alone is not a pure indicator. Most cryptanalysis on PQC algorithms has happened in the last 10 years.

Yes and no. Yes, time is not a pure indicator, but that works in both directions. Cryptanalysis has gotten so much better (in methods and quality) that this is not really comparable. Within the last few years we learned more about the PQC algorithms than what we learned about older algorithms in decades, simply because the field has evolved at a crazy pace (and because of the internet, and knowledge accumulation, and much better tooling and formal models and proofs, etc.). It's safe to say that we know much more about ML-KEM now than we did about AES or ECC when both became adopted.

I mainly included the argument because everyone is always talking about "young" and super new algorithms and stuff; and I wanted to oppose that a bit. The algorithms and math are not as young as many people think, and also not as green re cryptanalysis as many people seem to believe.

> PQC algorithms are fundamentally different: ideally we want to switch to an scheme that _has_ to include a PQC algorithm as soon as possible, for forward secrecy. Classical-only schemes should disappear as fast as possible. And in that scenario having a hybrid scheme is much more sensible.

Here we are in full agreement I think; like I said, IMO a combined scheme is a very reasonable default. I'm totally not opposing a combined scheme, I just don't think it makes sense to oppose an additional and _optional_ PQ-only ciphersuite either.

It's simply not an either-or; and – here we might be in disagreement? – I think ML-KEM is definitely mature enough to deserve its own dedicated cipher suite. Let's call it "experimental" or "special interest" – but I think we should define it, before others come along with proprietary schemes and extensions or custom incompatible suites etc. Nobody needs that^^

Bernstein's Blog

Posted Dec 10, 2025 9:47 UTC (Wed) by chris_se (subscriber, #99706) [Link] (3 responses)

> > ECC isn't new, it's just a couple of years younger than RSA
>
> I know; that was not so well worded on my side. My argument was: If we had applied the "you cannot trust these young algorithms"-logic consistently, therefore we make hybrid _mandatory_ without alternative, we would have ended up with all kinds of combined schemes that'd still pop up as zombies everywhere.

I think things were a lot different back then. The main difference is that we've learned that the "the more the merrier" approach to standardizing cipher suites is actually detrimental to security.

But I also disagree with you here: I personally wouldn't have thought it a bad idea if the first introduction of ECC in standards in ~2005 had been a hybrid scheme. In ~2015, different story.

> I see your point, but ECC was also about improving security. For a long time, the main benefit of ECC was that you could easily upgrade to a security level that was impractical to achieve with RSA or DH (the speed race came later);

"impractical" == too slow you mean? Still counts as performance in my eyes.

> It's safe to say that we know much more about ML-KEM now than we did about AES or ECC when both became adopted.

That still doesn't touch my argument that all PQC schemes I've looked at so far are more complicated in their construction than RSA and even ECC.

> It's simply not an either-or; and – here we might be in disagreement? – I think ML-KEM is definitely mature enough to deserve its own dedicated cipher suite. Let's call it "experimental" or "special interest" – but I think we should define it, before others come along with proprietary schemes and extensions or custom incompatible suites etc. Nobody needs that^^

As I've said in another part of this thread: I'm not 100% opposed to standardizing ML-KEM alone to avoid proprietary messes, if it's clearly marked as "do not use unless you really really know what you are doing". But I don't think it's been analyzed enough (due to its complexity) that I'd be comfortable in making this just an optional thing with even just a warning - I'd want this to be actively discouraged and the standard should indicate that it must be disabled by default unless configured otherwise. Give it another 10 years, and more experience in the field with it (especially when it also comes to side channels), and assuming nobody's broken it by then, I'd be happy to go ML-KEM only.

Bernstein's Blog

Posted Dec 11, 2025 0:36 UTC (Thu) by rgmoore (✭ supporter ✭, #75) [Link]

> I'd want this to be actively discouraged and the standard should indicate that it must be disabled by default unless configured otherwise.

It also needs some kind of protection against downgrade attacks. If you're going to officially discourage its use, there had better not be a way for an attacker to force people to use it in place of more trustworthy algorithms.

Bernstein's Blog

Posted Dec 11, 2025 12:20 UTC (Thu) by kleptog (subscriber, #1183) [Link] (1 responses)

> But I don't think it's been analyzed enough (due to its complexity)

I think the perceived complexity is related to unfamiliarity. ECC relies on fancy properties of groups, which we know are not PQ safe but are well known in the crypto-community. ML-KEM relies on some linear algebra and probability theory which to me sounds a lot less magic than ECC. Linear algebra and probability theory are some of the most studied areas of mathematics due to their ubiquitous use everywhere.
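The "linear algebra and probability theory" behind LWE can be shown in a toy form. This is a hedged, deliberately insecure sketch of the plain-LWE problem (the parameters and the noise distribution here are illustrative assumptions; ML-KEM actually works over module lattices of polynomial rings, not plain integer vectors):

```python
# Toy LWE instance: given (A, b = A*s + e mod q) with small noise e,
# recovering s is believed hard at cryptographic sizes.
# WARNING: parameters are tiny and insecure; for illustration only.
import random

q = 3329   # ML-KEM's modulus, borrowed for flavor
n = 8      # toy dimension (real schemes use degree-256 polynomials)

random.seed(0)
s = [random.randrange(q) for _ in range(n)]                # secret vector
A = [[random.randrange(q) for _ in range(n)] for _ in range(n)]
e = [random.choice([-2, -1, 0, 1, 2]) for _ in range(n)]   # small noise

# The public sample: b = A*s + e (mod q).
b = [(sum(A[i][j] * s[j] for j in range(n)) + e[i]) % q for i in range(n)]

# Without the noise e, this is plain linear algebra: Gaussian elimination
# over Z_q would recover s instantly. The small error terms are exactly
# what turns an easy system of equations into a hard lattice problem.
```

The design choice worth noticing is how little machinery is involved: a matrix-vector product and a narrow error distribution, which is the "less magic than ECC" point above.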

> that I'd be comfortable in making this just an optional thing

But whose comfort should we be listening to?

FWIW, I think standardizing a pure-PQ algorithm is a good idea because then we can move on to the next phase, namely algorithm implementations. Even though we've been using ECC for ages, making side-channel-free implementations is still hard and we need to get the ball rolling on that now, not wait another ten years. It'll probably be at least ten years before any implementation is sufficiently available that people can even think about using it for public sites.

Bernstein's Blog

Posted Dec 11, 2025 12:46 UTC (Thu) by brunowolff (guest, #71160) [Link]

Lattices could use some more study. SIKE went from people thinking it was fine to completely broken not too long ago. There isn't a reason to take that risk now.

Implementors don't need a PQ-only version of ML-KEM in the TLS standard to start implementing ML-KEM. There are already implementations. Also, timing attacks are taken a lot more seriously now than when AES came about. People are already doing constant-time implementations and checking them. In fact there was a screw-up, KyberSlash, where a division used secret data; it was found and corrected not too long ago. There are libraries to help people get this correct on different hardware and compilers.

Bernstein's Blog

Posted Dec 9, 2025 19:17 UTC (Tue) by ballombe (subscriber, #9523) [Link]

... and hopefully take the comments by anybody working for the NSA and GCHQ with a pound of salt?
Remove their posts from the list and look at what is remaining.

WG last call ended without consensus to publish?

Posted Dec 8, 2025 19:47 UTC (Mon) by geofft (subscriber, #59789) [Link] (1 responses)

If I'm reading right, https://mailarchive.ietf.org/arch/msg/tls/Gc6KVPrVHn-QCke... from less than a day ago is one of the other WG chairs declaring that there is NOT consensus to publish the WG item as an Internet-Draft?

Also, I'm a little bit confused by terminology but I'm not totally following the process outlined in this argument. I think the consensus call in March was whether to adopt the submitted document from the individual authors as a WG item, which is what Bernstein objected to and appealed unsuccessfully. The current consensus call is whether to send the document on to the IESG for standardization. Is that correct? When the article says "the working group has adopted the draft," is that referring to the March adoption that was just confirmed on appeal?

WG last call ended without consensus to publish?

Posted Dec 8, 2025 21:00 UTC (Mon) by ballombe (subscriber, #9523) [Link]

There were three options:
1. Publish as is
2. Publish with text encouraging hybrid crypto
3. Do not publish

The chairs decided that option 2 has consensus (which requires the document to be updated and resubmitted).

In any case, thanks Daroc for having written this article.

Hybrid should be required

Posted Dec 9, 2025 19:31 UTC (Tue) by david.a.wheeler (subscriber, #72896) [Link] (3 responses)

At this point in time, I think hybrid should be the only allowed approach. PQC algorithms are far less mature, as evidenced by SIKE. Hybrid is the only way to ensure we aren't making things worse by using new, less mature algorithms.

Hybrid should be required

Posted Dec 9, 2025 23:32 UTC (Tue) by hailfinger (subscriber, #76962) [Link] (2 responses)

Does that mean you want to disallow pure-RSA and pure-ECC as well?

Hybrid should be required

Posted Dec 10, 2025 1:16 UTC (Wed) by brunowolff (guest, #71160) [Link] (1 responses)

There are actually use cases for RSA- and ECC-only. They require enough fewer resources than PQ algorithms that it might matter. For systems with secrets that expire quickly, PQ protection may not be important, since as far as we know there aren't currently any quantum machines that can break the key sizes in use for RSA and ECC. The converse isn't true, as there is very little extra relative cost to adding RSA or ECC to a PQ algorithm, and there is significant safety added by doing so.

Hybrid should be required

Posted Dec 10, 2025 11:52 UTC (Wed) by hkario (subscriber, #94864) [Link]

ML-KEM-768 is actually faster than X25519, ECDH with P-256, not to mention FFDH 2048...

It's "less risky", not "more secure"

Posted Dec 10, 2025 19:12 UTC (Wed) by squarooticus (subscriber, #105300) [Link]

The characterization that "the hybrid approach [was] more secure" is not quite right, because we don't know how secure ML-KEM or other post-quantum key exchanges are. What it is, rather, is *less risky*: over any given timespan, there is less of a risk of complete breakage (from some novel attack on a post-quantum algorithm), because of the presumption that ECDH will not be broken by a practical quantum computer anytime soon, and that even when it is, it will be very expensive for some period to actually mount such an attack.


Copyright © 2025, Eklektix, Inc.
This article may be redistributed under the terms of the Creative Commons CC BY-SA 4.0 license
Comments and public postings are copyrighted by their creators.
Linux is a registered trademark of Linus Torvalds