
Verified cryptography for Firefox 57

The Mozilla Security Blog announces that Firefox 57 will benefit from the addition of a formally verified crypto package. "The first result of this collaboration, an implementation of the Curve25519 key establishment algorithm (RFC7748), has just landed in Firefox Nightly. Curve25519 is widely used for key-exchange in TLS, and was recently standardized by the IETF. As an additional bonus, besides being formally verified, the HACL* Curve25519 implementation is also almost 20% faster on 64 bit platforms than the existing NSS implementation (19500 scalar multiplications per second instead of 15100) which represents an improvement in both security and performance to our users."


Verified cryptography for Firefox 57

Posted Sep 14, 2017 15:30 UTC (Thu) by ms (subscriber, #41272) [Link] (34 responses)

I don't think I've ever whinged about a title before on LWN (or actually much at all on LWN), but I would suggest this is close to being misleading. The code has been verified to be free of large classes of bugs. But I worry the title can suggest the cryptography has been "verified as unbreakable". Maybe it's just me.

Verified cryptography for Firefox 57

Posted Sep 14, 2017 15:55 UTC (Thu) by rvfh (guest, #31018) [Link] (1 responses)

As usual on LWN, the title is the original article title...

parenthetical correctness

Posted Sep 28, 2017 5:04 UTC (Thu) by Garak (guest, #99377) [Link]

and if the title had read

Verified cryptography for Firefox 57 (mozilla security blog)

then presumably ms and I would have had different reactions. Something akin to, e.g.

Firefox takes a Quantum leap forward with new developer edition (ars technica)

I.e. if it weren't for the "(ars technica)" there, I would be disturbed by the verbiage as an LWN headline choice. Sigh.

Verified cryptography for Firefox 57

Posted Sep 14, 2017 15:56 UTC (Thu) by ledow (guest, #11753) [Link] (15 responses)

Not really.

You can't ever verify that it's unbreakable. That's like asking: "can we guarantee that nobody will invent, in the next decade, new maths that we currently know nothing of?"

Code verification, however, just verifies that the code does what it says it would do and doesn't get caught in the traps available.

To be honest, it doesn't even mean that the code doesn't leak memory, or pass out encrypted data to the world, or have a backdoor for the NSA. Unless the specification it was "verified" against says that too.

The algorithm is mathematically verified to be converted to code correctly. That's about the best way you can put it. And you can't stick that in a headline.

The algorithm could still be wrong. Or processors could still mess up certain assumptions (e.g. that people can't sniff the key mid-operation), etc.

To be honest, given the expense of verifying code like this, I'd be happier if they just put that effort/money into verifying ECC itself, rather than a particular implementation of it.

Verified cryptography for Firefox 57

Posted Sep 14, 2017 16:11 UTC (Thu) by ms (subscriber, #41272) [Link] (1 responses)

> The algorithm is mathematically verified to be converted to code correctly. That's about the best way you can put it. And you can't stick that in a headline.

You totally could - especially on LWN :)

Verified cryptography for Firefox 57

Posted Sep 16, 2017 17:05 UTC (Sat) by dahveed311 (subscriber, #103741) [Link]

Agreed. I come here because this isn't buzzfeed.

Verified cryptography for Firefox 57

Posted Sep 14, 2017 20:41 UTC (Thu) by HelloWorld (guest, #56129) [Link] (12 responses)

> You can't ever verify that it's unbreakable. That's like asking "we guarantee that someone can't invent new maths in the next decade that we currently know nothing of?"
An encryption algorithm can be considered unbreakable by conventional (that is, non-quantum) computers if there is no algorithm that can decrypt an encrypted text in less than exponential time on average without knowing the key. I can't see why proving that would be any less possible than proving that there is no sorting algorithm faster than O(n*log(n)), and this has in fact been proven. Is it difficult? Sure, it might require completely new approaches to complexity theory. But saying it's impossible is tantamount to saying that nobody can ever invent this new math that we currently know nothing of. Oh, the irony…
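For reference, the O(n*log(n)) bound mentioned here applies to comparison sorts and follows from a short counting argument, sketched below:

```latex
% A deterministic comparison sort is a binary decision tree:
% h comparisons reach at most 2^h leaves, and the tree must
% distinguish all n! possible input orderings.
2^h \ge n! \quad\Longrightarrow\quad
h \ge \log_2 n! = n\log_2 n - n\log_2 e + O(\log n) = \Theta(n \log n)
```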

Verified cryptography for Firefox 57

Posted Sep 15, 2017 19:08 UTC (Fri) by simcop2387 (subscriber, #101710) [Link] (3 responses)

It's not really about exponential time, or the complexity; it's that you can't do the decryption without having to try all of the possible keys. DES (single DES) was vulnerable, before some changes to it by NSA researchers, due to badly chosen parameters that left it open to differential cryptanalysis. An attack would still take time exponential in the key size, but you would no longer have to try all possible keys.

Verified cryptography for Firefox 57

Posted Sep 16, 2017 16:42 UTC (Sat) by HelloWorld (guest, #56129) [Link] (2 responses)

> It's not really about exponential time, or the complexity. Just that you can't do the decryption without having to try all of the possible keys.
Those boil down to the same thing. Do you actually know what exponential complexity means in this context?

Verified cryptography for Firefox 57

Posted Sep 16, 2017 18:22 UTC (Sat) by mpr22 (subscriber, #60784) [Link] (1 responses)

For an N-bit key, a cryptanalytical attack that takes K * 2^(ceil(N/64)) operations where K << 2^64 is still, in the strict mathematical sense, exponential in N... but N has to be pretty damn big before you're safe from that attack.
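To put rough numbers on this hypothetical attack (both K and the 64 in the exponent are illustrative parameters from the comment above, not a real cryptanalytic result), a quick back-of-the-envelope calculation:

```python
import math

def attack_work_bits(n_bits, log2_k=60):
    """log2 of the K * 2^ceil(N/64) operation count for the
    hypothetical attack, with K just under 2^64 (2^60 here)."""
    return log2_k + math.ceil(n_bits / 64)

# A 256-bit key falls to such an attack (~2^64 operations),
# while an 8192-bit key does not (~2^188 operations), even
# though both costs are "exponential in N" in the strict sense.
```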

Verified cryptography for Firefox 57

Posted Sep 16, 2017 19:48 UTC (Sat) by HelloWorld (guest, #56129) [Link]

Right, point taken.

sorting

Posted Sep 16, 2017 6:03 UTC (Sat) by ncm (guest, #165) [Link] (7 responses)

Radix sort is O(n).

It is sometimes possible to make proofs about the existence of algorithms, but all too often the proof relies on hidden assumptions that your opponent is happy to violate. Even though the Halting Problem is provably insoluble, we routinely prove that particular programs reliably terminate.

All there is to say whenever somebody does what you have just proven impossible is "huh", an expression universal to all languages, even those not used to express proofs.

A proof that a library function really satisfies a spec despite all the peculiar properties of what programming languages call integers is a notable achievement. A proof that the spec has all the properties claimed for it would be another.
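The O(n) claim above refers to non-comparison sorts; a minimal LSD radix sort over fixed-width non-negative integers, sketched in Python for illustration (the linear bound holds when the key width is a constant):

```python
def radix_sort(values, key_bits=32, radix_bits=8):
    """LSD radix sort: O(n * key_bits/radix_bits) passes, no comparisons
    between keys, so the n*log(n) lower bound does not apply."""
    mask = (1 << radix_bits) - 1
    for shift in range(0, key_bits, radix_bits):
        # Distribute into buckets by the current digit; order within a
        # bucket is preserved, which keeps the sort stable across passes.
        buckets = [[] for _ in range(1 << radix_bits)]
        for v in values:
            buckets[(v >> shift) & mask].append(v)
        values = [v for bucket in buckets for v in bucket]
    return values
```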

sorting

Posted Sep 16, 2017 8:55 UTC (Sat) by HelloWorld (guest, #56129) [Link] (5 responses)

> Radix sort is O(n)
Yes, the O(n * log(n)) result applies to comparison-based algorithms, which radix sort is not. This is well-known and not a real counter-argument to my point, because you can avoid the problem by simply not making that sort of assumption.

> Despite that the Halting Problem is proven insoluble, we routinely prove that particular programs reliably terminate.
It is proven that there is no program that can reliably determine for all programs whether or not they're going to halt. This is very different from saying that there is no program that can reliably determine for any programs whether or not they're going to halt. The fact that we can prove termination for *some* programs doesn't say anything about the halting problem, and that you seem to think it does tells more about your math education (or lack thereof) than anything else.

sorting

Posted Sep 17, 2017 4:36 UTC (Sun) by ncm (guest, #165) [Link] (4 responses)

I see that you miss the point.

sorting

Posted Sep 17, 2017 13:11 UTC (Sun) by HelloWorld (guest, #56129) [Link] (3 responses)

It's not that I miss the point, it's that there's no point to miss.

sorting

Posted Sep 17, 2017 22:21 UTC (Sun) by matthias (subscriber, #94967) [Link] (1 responses)

ncm's point was that you always have to make assumptions in order to prove anything. For the lower bound on sorting, you have to assume comparison-based sorting. For proving lower bounds on cracking cryptography you need other assumptions; e.g., most of complexity theory is built on the assumption that all sensible models of computation can simulate each other with polynomial overhead, an assumption that is not necessarily true for quantum computers.

It is very unlikely that a proof that some encryption scheme is secure can be made without any assumptions. It works for the one-time pad, where information theory itself says that the encryption cannot be broken. But that is a very special case. The BB84 quantum cryptography protocol was proven secure; however, the proof assumes that our understanding of quantum mechanics is correct. I do not see how such a proof could be done without that assumption.

Somewhere below, you give the example that there is no rational solution for x^2=2. Of course mathematicians can disprove the existence of some well defined mathematical object. The point is that breaking a cryptographic scheme is not that well defined. Once you establish a precise mathematical definition, this definition will imply certain assumptions, e.g., model of computation.

I want to mention two more very prominent assumptions that no one has proven, but that are fundamental to all complexity theory:
1. The Church-Turing Thesis stating that the functions computable in our universe are exactly the Turing computable functions.
2. Consistency of ZFC (fundamental to maths in general, cannot be proven in ZFC).

You should not misunderstand me: I am not saying that proofs are worthless. I did my PhD in theoretical computer science, so I know the value of proofs. A formal proof that ECC is unbreakable (by Turing machines) would be worth a lot (especially as it would imply P != NP). However, if one of the two assumptions above turned out to be false, someone could come up with different maths (this could be what ledow meant). I believe both assumptions, but I do not believe that we can ever prove them to be true.

sorting

Posted Sep 19, 2017 2:32 UTC (Tue) by HelloWorld (guest, #56129) [Link]

Right, I actually explicitly excluded quantum computers in my original post. Thanks for your post, it was very insightful.

sorting

Posted Sep 18, 2017 16:11 UTC (Mon) by ncm (guest, #165) [Link]

Aggressively missing the point is not an improvement over idly or negligently missing the point.

sorting

Posted Sep 16, 2017 9:30 UTC (Sat) by farnz (subscriber, #17727) [Link]

Note that with the Halting Problem, the get-out is that you can divide programs into three piles, but not two:

  1. Programs that definitely halt.
  2. Programs that definitely do not halt.
  3. Programs that might or might not halt.

The trick to proving termination in real-world situations is twofold; first, minimise the number of programs you put into category 3. Then, declare that all programs in category 3 are unknowable, and it's an error case to deal with one of them - for many practical uses of the halting problem, this amounts to declaring that programs in category three do not halt.

Verified cryptography for Firefox 57

Posted Sep 14, 2017 16:19 UTC (Thu) by xtifr (guest, #143) [Link] (3 responses)

"Verified" seems like a pretty odd word to use for that even if you're ignorant enough to believe that such a thing is possible. (Which I suspect that very few LWN readers are.) My misreading was that they've finally confirmed that Mozilla has cryptography <em>at all</em>. :)

Verified cryptography for Firefox 57

Posted Sep 14, 2017 18:34 UTC (Thu) by riddochc (guest, #43) [Link]

That was my reaction too. "So the previous version wasn't verified to have cryptography? Good thing someone checked, then, just in case."

Verified cryptography for Firefox 57

Posted Sep 14, 2017 20:55 UTC (Thu) by HelloWorld (guest, #56129) [Link]

> "Verified" seems like a pretty odd word to use for that even if you're ignorant enough to believe that such a thing is possible. (Which I suspect that very few LWN readers are.)
I don't see why it wouldn't be possible. An encryption algorithm can be considered unbreakable if there is no algorithm that can decrypt an encrypted text in less than exponential time on average without knowing the key. And I don't know of any reason in principle why such a property would be impossible to prove. Mathematicians prove the non-existence of things all the time. Euclid did it more than 2000 years ago when he proved that there are no natural numbers p and q such that (p/q)^2 = 2.

Anyway, I don't think you should be calling people ignorant in cases like this.
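The descent argument alluded to above fits in a few lines:

```latex
% Suppose p/q is in lowest terms with (p/q)^2 = 2. Then:
p^2 = 2q^2 \;\Rightarrow\; 2 \mid p,\ \text{say } p = 2r
\;\Rightarrow\; 4r^2 = 2q^2 \;\Rightarrow\; q^2 = 2r^2 \;\Rightarrow\; 2 \mid q,
% so both p and q are even, contradicting the assumption
% that p/q was in lowest terms.
```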

Verified cryptography for Firefox 57

Posted Sep 21, 2017 17:09 UTC (Thu) by smitty_one_each (subscriber, #28989) [Link]

The scare quotes make the point effectively.

Verified cryptography for Firefox 57

Posted Sep 14, 2017 18:01 UTC (Thu) by flussence (guest, #85566) [Link] (4 responses)

It wouldn't really make a difference if it *was* mathematically unbreakable, since the code's never been the weak link in the chain. The pathetically broken CA system has always been the problem here.

Verified cryptography for Firefox 57

Posted Sep 14, 2017 18:46 UTC (Thu) by derobert (subscriber, #89569) [Link] (3 responses)

Vulnerabilities like Heartbleed show that the code can be a problem too. (Granted, that was OpenSSL, not NSS.)

Verified cryptography for Firefox 57

Posted Sep 15, 2017 11:34 UTC (Fri) by hmh (subscriber, #3838) [Link] (2 responses)

Side channels are also a real and very troublesome issue when implementing crypto. One thing even gnupg doesn't get fully correct yet, for example, is fully constant-time bignumber math that is power- and timing-invariant relative to the data being processed (it is mostly acceptable to vary with the data type's size, though, since that is assumed to be known to the adversary in every crypto algorithm design).

And code full of side channels can still be perfectly correct; you need some rather extreme constraints to get noise, power, time, cache, and even execution-unit/performance-counter side-channel protections... for example, not only do two multiplications always need to take the same time, they must always do the same number of muls/adds/etc. (time, power) *in the same sequence* (power, RF noise), with the same memory access pattern (cache)...

Do note that gnupg (and openssl, etc.) have all sorts of defenses to mitigate known side channels that have been proved exploitable, but they're not perfect. This is one reason why I mostly assume that half of the secret key bits are being leaked somehow (and thus use twice the currently recommended size at the very least).
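The "same sequence of operations" discipline described above is commonly achieved with branchless selection. A sketch of the conditional-swap pattern used in Montgomery-ladder implementations of Curve25519 (RFC 7748 calls it cswap); note that Python integers are not actually constant-time, so this only illustrates the shape of the technique as it would appear in native code:

```python
def ct_cswap(swap, a, b, width=255):
    """Conditionally swap a and b when swap == 1, using only masking.
    The same XOR/AND operations execute in the same sequence whether
    or not the swap happens, so there is no branch on the secret bit."""
    mask = -swap & ((1 << width) - 1)  # all-ones if swap == 1, else zero
    t = mask & (a ^ b)
    return a ^ t, b ^ t
```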

Verified cryptography for Firefox 57

Posted Sep 17, 2017 4:38 UTC (Sun) by biergaizi (guest, #92498) [Link] (1 responses)

The Curve25519 ECC algorithm is side-channel resistant by design: everything is constant time, with no branches in the code path. There may be other side channels, but the major ones are closed by design. ChaCha20, a stream cipher also designed by Daniel J. Bernstein, is side-channel resistant as well. If the code is verified, I assume the implementation is as secure as the design.

In contrast, it's nearly impossible to implement AES without leaking timing information, unless the algorithm is implemented in hardware. The design of CBC mode in SSL/TLS is the worst: first there was the Lucky13 attack exploiting timings, and the fix was constant-time code, e.g. no "if" statement allowed, logic implemented with XOR/AND/NOT, programming as if in Brainf**k. Unfortunately the fix itself caused another timing leak, the padding oracle of CVE-2016-2107, discovered only years later.

It's 2017, and one can argue that if someone still designs algorithms and cryptosystems that are secure on paper but practically impossible to implement on physical computers without leaking timings, then the design itself is not good in the first place.

Immature fields can be an exception, like post-quantum cryptography. Unfortunately, we waited 15 years for an algorithm design that solves practical issues, and now a new wave of "secure on paper, extremely difficult to implement on physical computers" algorithms is coming, inevitably...

Verified cryptography for Firefox 57

Posted Sep 24, 2017 3:09 UTC (Sun) by tterribe (guest, #66972) [Link]

As important as (if not more important than) branches are memory access patterns. I think this is the source of trouble in most problematic AES implementations. See Section 5.1 of RFC 7748.

An excellent property for side-channel-resistant implementations to have is that all memory accesses are completely deterministic, i.e., the locations that are read from or written to do not depend on the secret key or the plaintext.
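That deterministic-access property is often achieved by reading the whole table and selecting the wanted entry with masks. A toy sketch (again, Python is only illustrative; real implementations do this in constant-time native code):

```python
def ct_select(table, secret_index):
    """Select table[secret_index] while touching every entry, so the
    memory access pattern does not depend on the secret index."""
    result = 0
    for i, entry in enumerate(table):
        # mask is all-ones exactly when i == secret_index, computed
        # arithmetically rather than by branching on the secret
        # (Python's arithmetic right shift makes (0 - 1) >> 256 == -1)
        mask = ((i ^ secret_index) - 1) >> 256
        result |= entry & mask
    return result
```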

Verified cryptography for Firefox 57

Posted Sep 14, 2017 22:42 UTC (Thu) by roc (subscriber, #30627) [Link] (5 responses)

It's not just verified to be free of large classes of bugs, but it's verified to be a correct implementation (of Curve25519 in this case), i.e. it produces the correct answers for all inputs.

People who are interested should read about the overarching project: https://project-everest.github.io/

"Verified as unbreakable" is not yet present in this case, but it's a realistic goal for the future, for the right definition of the term. We are able to verify that various cryptographic protocols are "unbreakable", in the sense that if the endpoints implement those protocols correctly and certain algorithmic hardness results hold, then an adversary who can only observe and manipulate what's on the wire learns nothing about certain data (e.g. the contents of packets). Work like Everest can prove the correctness of the endpoint implementations. The hardness assumptions can only be defeated by an adversary who makes mathematical breakthroughs --- we're talking Fields Medal-worthy work --- or if you use too-small key sizes. Against verified crypto, plausible attacks will be either direct attacks against endpoints, or attacks that learn something other than what the protocol was protecting (e.g. packet timing instead of packet contents).

finally someone who knows what he's talking about

Posted Sep 14, 2017 23:20 UTC (Thu) by HelloWorld (guest, #56129) [Link] (1 responses)

Thank you for this interesting comment!

finally someone who knows what he's talking about

Posted Sep 15, 2017 8:35 UTC (Fri) by Wol (subscriber, #4433) [Link]

This is what I moan about when I say Science isn't Maths.

The whole point of a proof in maths is that it shows your assumptions are self-consistent.

The whole point of a proof in science is that it shows your assumptions are in line with reality.

And a lot of the comments here show that people have difficulty grasping the difference (yes, that seems to be the norm :-(

That's why, when you're presenting a proof like this, it's important to state all your assumptions (like the grandparent post does), but it's also why people's eyes glaze over, because it's all maths and so many people seem to be scared of maths :-(

Cheers,
Wol

Verified cryptography for Firefox 57

Posted Sep 15, 2017 11:07 UTC (Fri) by ballombe (subscriber, #9523) [Link] (2 responses)

There is a big asymmetry here:
If you prove that ECDL cannot be solved in polynomial time in general, you will probably get the Fields Medal if you are young enough. However, if you prove that ECDL can be solved in subexponential time, chances are you will not be awarded the Fields Medal.

Why? Because there is precedent for breaking similar algorithms, while there is no precedent for such negative proofs
(this is an instance of the P=NP problem).

The truth is that a large number of mathematicians expect ECDL to be solvable in subexponential time, so they are unlikely to work toward proving the opposite.

Verified cryptography for Firefox 57

Posted Sep 15, 2017 13:34 UTC (Fri) by Wol (subscriber, #4433) [Link] (1 responses)

And how many of these things turn out to be solved almost by accident?

I think it was Fermat's, but I'm sure there have been several, where some mathematician presented a proof of something completely unrelated and then added, almost in a footnote, "by the way, Fermat's is a trivial subset of this proof".

I think everyone knew a big announcement was coming, but really didn't expect a Fermat proof to be announced and dismissed in such a trivial manner!

Cheers,
Wol

Verified cryptography for Firefox 57

Posted Sep 19, 2017 14:09 UTC (Tue) by gowen (guest, #23914) [Link]

> I think it was Fermat's, but I'm sure there have been several, where some mathematician presented a proof of something completely unrelated and then added, almost in a footnote, "by the way, Fermat's is a trivial subset of this proof".

This was Wiles and Fermat, but it's worth noting that reducing FLT to a footnote was a jokey way for Wiles to bury the lede. He knew what the big story was, and everyone in the room (basically the world's foremost minds on modular forms) knew the best-known implication of the Taniyama-Shimura Conjecture, and hence where Wiles was likely to end up, well before that point. FWIW, I believe the lectures ended "Corollary: Fermat's Last Theorem".

Verified cryptography for Firefox 57

Posted Sep 15, 2017 10:51 UTC (Fri) by BillyBob (guest, #118071) [Link]

I'd argue that most LWN readers understand what 'formal verification' is.


Copyright © 2017, Eklektix, Inc.
Comments and public postings are copyrighted by their creators.
Linux is a registered trademark of Linus Torvalds