
BruCON: Can we trust cryptography?

September 30, 2009

This article was contributed by Koen Vervloesem

On September 18 and 19, the community-driven conference BruCON made its first appearance in Brussels. BruCON is organized by a small group of Belgians working in the security industry who wanted to create a security conference with room for independent research and without a commercial undertone. As one of the organizers, Benny Ketelslegers, said at the beginning of the conference: "We have a lot of security researchers in Belgium, but we didn't have a conference here that suits our needs."

For a Belgian conference, there could hardly be a better speaker for the first lecture than Vincent Rijmen, a Belgian cryptographer and one of the designers of the Advanced Encryption Standard (AES); he currently works as an associate professor at the University of Leuven. His talk, Trusted cryptography [PDF], discussed the growing doubts about the trust we can place in cryptology and its applications. His point: today, we cannot trust cryptography.

Rijmen took the audience back to where cryptography started. In the Roman days, we had the Caesar cipher, which is a simple substitution cipher in which each letter in the plain text is replaced by a letter some fixed number of positions down the alphabet. This encryption method is named after Julius Caesar, who used it with a shift of three positions to communicate with his generals. Polyalphabetic substitution ciphers, like Vigenère or the Enigma machine used by Germany in World War II, were also driven by military requirements.
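The shift-by-three scheme Caesar used can be sketched in a few lines of Python — a toy illustration of the substitution idea, not anything resembling a secure cipher by modern standards:

```python
def caesar(text: str, shift: int) -> str:
    """Shift each letter by `shift` positions, wrapping around the alphabet."""
    result = []
    for ch in text:
        if ch.isalpha():
            base = ord("A") if ch.isupper() else ord("a")
            # Wrap around with modulo 26; a negative shift decrypts.
            result.append(chr((ord(ch) - base + shift) % 26 + base))
        else:
            result.append(ch)  # leave spaces and punctuation untouched
    return "".join(result)
```

With Caesar's shift of three, "ATTACK AT DAWN" becomes "DWWDFN DW GDZQ", and applying a shift of minus three restores the original — which also hints at why the scheme is trivially breakable: there are only 25 possible keys to try.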

So cryptography in the past was used solely for military purposes, Rijmen explained, but there is more to it than that: "Encryption was used between trusted parties, with secure perimeters. This is much different with our current use of encryption: think about our bank cards, where crackers can even measure the power consumption of the smartcard chip to try to crack the encryption."

The shift to the public

This shift began in the 1970s and 1980s, for example with the concept of public-key cryptography introduced by Whitfield Diffie and Martin Hellman, and RSA being invented by Ronald Rivest, Adi Shamir and Leonard Adleman. As a result of these technical breakthroughs, Rijmen maintains that cryptography finally entered the public world:

Suddenly cryptography was used for non-military purposes, such as secure communications between citizens, PKI (Public Key Infrastructure) and key agreement, digital signatures, blind signatures, digital cash, etcetera. Cryptography was even used as a means against nuclear arms: the American cryptographer Gustavus Simmons designed a protocol to verify adherence to the Comprehensive Test Ban Treaty for nuclear weapons, based on sensor data that could not be trusted completely. All this was a real cryptologic revolution.

This is how we have come to the current situation, where modern communication networks are all based on cryptography. We have the A5/1 stream cipher in the GSM cellular telephone standard, the KeeLoq block cipher used in the majority of remote car keys, and the Wired Equivalent Privacy (WEP) algorithm, which was the first attempt to secure IEEE 802.11 wireless networks. It is no coincidence that Rijmen named these protocols: they are all broken. The A5/1 design was initially kept secret, but after leaks and reverse engineering, several serious weaknesses have been identified. KeeLoq was cryptanalyzed with much success in recent years. And WEP, which uses the RC4 stream cipher, can be broken in minutes with off-the-shelf hardware and open source software such as Aircrack-ng. "RC4 is a good protocol, but it is used incorrectly in WEP. If one of my students came up with such a design, he should redo my course and come back next year", Rijmen told the audience.
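One way to see why misusing a good stream cipher is fatal — the core of WEP's problem, where short IVs cause the same RC4 keystream to be reused — is that whenever one keystream encrypts two messages, XORing the two ciphertexts cancels the keystream and leaks the XOR of the plaintexts. A minimal Python sketch, using a stand-in keystream rather than real RC4:

```python
def xor_stream(keystream: bytes, data: bytes) -> bytes:
    """Encrypt or decrypt by XORing data against a keystream, as stream ciphers do."""
    return bytes(k ^ d for k, d in zip(keystream, data))

# Stand-in keystream -- imagine this is RC4 output under one (key, IV) pair.
keystream = bytes(range(64))

c1 = xor_stream(keystream, b"ATTACK AT DAWN")
c2 = xor_stream(keystream, b"RETREAT AT TEN")

# An eavesdropper who captures both ciphertexts XORs them together;
# the keystream cancels out, leaving the XOR of the two plaintexts,
# from which known-plaintext and statistical attacks can recover both.
leak = xor_stream(c1, c2)
assert leak == xor_stream(b"ATTACK AT DAWN", b"RETREAT AT TEN")
```

The cipher itself is never "broken" here; the failure comes entirely from how it is deployed, which is Rijmen's point about WEP.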

Security myths and evil cryptography

All these defective designs, supposedly made by smart people, do not improve trust in cryptography. Rijmen attributes them to a handful of "industry myths" that don't seem to die out:

Many companies think that it's OK to first go to market and later add security, as if one could add security as an extra layer. Then there's the myth that obscurity, for example keeping your algorithm secret, means extra security. And then many companies think the more complex their solution is, the better. Others say that they have no room, money or time to add security. That's, for example, a problem with many chip designers. And the last myth is that many companies think they'll never need to update, and then they hardcode everything. I have even seen designs where the output of the random number generator was hardcoded...

But there are cases where cryptography evidently works. When there is a business case, companies are suddenly able to do it right. For example, HP implemented authentication between the printer and the ink cartridge, as Rijmen explained:

If you use a non-HP cartridge, an HP printer prints with less quality, to make the user think the HP cartridges are better. The same happens with batteries in mobile phones: some phones raise their antenna power to the maximum if you use a battery from another company, just to drain the battery and make you think the phone's own batteries are better.

But it's not only in industry where things go wrong. Rijmen maintains that there are also some fairly pervasive research myths circulating. Many security researchers are too academic and think that a good security model is a model that allows them to prove theorems. In their eyes, security is, then, what they can prove about some objects in their abstract mathematical models. This whole abstract notion of security amounts to a degenerate concept of "good research" as applying well-known methods to well-known problems, taking all the creativity and innovations out of the research.

Added to this, malware writers have discovered cryptography. They use it to escape detection or to recover after partial exposure. There is also a worrying trend in which malware encrypts a victim's hard drive and the malware writer then extorts the victim to get the data back. The consequence of these examples of bad cryptography in industry and academia, and of the "evil cryptography" used by malware writers, is that the public loses trust in the technology. And that is where Rijmen's talk reached its turning point. His message was clear:

Luddites are coming in action because of these failures. For example, in many countries, also in Belgium, there are movements against electronic voting because they don't trust the cryptography in it. If we continue like this, we risk that cryptography will be abandoned and return only to military applications. It's time for a change.

It was amusing that most of the audience's questions were about Rijmen's example of electronic voting, although he hastened to add that e-voting was just an example of the perils of bad cryptography and its consequences for our trust in cryptography in general; it was not, in any way, a remark about the security of particular e-voting implementations. Many people attending his talk were genuinely worried about the security of e-voting but do not consider themselves Luddites. One person said that he doesn't trust e-voting because it centralizes the counting of votes in a black-box network of electronic devices from a single producer. Traditional voting, in contrast, decentralizes the counting by distributing it among thousands of people, which makes the votes less susceptible to manipulation.

A new kind of cryptography development

To regain trust in cryptography, Rijmen has two proposals: collaborative standards development and best practices. As an example case, Rijmen points to the development of his own AES. In January 1997, the National Institute of Standards and Technology (NIST) announced an initiative to develop a successor to the aging Data Encryption Standard (DES). NIST asked for input from interested parties, and in September 1997 it issued a call for new algorithms. Over the next three years, fifteen designs were submitted, analyzed, and narrowed down, and in October 2000 NIST announced the winner: Rijndael, designed by Vincent Rijmen and Joan Daemen. Rijmen stressed some remarkable facts about the AES process:

NIST identified and approached the relevant academic community for the development of its new algorithm, even outside the USA, which makes it all the more remarkable. Moreover, they required the submitters to adopt a 128-bit block length and a key length of at least 128 bits. Ciphers with this strength were rare at the time of the announcement. Their process of evaluation rounds and conferences was a good cross-fertilization between academia and industry. Hundreds of papers, reports, notes and comments were published. And last but not least: the whole process was open, with many contributions. Rather than simply publishing a successor, NIST was open to all suggestions.

According to Rijmen, the AES process should be taken as an example for collaborative standards development, not only for algorithms like AES, but also for protocols and even applications. The organizers of such a competition should invite the relevant people to contribute, get both the industry and academia on board, and envision future requirements. Moreover, they should advertise the development process, motivate submitters and reviewers, and evaluate the evaluations. Last but not least, they should push the result after all this work.

Rijmen's second proposal is to limit the number of standards and standard solutions, an approach that he calls green cryptography. It's all about recycling: reuse ideas that have proven their merits, and keep the implementations simple. This makes sense, because complexity is the culprit behind a lot of instances of cryptography failing:

From the cryptographer's perspective, this means that they recycle existing design strategies, components and cryptographic primitives. To my delight I see this happening now in the SHA-3 competition: many candidates recycle parts or ideas of AES: in round 1 out of the 51 candidates 17 were AES-based, while in round 2 of the 14 candidates still 6 were AES-based.

From the developer's perspective, this means that they have a marketplace of algorithms to pick from, and developers should be discouraged from making their own home-brew algorithms: "Unless you can absolutely not, use a standard." Rijmen gave an example of how it shouldn't be done. Since 2000, there has been a trend to combine encryption and authentication into one operation, because encryption without authentication leads to weaknesses in almost all applications. There are several standards and RFCs for authenticated encryption, but what did Microsoft do with its BitLocker Drive Encryption in Windows Vista and 7? It uses AES (which is good), in CBC mode (which Rijmen calls "the standard mode in the 1980s, not in 2000"), and without authentication, against all security trends. Microsoft's explanation was that "there is no space to store authentication tags on the hard disk", although each hard disk reserves space for bad blocks. Rijmen's take-home message is that we don't need better cryptography, but better implementations that stick to the standards: "Cryptography is not do-it-yourself stuff."
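The encrypt-then-MAC composition that underlies most authenticated-encryption standards can be sketched with Python's standard library. This is a toy for illustration only: the keystream is derived from SHA-256 in a counter construction rather than from a vetted cipher, and real code should use an established authenticated mode (such as AES-GCM) from a maintained library, exactly as Rijmen advises:

```python
import hashlib
import hmac

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Toy counter-mode keystream over SHA-256 -- illustrative only."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt_then_mac(enc_key: bytes, mac_key: bytes, nonce: bytes,
                     plaintext: bytes) -> tuple[bytes, bytes]:
    """Encrypt, then compute an HMAC tag over nonce + ciphertext."""
    ct = bytes(p ^ k for p, k in zip(plaintext,
                                     keystream(enc_key, nonce, len(plaintext))))
    tag = hmac.new(mac_key, nonce + ct, hashlib.sha256).digest()
    return ct, tag

def decrypt_and_verify(enc_key: bytes, mac_key: bytes, nonce: bytes,
                       ct: bytes, tag: bytes) -> bytes:
    """Verify the tag in constant time BEFORE decrypting; reject any tampering."""
    expected = hmac.new(mac_key, nonce + ct, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        raise ValueError("authentication failed: ciphertext was modified")
    return bytes(c ^ k for c, k in zip(ct, keystream(enc_key, nonce, len(ct))))
```

The point of the composition is the failure mode: without the tag, an attacker can flip ciphertext bits and silently flip the corresponding plaintext bits — the kind of malleability that unauthenticated modes like plain CBC leave open.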

Security should be open

Regarding the open-source aspect, Rijmen concluded that openness has been the pulse of cryptographic design over the last few decades, and that we should expect the same from its implementations: "Openness works in cryptography because cryptographers have access to the design and the analysis." But he added that we should not focus solely on opening the source of cryptographic implementations: open source alone is not enough to attract cryptographers and get them to examine the code; we should open the whole standards-development process.

The BruCON organizers showed the same openness as their first speaker. Unlike more commercially focused security events, BruCON gathered hackers (in the good sense), security researchers, security vendors, and governments, and succeeded with a diverse mix of presentation topics and speakers: from Rijmen's meta-talk, through talks on social engineering and information leakage in social networks, to highly technical talks on the risks of IPv6, SQL injection, and future malware techniques. Moreover, anyone who missed the conference can find the slides, and even video recordings, of almost all of the presentations online. Although the BruCON organizers wanted to make it a truly "Belgian" conference, they avoided the mistake of chauvinism: they attracted many top-class international speakers, and the audience came from all over Europe and from the US. Your author hopes they return next year with a second event.



BruCON: Can we trust cryptography?

Posted Oct 1, 2009 1:35 UTC (Thu) by smithj (subscriber, #38034) [Link]

"I have even seen designs where the output of the random number generator was hardcoded..."

As always, xkcd predicts (or at least mimics) reality: http://xkcd.com/221/

BruCON: Can we trust cryptography?

Posted Oct 1, 2009 8:54 UTC (Thu) by bangert (subscriber, #28342) [Link]

For some reason I expected BruCON to be the Bruce Schneier Conference...
not that far off.

BruCON: Can we trust cryptography?

Posted Oct 1, 2009 19:10 UTC (Thu) by salimma (subscriber, #34460) [Link]

I thought of brute-force cracking instead. In the same vein :)

BruCON: Can we trust cryptography?

Posted Oct 1, 2009 10:46 UTC (Thu) by pointwood (guest, #2814) [Link]

"If you use a non-HP cartridge, an HP printer prints with less quality, to make the user think the HP cartridges are better. The same happens with batteries in mobile phones: some phones raise their antenna power to the maximum if you use a battery from another company, just to drain the battery and make you think the phone's own batteries are better."

You'd think that would be illegal...

BruCON: Can we trust cryptography?

Posted Oct 1, 2009 15:53 UTC (Thu) by abatters (✭ supporter ✭, #6932) [Link]

But instead, with laws like the DMCA, the activity that is illegal is trying to restore a level playing field by bypassing the "protection" that cripples the device.

BruCON: Can we trust cryptography?

Posted Oct 1, 2009 17:00 UTC (Thu) by martinfick (subscriber, #4455) [Link]

But nothing outlaws you from not buying such products in the first place... (yet)

BruCON: Can we trust cryptography?

Posted Oct 1, 2009 19:06 UTC (Thu) by flewellyn (subscriber, #5047) [Link]

It should run afoul of anti-trust laws, but good luck getting the US government to bring a lawsuit.

Maybe the EU?

BruCON: Can we trust cryptography?

Posted Oct 1, 2009 19:26 UTC (Thu) by salimma (subscriber, #34460) [Link]

I don't think HP has a monopoly on inkjet printers, so unfortunately I am not sure if US anti-trust laws can be brought to bear. IANAL, though.

BruCON: Can we trust cryptography?

Posted Oct 14, 2009 13:07 UTC (Wed) by mcortese (guest, #52099) [Link]

You'd think that would be illegal...
Isn't it very similar to what happens with the region codes of DVDs?

BruCON: Can we trust cryptography?

Posted Oct 2, 2009 14:57 UTC (Fri) by AndreE (subscriber, #60148) [Link]

It's interesting that a major point is basically to trust cryptology standards. I have seen this statement echoed by various crypto guys around the web.

There is a real, convenient tendency when programming or designing to exaggerate one's own knowledge, or at least to try to build something from what little knowledge you have. In many areas of CS, this is probably a good thing. After all, sometimes you learn best by doing. However, having seen a number of homebrew suggestions around the place and having seen them shot down, I remain convinced that crypto is complex and difficult enough that we must leave it to the experts. Unless one can formally and rigorously describe the strength of their scheme (i.e., in the sort of mathematical language that gives me a migraine), homebrew solutions will never cut it.

Nevertheless, people are constantly deluding themselves into believing their "unique" solutions are better than systems devised by uber-geeks and techno-spooks.

messing with crypto

Posted Oct 2, 2009 15:24 UTC (Fri) by pflugstad (subscriber, #224) [Link]

One only needs to look at the Debian random number generator fiasco to see the danger in messing with crypto code without a very thorough understanding of what's going on.

messing with crypto

Posted Oct 3, 2009 20:40 UTC (Sat) by gmaxwell (subscriber, #30048) [Link]

Eh… that's more an example of ignorantly modifying code to silence tool warnings, not really much of an example of the tricky implications of cryptography. At most you can say about the Debian openssh example is that it shows that security is often an invisible property, but that isn't a crypto-specific point... and you can argue that crypto should be left to the cryptonauts, but security really must be every developer's problem.

The mention of RC4 in WEP in the article makes a better example of the special challenges posed by cryptography, or perhaps the old watermarking attacks against pure CBC dmcrypt volumes prior to the introduction of ESSIV and LRW... the point that you can use the primitives correctly but still produce something insecure because of non-obvious (and sometimes highly mathematical) properties of the cryptographic components.

BruCON: Can we trust cryptography?

Posted Oct 2, 2009 21:51 UTC (Fri) by zooko (subscriber, #2589) [Link]

Here's the paper about the BitLocker encryption mode, which includes the motivations for the decisions:

http://download.microsoft.com/download/0/2/3/0238acaf-d3b...
0a0be4bbb36e/BitLockerCipher200608.pdf

BruCON: Can we trust cryptography?

Posted Oct 2, 2009 21:54 UTC (Fri) by zooko (subscriber, #2589) [Link]

Oh, sorry, here it is again:

http://download.microsoft.com/download/0/2/3/0238acaf-d3b...
b3d6-0a0be4bbb36e/BitLockerCipher200608.pdf

Hm. LWN.net doesn't like this URL. Too long? Here's an indirection through a URL shortener
service: http://tr.im/Au0H

Anyway, I think the designers of BitLocker (mainly cryptographer Niels Ferguson -- disclosure: he's a friend of mine) had fairly good reasons for doing it the way they did.

BruCON: Can we trust cryptography?

Posted Oct 10, 2009 5:41 UTC (Sat) by jtroutman (guest, #61284) [Link]

I'm glad to see some discussion regarding green cryptography. This is a concept of mine that started to take shape back in 2006, based on the fundamentals of mature and minimalist design. With recycling at its core, "green cryptography" seemed like an appropriate in-the-now moniker.

When my research took an AES-centric approach, I had the exceptional honor of being joined by Vincent Rijmen, whose wealth of knowledge and generosity helped shape our work into an extremely fun project -- one that we believe in and hope to see take off. We'll certainly be nourishing it along the way.

I must say, though: I am a bit jealous that I wasn't there to hear its introduction! Oh well -- that's why we have good recaps like yours. :)

Thanks!

- Justin Troutman

Luddite

Posted Oct 13, 2009 18:54 UTC (Tue) by gswoods (subscriber, #37) [Link]

I for one resent being called a "Luddite" because I have reservations about electronic voting systems. Cracking the crypto is only one of many concerns with such devices. Most people I know who have these concerns are far more worried about exploitable software bugs in the proprietary closed-source software that runs these things, or in the human aspects of how the voting data is handled, or in trusting the companies that make the devices. He's making it sound like the crypto is what we're worried about, when for the most part it is not.

Copyright © 2009, Eklektix, Inc.
Comments and public postings are copyrighted by their creators.
Linux is a registered trademark of Linus Torvalds