September 30, 2009
This article was contributed by Koen Vervloesem
On September 18 and 19, the community-made conference BruCON made its first
appearance in Brussels. BruCON is organized by a small group of Belgians
working in the security industry who wanted a conference with room for
independent research and without a commercial undertone. As one of the
organizers, Benny Ketelslegers, said at the beginning of the conference:
"We have a lot of security researchers in Belgium, but we didn't have
a conference here that suits our needs."
For a Belgian conference, there couldn't have been a better speaker for the
first lecture than Vincent Rijmen, a Belgian cryptographer and one of the
designers of the Advanced Encryption Standard (AES). He is currently
working as an associate professor at the University of Leuven. His talk was
titled Trusted
cryptography [PDF] and discussed the growing doubts about the trust we
can put in cryptology and its applications. His point: today, we can't
trust cryptography.
Rijmen took the audience back to where cryptography started. In Roman
times, there was the Caesar cipher, a simple substitution cipher in which
each letter in the plaintext is replaced by the letter a fixed
number of positions down the alphabet. This encryption method is named
after Julius Caesar, who used it with a shift of three positions to
communicate with his generals. Polyalphabetic substitution ciphers, like
Vigenère or the Enigma machine used by Germany in World War II, were
also driven by military requirements.
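For readers who want to see it concretely, here is a minimal Python sketch
of the Caesar cipher (purely illustrative, not from Rijmen's talk):

    # Caesar cipher: replace each letter by the letter 'shift' positions
    # further down the alphabet; decryption shifts back the other way.
    # Toy code: with only 26 possible keys, it falls to brute force.
    def caesar(text, shift):
        result = []
        for ch in text:
            if ch.isalpha():
                base = ord('A') if ch.isupper() else ord('a')
                result.append(chr((ord(ch) - base + shift) % 26 + base))
            else:
                result.append(ch)   # leave spaces and punctuation alone
        return ''.join(result)

    ciphertext = caesar("attack at dawn", 3)   # Caesar's historical shift
    print(ciphertext)                          # dwwdfn dw gdzq
    print(caesar(ciphertext, -3))              # attack at dawn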
So cryptography in the past was used solely for military purposes,
Rijmen explained, but there is more to it than that: "Encryption was
used between trusted
parties, with secure perimeters. This is very different from our current
use of encryption: think about our bank cards, where crackers can even
measure the power consumption of the smartcard chip to try to crack the
encryption."
The shift to the public
This shift began in the 1970s and 1980s, for example with the concept of
public-key cryptography introduced by Whitfield Diffie and Martin Hellman,
and the RSA algorithm invented by Ronald Rivest, Adi Shamir, and Leonard Adleman. As
a result of these technical breakthroughs, Rijmen maintains that
cryptography finally entered the public world:
Suddenly cryptography was used for non-military
purposes, such as secure communications between citizens, PKI (Public Key
Infrastructure) and key agreement, digital signatures, blind signatures,
digital cash, etcetera. Cryptography was even used as a means against
nuclear arms: the American cryptographer Gustavus Simmons designed a
protocol to verify adherence to the Comprehensive Test Ban Treaty for
nuclear weapons, based on sensor data that could not be trusted
completely. All this was a real cryptologic revolution.
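To make the key-agreement idea concrete: Diffie and Hellman's insight was
that two parties can derive a shared secret while exchanging only public
values. A toy Python sketch, with an absurdly small modulus (real
deployments use parameters of thousands of bits):

    # Toy Diffie-Hellman key agreement: Alice and Bob end up with the
    # same shared secret although only A and B cross the wire.
    p, g = 2147483647, 5    # public parameters: prime modulus and base

    a = 1234567             # Alice's secret exponent
    b = 7654321             # Bob's secret exponent

    A = pow(g, a, p)        # Alice publishes A = g^a mod p
    B = pow(g, b, p)        # Bob publishes   B = g^b mod p

    # Each side combines its own secret with the other's public value;
    # both arrive at g^(a*b) mod p:
    assert pow(B, a, p) == pow(A, b, p)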
This is how we have come to the current situation, where modern
communication networks are all based on cryptography. We have the A5/1
stream cipher in the GSM cellular telephone standard, the KeeLoq block
cipher used in the majority of remote car keys, and the Wired Equivalent
Privacy (WEP) algorithm which was the first attempt to secure IEEE 802.11
wireless networks. It is no coincidence that Rijmen named these
designs: they are all broken. The A5/1 design was initially kept secret,
but after leaks and reverse engineering, several serious weaknesses were
identified. KeeLoq has been successfully cryptanalyzed in recent
years. And WEP, which uses the RC4 stream cipher, can be broken in
minutes with off-the-shelf hardware and open source software such as
Aircrack-ng. "RC4 is a good
protocol, but it is used incorrectly in WEP. If one of my students came up
with such a design, he should redo my course and come back next
year", Rijmen told the audience.
Security myths and evil cryptography
All these defective designs, supposedly made by smart people, don't
improve trust in cryptography. Rijmen blames the defective designs on a
handful of "industry myths" that don't seem to die out:
Many companies think that it's OK to first go to
market and later add security, as if one could add security as an extra
layer. Then there's the myth that obscurity, for example keeping your
algorithm secret, means extra security. And then many companies think the
more complex their solution is, the better. Others say that they have no
room, money or time to add security. That's, for example, a problem with many
chip designers. And the last myth is that many companies think they'll
never need to update, and then they hardcode everything. I have even seen
designs where the output of the random number generator was
hardcoded...
But there are cases where cryptography evidently works. When there is a
business case, companies are suddenly able to do it right. For example, HP
implemented authentication between the printer and the ink cartridge, as
Rijmen explains:
If you use a non-HP cartridge, an HP printer prints
at lower quality, to make the user think the HP cartridges are better. The
same happens with batteries in mobile phones: some phones raise their
antenna power to the maximum if you use a battery from another company,
just to drain the battery and make you think the phone's own batteries are
better.
But it's not only in industry that things go wrong. Rijmen maintains that
there are also some fairly pervasive research myths circulating. Many
security researchers are too academic and think that a good security model
is a model that allows them to prove theorems. In their eyes, security is,
then, what they can prove about some objects in their abstract mathematical
models. This whole abstract notion of security amounts to a degenerate
concept of "good research" as applying well-known methods to
well-known problems, taking all the creativity and innovation out of the
research.
Added to this, we see that malware writers have discovered
cryptography. They use it to escape detection or to implement recovery
after partial exposure. But there's also a worrying trend where malware
encrypts the hard drive of a victim and then the malware writer extorts the
victim to get his data back. The consequence of these examples of bad
cryptography in industry and academia, and of the "evil
cryptography" used by malware writers, is that the public loses trust in
the technology. And that's where Rijmen's talk reached its turning
point. His message
was clear:
Luddites are coming into action because of these
failures. For example, in many countries, including Belgium, there are
movements against electronic voting because they don't trust the
cryptography in it. If we continue like this, we risk that cryptography
will be abandoned and return only to military applications. It's time for
a change.
It was funny to see how most of the audience's questions were about
Rijmen's example of electronic voting, although he hastened to add that
e-voting was just an example of the perils of bad cryptography and
its consequences for our trust in cryptography in general. It was not, in
any way, a remark about the security of particular e-voting
implementations. Many people attending his talk were genuinely worried
about the security of e-voting but didn't consider themselves Luddites. One
person stated that he doesn't trust e-voting because it centralizes the
power of counting the votes into a black-box network of electronic devices
from the same producer. In contrast, traditional voting decentralizes the
counting by outsourcing it to thousands of people, which makes the votes
less susceptible to manipulation.
A new kind of cryptography development
To regain trust in cryptography, Rijmen has two proposals: collaborative
standards development and best practices. As an example case, Rijmen points
to the development of his own AES. In January 1997, the National Institute
of Standards and Technology (NIST) announced the initiative to develop a
successor to the aging Data Encryption Standard (DES). NIST asked for input
from interested parties, and in September 1997 there was a call for new
algorithms. Over the next three years, fifteen different designs were
submitted, analyzed, narrowed down, and, in October 2000, NIST announced the
winner: Rijndael, designed by Vincent Rijmen and Joan Daemen. Rijmen
stressed some remarkable facts about the AES process:
NIST identified and approached the relevant academic
community for the development of its new algorithm, even outside the USA,
which makes it even more remarkable. Moreover, they forced the submitters
to adopt a 128-bit block length and a key length of at least 128
bits. Ciphers with this strength were rare at the time of the
announcement. Their process with evaluation rounds and conferences was a
good cross-breeding between academia and industry. Hundreds of papers,
reports, notes and comments were published. And last but not least: the
whole process was open with many contributions. Rather than simply
publishing a successor, NIST was open for all suggestions.
According to Rijmen, the AES process should be taken as an example for
collaborative standards development, not only for algorithms like AES, but
also for protocols and even applications. The organizers of such a
competition should invite the relevant people to contribute, get both the
industry and academia on board, and envision future requirements. Moreover,
they should advertise the development process, motivate submitters and
reviewers, and evaluate the evaluations. Last but not least, they should
push the result after all this work.
Rijmen's second proposal is to limit the number of standards and
standard solutions, an approach that he calls green
cryptography. It's all about recycling: reuse ideas that have proven
their merits, and keep the implementations simple. This makes sense,
because complexity is the culprit behind many cryptographic failures:
From the cryptographer's perspective,
this means that they recycle existing design strategies, components and
cryptographic primitives. To my delight I see this happening now in the
SHA-3 competition: many candidates recycle parts or ideas of AES. In
round 1, 17 of the 51 candidates were AES-based; in round 2, 6 of the 14
remaining candidates still are.
From the developer's perspective, this means that they have a
marketplace of algorithms to pick from, and developers should be
discouraged from making their own home-brew algorithms: "Unless you can
absolutely not, use a standard." Rijmen gave an example of how it
shouldn't be done. Since 2000, there has been a trend to combine encryption and
authentication into one operation, because encryption without
authentication leads to weaknesses in almost all applications. There are a
couple of standards and RFCs for authenticated encryption, but what did
Microsoft do with its BitLocker Drive Encryption in Windows Vista and 7? It
uses AES (which is good), in CBC mode (which Rijmen calls "the
standard mode in the 1980s, not in 2000"), and without
authentication, against all security trends. Microsoft's explanation was
that "There is no space to store authentication tags on the hard
disk", although each hard disk reserves space for bad
blocks. Rijmen's take-home message is that we don't need better
cryptography, but better implementations, sticking to the standards:
"Cryptography is not do-it-yourself stuff."
Security should be open
Regarding the open source aspect,
Rijmen concluded that openness has been the pulse of cryptographic design
in the last few decades, and that we should expect the same from its
implementations: "Openness works in cryptography because
cryptographers have access to the design and the analysis." But he
adds that we should not focus on opening the source for cryptographic
implementations: opening the source alone is not sufficient to attract
cryptographers and get them to study the code; we should open the whole
standards development process.
The BruCON organizers showed the same openness as their first
speaker. Unlike other, more commercially focused security events, BruCON
gathered hackers (in the good sense), security researchers, security
vendors, and governments, and it succeeded with a diverse mix of
presentation topics and speakers: from Rijmen's metatalk and talks about
social engineering and information leakage in social networks to highly
technical talks about the risks of IPv6, SQL injection, and future malware
techniques. Moreover, anyone who missed the
conference can find the slides and
even video recordings
of almost all of the presentations online. Although the BruCON organizers wanted
to make it a real "Belgian" conference, they didn't make the mistake of
being too chauvinistic. They attracted a lot of top-class
international speakers, and the audience came from all over Europe and
from the US. Your author hopes they return next year with a second
event.