By Jake Edge
January 19, 2011
An interesting and brutally honest security
advisory for the Tarsnap "secure online
backup service" was released on January 18. It certainly shows a
refreshing amount of candor that other projects and companies would do well
to emulate. But there are some other lessons to be learned from the
vulnerability, including the value of source code availability and the
perils of refactoring.
Tarsnap is a company founded by Colin Percival that provides encrypted
online storage for backups.
The client code is available, but it is not
free software. The code can only be used, unmodified, to talk to the
Tarsnap service. The server code is evidently completely unavailable, but
Percival is interested
in hearing from folks with ideas for improvement to the client—or
those who have found a security hole.
Percival was contacted by Taylor R. Campbell on January 14 with just such a
bug. It turns out that a refactoring of the code for the 1.0.22 release,
which was made in June 2009, introduced a bug that potentially would allow
anyone with access to the data to decrypt it. The data is stored in the
Amazon S3 "cloud", which limits the access to a small group, but that
doesn't really fit well with the security model espoused by
Tarsnap. In the advisory, Percival makes that clear:
I will not attempt to decrypt and read your data. Amazon claims that it
does not inspect Amazon Web Services users' data. And the US government is
theoretically bound by a constitution which prohibits unreasonable
searches. This is all, however, entirely irrelevant: The entire point of
Tarsnap's security is to remove the need for such guarantees. You shouldn't
need to trust me; you shouldn't need to trust Amazon; and you most
certainly shouldn't need to trust the US government.
In doing the refactoring, Percival removed an auto-increment of the nonce value
used in the Advanced Encryption Standard (AES) Counter
(CTR) mode for encrypting blocks of data. Because CTR mode generates its
keystream from the key and the nonce, reusing a nonce means reusing a
keystream, and the upshot is that someone can decrypt the data without
having the key.
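A minimal sketch of why a repeated nonce is fatal in CTR mode (a toy keystream built from SHA-256 stands in for the AES block cipher here; none of this is Tarsnap's actual code):

```python
import hashlib

def toy_ctr_keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Toy CTR keystream: SHA-256(key || nonce || counter) stands in for
    the AES block cipher, but the structure mirrors real CTR mode."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def toy_ctr_encrypt(key: bytes, nonce: bytes, plaintext: bytes) -> bytes:
    ks = toy_ctr_keystream(key, nonce, len(plaintext))
    return bytes(p ^ k for p, k in zip(plaintext, ks))

key = b"k" * 32
nonce = b"\x00" * 8            # the bug: the nonce is never incremented
c1 = toy_ctr_encrypt(key, nonce, b"first secret block.")
c2 = toy_ctr_encrypt(key, nonce, b"second secret text.")

# Both ciphertexts used the same keystream, so XORing them cancels it,
# leaving the XOR of the two plaintexts -- no key required.
xored = bytes(a ^ b for a, b in zip(c1, c2))
```

With distinct nonces, the two keystreams differ and the XOR reveals nothing useful; that uniqueness is exactly what the removed auto-increment was there to guarantee.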
There are two ways that the decryption could be done when the nonce
value is reused: by comparing
two ciphertexts or by using known plaintext. The former attack is
considered by Percival to be unusable on the Tarsnap data because of the
compression done to the data blocks before they are encrypted. On the
other hand, known plaintext attacks are quite plausible if there is some known
data in the blocks. As Percival points out, full backups are likely to
have any number of files with known contents, namely the files that are
installed by the operating system—binaries, configuration files, and
so on.
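The known-plaintext version of the attack can be sketched the same way (again a toy SHA-256-based CTR as a stand-in for AES, with hypothetical file contents):

```python
import hashlib

def toy_ctr(key: bytes, nonce: bytes, data: bytes) -> bytes:
    """Toy CTR encrypt/decrypt: XOR with a SHA-256-based keystream
    (a stand-in for AES-CTR)."""
    ks, counter = b"", 0
    while len(ks) < len(data):
        ks += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(d ^ k for d, k in zip(data, ks))

key, nonce = b"k" * 32, b"\x00" * 8   # nonce reused across blocks (the bug)

known_plain = b"#!/bin/sh\n# a file the attacker already has a copy of"
secret      = b"password=hunter2  # the data worth protecting"

c_known  = toy_ctr(key, nonce, known_plain)
c_secret = toy_ctr(key, nonce, secret)

# Attacker's view: XOR the known plaintext into its ciphertext to
# recover the shared keystream, then strip it off the other ciphertext.
keystream = bytes(c ^ p for c, p in zip(c_known, known_plain))
recovered = bytes(c ^ k for c, k in zip(c_secret, keystream))
```

This is why OS-installed binaries and configuration files in a full backup are so dangerous here: any block whose contents the attacker can guess hands over the keystream for every other block encrypted under the same nonce.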
The bug was found by Campbell by "reading the Tarsnap source code
purely out of curiosity", which certainly shows the advantage of
making that source available. One wonders if the server code might also
benefit from curious hackers. Percival is creating a bug bounty program
(and has seemingly paid a retroactive bounty to Campbell) in the hope of
ferreting out any other problems in the client sooner.
Refactoring is meant to be strictly a clean-up operation that does not
change the semantics of the code in question. When doing refactoring, it
is helpful to have a set of regression tests that can detect when
refactoring has gone awry. In the comments on the advisory, Percival said
that Tarsnap does not have a test suite of that sort, and pointed out that
it is difficult to create one for cryptographic software, but "I
should probably find some way of automatically testing and/or assert()ing
for nonce-reuse bugs though".
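One shape such an automatic check could take (a hypothetical sketch, not Tarsnap's code): a small guard that records every nonce handed to the encryption path and fails loudly on a repeat.

```python
class NonceReuseGuard:
    """Test/debug aid: remembers every nonce seen and asserts on reuse.
    Hypothetical helper, not part of Tarsnap's actual client."""
    def __init__(self) -> None:
        self.seen = set()

    def check(self, nonce: bytes) -> None:
        assert nonce not in self.seen, f"nonce reused: {nonce.hex()}"
        self.seen.add(nonce)

guard = NonceReuseGuard()
for counter in range(3):                # a correctly incrementing nonce
    guard.check(counter.to_bytes(8, "big"))
# guard.check((2).to_bytes(8, "big"))   # a repeat would trip the assert
```

Wired into the encryption path while a test suite runs, a check along these lines could catch a regression like the 1.0.22 one automatically, even without full known-answer tests for the cryptography.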
The lack of regression tests is unfortunate, but Tarsnap is hardly alone in
that.
There are countless projects that refactor their code without such a test
suite. This particular incident should serve as something of a reminder to
projects, especially those that are implementing security
features, that refactoring can and does introduce bugs. A test suite is
great, but even just some regression testing of the areas that have been
refactored may find bugs like this one.
Percival is to be congratulated for quickly turning around a fix for the
problem, as well as for being so forthright with the gory details of the
bug and its impact. Far too often, we see companies trying to
sweep the details of their security holes under the rug; free software
projects sometimes do the same. Bugs happen, security or otherwise, and
there is value in seeing what they are and how they came about. We can
learn from incidents like this.