This has been a bad few weeks to be a voting machine vendor. Three
separate governments (California, Florida, and the UK) looked at the devices
and came to remarkably similar conclusions. The machines they looked
at are poorly designed, poorly implemented, and subject to a wide variety of
security threats. None of the studies mentioned it, but it is likely that
the machines looked great.
The most comprehensive study was done
by California Secretary of State Debra Bowen's office. That study looked
at three electronic voting systems, each from a different manufacturer.
Each system had three separate teams investigating it: one looking at the source
code, a "red team" that had physical access to the device, and an
accessibility team. Their conclusions were not surprising to anyone who has
paid attention to this issue over the years.
All three of the voting machine systems were found to be sorely deficient
by all three teams. Even accessibility, which is one of the major benefits
touted by electronic voting advocates, was found lacking:
Although each of the tested voting systems included some accessibility
accommodations, none met the accessibility requirements of current law and
none performed satisfactorily in test voting by persons with a range of
disabilities and alternate language needs.
Though it is certainly terrible not to meet the needs of some individuals,
safeguarding the election process and accurately reporting the vote totals
need to be higher priorities. Since the vendors obviously had not successfully
handled accessibility, one would hope they were at least able to secure the
voting process. Unfortunately, they could not get that primary job done either.
The red team reports were released first and the conclusions were damning:
The red teams demonstrated that the security mechanisms provided for
all systems analyzed were inadequate to ensure accuracy and integrity
of the election results and of the systems that provide those results.
The teams were able to defeat the physical security of the voting machines,
modify or overwrite the software in the machines, and subvert the
tabulation machines in order to produce incorrect vote counts. All of this
required only access to the machines themselves: the same access that
election officials, poll workers and, to a lesser extent, voters have.
Several days later, the source code teams' reports were released and, at
that point, were almost anti-climactic. Unsurprisingly, they found
numerous, hideous source code flaws in all three systems. Buffer
overflows, hard coded passwords ('diebold' being a particularly difficult
one to guess), misuse of encryption, integer overflows (wrapping
vote counts to negative or zero perhaps); the list goes on and on. It is as
if the voting machine vendors are completely unaware of the last
twenty (or thirty or forty) years of software security flaws.
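The integer overflow flaw is worth a closer look, since its effect on vote
totals is so concrete. The following is a hypothetical illustration, not the
vendors' actual code: it models a vote counter stored in a 16-bit signed
integer (a C `short`, say), simulating the truncation in Python since Python
integers do not overflow on their own.

```python
# Hypothetical sketch: what happens when a vote count lives in a
# 16-bit signed integer. Python ints never overflow, so the C-style
# wraparound is simulated by masking to 16 bits.

def add_vote_16bit(count: int) -> int:
    """Increment a vote count, truncating to 16 bits like a C `short`."""
    count = (count + 1) & 0xFFFF      # keep only the low 16 bits
    if count >= 0x8000:               # reinterpret the top bit as a sign bit
        count -= 0x10000
    return count

count = 32767                         # the largest positive 16-bit value
count = add_vote_16bit(count)
print(count)                          # -32768: one more vote went negative
```

One more ballot past 32,767 and the candidate's total swings to -32,768; a
simple bounds check, or a wider integer type, prevents it entirely.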
In reality, they are most likely not unaware; they are just arrogant.
Diebold, Hart and Sequoia (the companies whose machines were studied) do
not depend solely on their technical "prowess" to win bids for providing
voting machines; politics plays a huge role. These are well connected
companies. It also helps that they are all uniformly bad: there are
literally no secure choices for a government agency to make.
Florida's study only covered
Diebold equipment, but it echoed the findings in the California study. Avi
Rubin of Johns Hopkins University, who participated in a 2003 study of
Diebold's voting machine, notes:
So, Diebold is doing some things better than they
did before when they had absolutely no security, but they have yet to do
them right. Anyone taking any of our cryptography classes at Johns Hopkins,
for example, would do a better job applying cryptography.
One of the bigger problems found was that Diebold assigned each voting
machine a cryptographic key derived from an MD5 hash
of the machine's serial number. Rubin again:
This is arguably worse than having a fixed static key in all of the
machines. Because with knowledge of the machine's serial number, anyone can
calculate all of the secret keys. Whereas before, someone would have needed
access to the source code or the binary in the machine.
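To see why this is so weak, consider a sketch of the derivation. This assumes
the key is simply the MD5 digest of the serial number; Diebold's exact scheme
may differ, and the serial number shown is invented for illustration. The
point stands either way: serial numbers are printed on or near the machine,
so the "secret" key is computable by anyone who can read the label.

```python
import hashlib

def derive_key(serial: str) -> bytes:
    """Sketch of a per-machine key derived as MD5(serial number)."""
    return hashlib.md5(serial.encode("ascii")).digest()

# An attacker who reads the serial number off the chassis can compute
# the same key the machine uses, with no access to source or binaries.
machine_serial = "AV-123456"          # hypothetical serial number
attacker_key = derive_key(machine_serial)
print(attacker_key.hex())
```

A proper design would generate each machine's key from a cryptographically
secure random source and distribute it through a protected channel, so that
knowing public identifiers reveals nothing about the key.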
The UK also released reports
on the outcome of electronic voting trials held in May. The overall
summary of the trial was, once again, not very favorable:
level of implementation and security risk involved was significant and
unacceptable. There remain issues with the security and transparency of the
solutions and the capacity of the local authorities to maintain control over
the elections.
This was not the result of security professionals analyzing the systems for
flaws, but was instead noted in actual trials of the equipment in an election.
The California study was quite well done and well thought out, except for
one thing: it was done long after the equipment was bought and used in
elections. This is the kind of study that needs to be done before
buying the equipment. Due to the conclusions of the study, Bowen
revoked the certification of the equipment from all three vendors, but immediately had to
conditionally re-certify them as a practical matter. Even with a six month
lead time, replacement systems (either electronic or of some other kind)
could not be deployed before the 2008 California presidential primary voting.
The reaction to the California study by the manufacturers was typical. It
is the same reaction they have had to each and every study done of the
security of their devices: trivialize it. Each released a statement in
reaction to the study conclusions, essentially admitting the flaws, but
claiming that any "laboratory study" would find vulnerabilities. According
to these vendors, it is impossible to make a secure voting system.
As they certainly know, no one is asking these vendors to
break the laws of physics
or to produce perfectly secure code. It would appear that they expend far
more effort in deflecting criticism and lobbying various legislative bodies
than they spend trying to secure their code and equipment. It is not
necessary that the equipment be tamper-proof, merely that tampering can be
detected. At least minimal precautions, perhaps to the level taught to
computer science undergraduates, should be taken with the software.
This is not anywhere near as hard a problem as the vendors make it out to be.
Many of the techniques needed to secure voting machinery are well known and
well understood, at least outside of the vendors' labs. This is an area
where open source methods could be and should be applied.
Organizations like BlackBoxVoting.org and the NSF Accurate project should be
working on solutions. Private companies have shown themselves to be
completely incompetent at producing secure voting equipment; it is time for
another solution to be tried.