RSA Conference Panel: Lessons Learned from 2008 Election Technology

I spoke on a panel at the RSA Conference yesterday, on the topic of lessons learned in 2008 about voting technology. I thought I'd use this blog to share my remarks, but even though we each spoke for only 5 minutes before the question-and-answer period, I covered three areas of lessons learned; so I'll cover them in separate blog posts on (1) usability lessons, (2) audit lessons, and (3) transparency lessons. But first I should note that my co-presenters were Doug Jones, David Wagner, Dan Wallach, and moderator Jeremy Epstein, each with 2 to 20 times as much experience in election technology as I have, so it was a real honor.

One lesson learned in 2008 was not that voting systems have some serious usability issues (old news), but rather some examples of how usability issues create real threats to the integrity of election processes or election results. One example I can speak to from personal experience is in San Mateo County, CA, but it plays out in many variations in other places, across the whole variety of voting machines. In this case, the usability problem is with the administration of DRE voting machines, rather than with the voters' experience. The CA Secretary of State de-certified San Mateo County's voting system product (from Hart InterCivic) after an independent security review, and then re-certified it for counties that instituted a new regime of procedural and physical controls designed to mitigate the security risks.

The fly in the ointment: more than 20 new controls (bags, tags, seals, signatures) for poll workers to apply, just to contain the risks created by a poor design decision to store ballot data in re-writeable storage inside the chassis of a clunky voting machine. In practice, it's very unlikely that these controls are followed rigorously; the usability problems with poll-worker security measures essentially mean that a good portion of the time, polling place security is not "up to code," or at least can't be proven to be, due to less-than-perfect record keeping. And that means that the integrity of the election process is fundamentally suspect: it can't be proven, and it can be challenged.

Let's step through that again. We start with a poor design decision (one that is common to all vendors' products in use today). To counter the product-specific consequences of this decision, we get an effective but complex set of security administration requirements that are in many cases beyond the ability of a typical polling place to get right. The result: elections where we can't know whether the ballot data has been tampered with. Oops!

The second example is from Humboldt County, about which I've written already, so I'll be brief. There was an independent audit (counting votes separately from the official voting system) of 100% of the ballots in a Humboldt election, and the results turned up a discrepancy that led to the exposure of a problem with the central ballot scanning/counting device. Without arguing whether it is a bug or a weird feature, the fact is that unless you use a specific, non-obvious procedure to operate the device, you lose the vote data from the first batch of scanned ballots, more or less silently. Once again, if you make a usage error, some votes disappear. Oops! And since the software in question has been in wide use for several years, this scenario has probably occurred many times. Double oops!
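The actual scanning/counting software is proprietary and its internals aren't described here, but a hypothetical sketch may help illustrate the class of failure: a tabulator that quietly throws away the first batch of ballots unless the operator performs a specific initialization step first. Every name below (BatchTabulator, clear_results, scan_batch) is illustrative only, not the real product's code or API.

```python
# Hypothetical sketch of the failure mode described above: a tabulator
# that silently discards the first batch of scanned ballots unless the
# operator performs a specific, non-obvious initialization step first.
# Names and behavior are illustrative, not the real product's code.

class BatchTabulator:
    def __init__(self):
        self._results = []          # accumulated per-batch vote totals
        self._initialized = False   # set only by the non-obvious step

    def clear_results(self):
        """The 'specific, non-obvious procedure': must be run before
        the first batch, or that batch's data is thrown away."""
        self._results = []
        self._initialized = True

    def scan_batch(self, batch_totals):
        if not self._initialized:
            # Buggy behavior: instead of refusing to proceed or warning
            # the operator, the device quietly drops the first batch and
            # then behaves as if initialization had just happened.
            self._initialized = True
            return                  # batch_totals are lost, silently
        self._results.append(batch_totals)

    def report(self):
        return sum(sum(batch.values()) for batch in self._results)


# An operator who skips clear_results() loses the first batch's votes:
t = BatchTabulator()
t.scan_batch({"Candidate A": 120, "Candidate B": 95})   # silently lost
t.scan_batch({"Candidate A": 80, "Candidate B": 60})
print(t.report())   # 140, not 355 -- only the second batch was counted
```

The point of the sketch is the usability failure, not the specific code: the "correct" operating procedure is invisible, and the penalty for skipping it is silent data loss rather than an error message.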

Usability really matters! And these are just a couple of many, many examples. In fact, usability is important right down to the level of idiot-proofing, because believe me, as a poll worker at the end of an 18-hour day, "idiot" about describes my mental faculties. And though I didn't soapbox about it at the RSA conference, I am free to say here that this is why we believe so strongly in designing voting system components to fit existing voting processes and resources, or even to enable simplification of them, and why we work hard getting our assumptions and designs validated by real election officials.

— EJS

PS: Coming soon, lessons learned about auditing and about transparency.
