
The Myth of Technologist Suppression of Internet Voting

I’ve got to debunk a really troubling rumor. It’s about Internet voting, or more specifically, about those who oppose it. Longtime readers will recall that Internet voting is not a favorite topic here, not because it isn’t interesting, but because there are so many nearer-term, lower-effort ways to use tech to improve U.S. elections. However, I’ve heard this troubling story enough times that I have to debunk it today, and return to more important topics next time.

Here’s the gist of it: there is a posse of respectable computer scientists, election tech geeks, and allies who are:

  • Un-alterably opposed to Internet voting, for ever, and
  • Lying about i-voting’s feasibility in order to prevent its use as a panacea for increased participation and general wonderfulness, because they have a hidden agenda to preserve today’s low-participation elections.

I have to say, simply: no. I’ve been in this pond for long enough to know just about every techie, scientist, academic, or other researcher who understands both U.S. elections and modern technology. We all have varying degrees of misgivings about current i-voting methods, but I am confident that every one of these people stands with me on these 4 points.

  1. We oppose the increased use of i-voting as currently practiced.
  2. We very much favor use of the Internet for election activities of many kinds, potentially nearly everything except returning ballots; many of us have been working on such improvements for years.
  3. We strongly believe in and support the power of invention and R&D to overcome the tech gaps in current i-voting, despite believing that some of the remaining issues are really* hard problems.
  4. We strongly believe that i-voting will eventually be broadly used, simply because of demand.

We all share a concern that if there is no R&D on these hard problems, then eventually today’s highly vulnerable forms of i-voting will be used widely, to the detriment of our democracy, and to the advantage of our nation-state adversaries who are already conducting cyber-operations against U.S. elections.

I believe that we need a two-pronged approach: support the R&D that’s needed, but in the meantime enable much-needed modernization of our existing clunky, decaying elections infrastructure, laying the rails for future Internet voting methods to be adopted.

Returning to the kooky story … but what about all those Luddite nay-sayers who say i-voting is impossible and that the time for i-voting is “never”? There are none, at least among tech professionals and/or election experts. There is some harsh rhetoric that’s often quoted, but it is against the current i-voting methods, which are indeed a serious problem.

But for the future, the main difference among us is about the little asterisk that I inserted in point 3 above — it means any number of “really” before “hard.” I’m grateful to colleague Joe Kiniry of Galois and of Free&Fair, for noting that our differences are really “just the number of ‘really’ we put before the word ‘hard’.”

— EJS

PS: A footnote about i-voting Luddites and election tech Luddites more broadly. There are indeed some vocal folks who are against the use of technology in elections, for example, those who advocate a return to hand-counted paper ballots, with no computers used for ballot casting or counting. They do indeed say “never” when it comes to using the Internet for voting, and to e-voting as well. But that’s because of personal beliefs and policy positions, not because of a professionally informed judgment that hard problems in computer science can never be solved. In fact, these anti-tech folks are at the other end of the spectrum from those who favor i-voting at any cost and caricature nay-sayers of any kind; both camps use out-of-context quotes about current i-voting drawbacks as a way to shift the conversation to the proposition of “Internet voting, no way, not ever” and away from the more important but nuanced question: Internet voting, not whether, but how?

NBC News, Voting Machines, and a Grandmother’s PC

 

I’d like to explain more precisely what I meant by “your grandmother’s PC” in NBC Bay Area’s TV report on election technology. Several people thought I was referring to the voting machines themselves being easily hacked by anyone with physical access, because, despite appearances:

Voting machines are like regular old PCs inside, and like any old PC …

  • … it will be happy to run any program you tell it to, where:
  • “You” is anyone that can touch the computer, even briefly, and
  • “Program” is anything at all, including malicious software specially created to compromise the voting machine.

That’s all true, of course, as many of us have seen recently in cute yet fear-mongering little videos about how to “hack an election.” However, I was referring to something different and probably more important: a regular old PC running some pretty basic Windows XP application software that an election official installed on the PC in the ordinary way, and uses the same way as anything else.

That’s your “grandmother’s PC,” or in my son’s case, something old and clunky that looks exactly like the PC his grandfather had a decade-plus ago – minus some hardware upgrades and software patches that were great for my father, but that for voting systems are illegal.

But why is that PC “super important”? Because the software in question is the brains behind that entire fleet of voting machines: a one-stop shop to hack all the voting machines, or just to fiddle vote totals after all those carefully and securely operated voting machines come home from the polling places. It’s an “election management system” (EMS) that election officials use to create the data that tells the voting machines what to do, and to combine the vote tally data into the actual election results.

That’s super important.

Nothing wrong with the EMS software itself, except for the very poor choice of creating it to run on a PC platform that by law is locked in time as it was a decade or so ago, and that has no meaningful self-defenses in today’s threat environment. As I said, it wasn’t a thoughtful choice – nobody said it would be a good idea to run this really important software on something as easily hacked as anyone’s grandparent’s PC. But it was a pragmatic choice at the time, in the rush of the post-hanging-chads, Federally funded voting system replacement derby. We are still stuck with the consequences.

It reminds me of that great old radio show, Hitchhiker’s Guide to the Galaxy, where after stealing what seems like the greatest ship in the galaxy, the starship Heart of Gold, our heroes are stuck in space-time with Eddie Your Ship-Board Computer, “ready to get a bundle of kicks from any program you care to run through me.” The problem, of course, is that while designed to do an improbably large number of useful things, it’s not able to do one very important thing: steer the ship after being asked to run a program to learn why tea tastes good.

Election management systems, voting machines, and other parts of a voting system each have one very important job to do, and should not be able to do anything else. It’s not hard to build systems that way, but that’s not what’s available from today’s three vendors in the for-profit market for voting systems and the services to operate them for elections officials. We can fix that, and we are.

But it’s the election officials, many, many of them public servants with a heart of gold, who should really be highlighted. They are making do with what they have, putting in enormous extra effort to protect these vulnerable systems and run an election that we all can trust. They deserve better; we all deserve better: election technology that’s built for elections that are Verifiable, Accurate, Secure, and Transparent (VAST, as we like to say). The “better” is in the works, here at the OSET Institute and elsewhere, but there is one more key point.

Don’t be demoralized by the fear, uncertainty, and doubt about hacking elections. Vote. These hardworking public servants are running the election for each of us, doing their best with what they have. Make it worth something. Vote, and believe what is true: you are an essential part of the process that makes our democracy truly a democracy.

— John Sebes

Election Standards, Day 2

The second day of the annual meeting of the Voting System Standards Committee was Friday, Feb. 6. I’m concluding my reporting on the meeting with a round-up of existing and proposed standards activity that we discussed on day 2. Each item below is about an existing working group, a group in formation, or a proposed group.

Election Process Modeling

This working group isn’t making a standard, but rather a guideline: a semi-formal model for the typical or common processes for U.S. election administration and election operations. The intent here is to document, in a structured manner, the various use cases where there needs to be data transfer from one system, component, or process to another. That will make it much easier to identify and prioritize such cases where data interoperability standards may be needed; from there, folks may choose to form a working group to address some of these needs.

This group is well along under the leadership of LA’s Kenneth Bennett, but still it’s a work in progress. I’ll be reporting more as we go along.

Digital Poll Books

This just-formed working group, led by Ohio’s John Dziurlaj, will develop a standard data format for digital poll books. The starting point is to define a format that can accommodate data interchange between a voter registration system and a digital pollbook — for example, a list of voters, each one with a name, address, and voter status (e.g. in-person vs. absentee voter). The reverse flow — all that plus a note of whether each voter checked in to vote, when, etc. — is included as well of course, but there are some other subtler issues. For example, it’s not enough to simply provide the data in that reverse flow; you need to also include data that ensures that the check-in records are from a legitimate source, and not modified. Without that, systems would be vulnerable to tampering that causes some ballots to be counted that shouldn’t be, and vice versa. Also, not every pollbook does its job based on purely local pollbook records. Some rely on a callback to a central system that coordinates information flow among lots of digital pollbooks, and there are several hybrid models as well.

Also, there are privacy issues. In the paper world, every pollbook record was legally a public document, without including what we would now call “personal identifying information” (PII). More recently, with strong voter ID requirements, a voter check-in needs to include a comparison of a presented ID number (such as a driver’s license number) with the ID number that’s part of a voter’s registration record. Today, such ID numbers are often included in e-pollbook data, but that’s not ideal because each e-pollbook becomes a trove of PII at risk. In the upcoming data standards work, we may be able to include some optional privacy guards, like a way to store PII in cryptographically hashed form, to protect privacy but still enable a valid equivalence check — just the same way that stored-password systems do.
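
To make that concrete, here is a minimal Python sketch, purely illustrative and not drawn from any VSSC draft, of how an e-pollbook record could carry a salted hash of a driver’s license number and still support the check-in comparison; the field names are hypothetical.

```python
import hashlib
import hmac
import os

def hash_id(id_number: str, salt: bytes) -> str:
    """Salted SHA-256 digest of an ID number (e.g. a driver's license number)."""
    return hashlib.sha256(salt + id_number.encode("utf-8")).hexdigest()

# Registration time: the e-pollbook record carries the hash and salt, never the raw ID.
salt = os.urandom(16)
voter_record = {
    "name": "Pat Example",      # hypothetical voter and field names
    "status": "in-person",
    "id_salt": salt.hex(),
    "id_hash": hash_id("D1234567", salt),
}

def check_in_matches(presented_id: str, record: dict) -> bool:
    """Hash the ID presented at check-in the same way and compare digests."""
    candidate = hash_id(presented_id, bytes.fromhex(record["id_salt"]))
    return hmac.compare_digest(candidate, record["id_hash"])

print(check_in_matches("D1234567", voter_record))  # True: same ID as registered
print(check_in_matches("D7654321", voter_record))  # False: different ID
```

The pollbook never needs to store the license number itself, yet a poll worker can still confirm that the presented ID matches the registration record.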

Voting Methods Models

This newer group, led by Laura Massa-Lochridge, is also creating not a standard but a guideline to be used as a “standard” reference for other work. In this group, the focus is on the various individual approaches to voting on a ballot item and counting the votes. A familiar one is “vote for one,” where the candidate with the most votes is the winner. Also familiar and well understood is “vote for N out of M.” Further, each of these has different semantics; for example, some vote-for-one contests have no winner when no candidate reaches a threshold, thus triggering a run-off. Familiar to some, but not so well understood, is “instant run-off.” In fact there are different flavors of IRV, and in some cases it is not actually obvious which one is wanted or used in a particular jurisdiction. From there we get into heavy-duty election geek-dom with ranked choice voting and single transferable vote.

The goal of this working group is to develop a formal mathematical model specifying precisely what’s meant by each variation of each voting method, with consensus from all who choose to participate, and with process, oversight, and validation from an official international standards body. The result should be a great reference for elections officials and legislators to refer to, instead of (as is common now) simply referring to a voting method by name, or writing a counting algorithm into law.
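
To give a flavor of why that precision matters, here is a tiny, purely illustrative Python sketch of one common instant-runoff rule: eliminate the single lowest first-choice candidate each round until someone holds a majority of continuing ballots. Other flavors differ in elimination order, tie-breaking, and treatment of exhausted ballots, which is exactly what a formal model would pin down.

```python
from collections import Counter

def instant_runoff(ballots):
    """One IRV variant: repeatedly eliminate the candidate with the fewest
    first-choice votes until someone holds a majority of continuing ballots."""
    ballots = [list(b) for b in ballots]
    while True:
        tallies = Counter(b[0] for b in ballots if b)  # exhausted ballots drop out
        total = sum(tallies.values())
        leader, votes = tallies.most_common(1)[0]
        if votes * 2 > total:
            return leader
        loser = min(tallies, key=tallies.get)  # tie-breaking rule left unspecified!
        ballots = [[c for c in b if c != loser] for b in ballots]

# Five ranked ballots over candidates A, B, C; C is eliminated, then B wins.
print(instant_runoff([["A", "B"], ["B", "A"], ["C", "B"], ["B", "C"], ["A", "C"]]))
```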

Voting Machine Event Logs

This standard is nearly done, thanks to the leadership of NIST’s John Wack. It deserves more than a bit of explanation because it is a great example of both how the standards work, and the value of standard open data.

Every voting system has components for casting or counting ballots, and U.S. requirements for them include doing some logging of events that would provide researchers with data to analyze in order to assess how well the components operate, how effectively voters are able to use them, and so forth. Every product does some kind of logging, but each one’s log data is in a different, proprietary format. So, the VSSC has a data standard for log data, to enable vendors to provide logs in a common format that log analysis tools can use to combine and collate data from various systems.

So far, not the most thrilling part of standards work, but necessary to ensure that techies can understand what’s going wrong — or right — with real systems in operation. As many are aware, the current crop of voting system products do seem to misbehave during elections, and it’s important for tech assessment to learn whether there really were any faults (as opposed to operator error) and if so what. However, the curious part of this standard is not that it provides a standard format for data common to pretty much every system (that’s why we call them common data formats!) like date/time, event code, event description, etc. Rather, the curious part is that it doesn’t try to provide a complete enumeration of all common events. Sure, most systems have an event that means “Voter cast the ballot” or “Completed scanning a ballot,” but one vendor may call this “event 37” and another “event 29.”

Why not enumerate these in the standard? Well, for one thing, it is hard to get a complete list, and as systems add more logging capabilities over time, the list grows. We want to issue the standard now, and don’t want to bake into it an incomplete list. (Once a standard is issued, it is some work to update it, and typically a standards group would prefer to use their efforts to standardize new stuff rather than revise old standards.) So the approach taken is different. It’s typical of many of the standards we’re working on, which is why I want to explain it for this standard. The approach is to have a particular part of the data format that’s expected to be filled by an event identifier that could be one of a canonical list defined elsewhere. It’s like the standard is saying “this ID field is just a string, but systems can choose to fill it with a string that’s from some canonical list that’s beyond the scope of this standard.” Also, the data format allows for a sort of glossary to be part of a dataset, to enable a dataset to essentially say “you’re going to see a bunch of event 37’s and in my lingo that means the voter cast a ballot.”
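
Here is a minimal sketch of what such a dataset might look like, built in Python purely for concreteness; the field names are made up for this example and will differ from the actual VSSC log format. The glossary maps a vendor-specific event ID to plain language, and each entry carries the common fields.

```python
import json
from datetime import datetime, timezone

# Hypothetical dataset; field names are illustrative, not from the VSSC standard.
event_log = {
    "device": "precinct-scanner-07",
    # Glossary: "in my lingo, event 37 means the voter cast a ballot."
    "event_glossary": {
        "37": "Voter cast the ballot",
        "12": "Paper jam cleared by poll worker",
    },
    "events": [
        {
            "timestamp": datetime(2015, 11, 3, 7, 2, 11, tzinfo=timezone.utc).isoformat(),
            "event_id": "37",   # just a string; a canonical list could live elsewhere
            "description": "Ballot accepted and counted",
        },
        {
            "timestamp": datetime(2015, 11, 3, 9, 45, 0, tzinfo=timezone.utc).isoformat(),
            "event_id": "12",
            "description": "Jam cleared at scanner input tray",
        },
    ],
}

print(json.dumps(event_log, indent=2))
```

An analysis tool that knows nothing about this particular scanner can still collate its events with another vendor’s, using the glossary to translate vendor lingo into common terms.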

The intent of course is that systems that conform to the standard will also choose to use this canonical list, which can grow over time, without requiring modifications to the standard. That’s nice, but it begs the question: who maintains this list, and how does the maintainer allow people to submit additions to it? Good question. No answer yet, but that’s not a barrier to using the standard, and the early adopters will in essence start the list and figure out who is going to manage it.

Event Logs for Voter Records

This is a topic for a to-be-formed working group, focused on issues very similar to those of the Event Log group described above, but for events of voter records systems. The types of events we’re talking about here are things like: voter registration request rejected (including why); voter’s address change accepted; voter’s absentee ballot accepted for counting; voter’s provisional ballot rejected (and why); voter checked in to vote in person; and so on. The format will likely be pretty similar to the other event log format, and much of the discussion will be similar to the above groups’: whether there is a complete enumeration of actions or objects; whether to rely on external canonical lists; and how not to expose PII, yet allow a record to uniquely identify the voter in question (so that we can recognize when multiple events were about the same voter).
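
One illustrative way to square those last two goals (our sketch, not a working-group decision) is to carry a keyed hash of the voter’s registration ID instead of the ID itself, so that events about the same voter can be linked without exposing PII. A minimal Python sketch, with hypothetical field names:

```python
import hashlib
import hmac

# A secret key held by the election office, never published with the dataset.
LINKING_KEY = b"hypothetical-secret-key"

def voter_ref(registration_id: str) -> str:
    """Keyed hash of a registration ID: stable enough to link events about
    one voter, but not reversible by consumers of the published log."""
    return hmac.new(LINKING_KEY, registration_id.encode("utf-8"), hashlib.sha256).hexdigest()

events = [
    {"voter": voter_ref("VA-000123"), "event": "absentee ballot accepted for counting"},
    {"voter": voter_ref("VA-000123"), "event": "address change accepted"},
    {"voter": voter_ref("VA-000456"), "event": "provisional ballot rejected", "reason": "not registered"},
]

# The same voter_ref value on the first two events shows they concern one voter,
# without revealing who that voter is.
print(events[0]["voter"] == events[1]["voter"])  # True
```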

What types of interoperability would this support? Automated reporting, and data mining in general — again, a larger issue — but one example is that it would support automated reporting that compares military voters to other voters in terms of voting outcomes: numbers and percentages of voters who voted absentee vs. in person, absentee voters whose ballots were counted vs. rejected, and if rejected, why …

This type of reporting is already required of localities and states by the Federal government, and it is currently very burdensome for many election officials to create. As a result, one of the enthusiastic supporters of this nascent effort is a recently appointed EAC Commissioner who until recently was a state election official, grumpy over the burden of this type of reporting, but who is now on the Federal commission requiring it. So you can see that, although not the most thrilling human endeavor, standards work can have its elements of irony. 🙂

Cast Vote Records 

Another topic for a to-be-formed working group, this is about how to extend the existing .2 standard (election result reporting) to describe just the votes recorded from a single ballot, together with other data (for example an image of a paper ballot from which the votes were recorded) that would be needed to support ballot audits. The whole larger issue of ballot audits is … larger; but you can read more about it in past posts here (search for “audit”) or elsewhere by a web search on “risk limiting audits elections.”
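
For a sense of what a cast vote record might carry, here is a purely illustrative Python sketch; the field names are invented for this example and are not the eventual .2 extension elements.

```python
# Illustrative cast vote record: the votes from one ballot plus a pointer to
# the ballot image that an auditor would compare against the paper.
cast_vote_record = {
    "ballot_id": "batch-12-sheet-0042",
    "ballot_image": "images/batch-12/sheet-0042.png",
    "contests": [
        {"contest": "Governor", "votes": ["Candidate A"]},
        {"contest": "Measure Q", "votes": ["Yes"]},
        {"contest": "School Board (vote for 2)", "votes": ["Candidate C", "Candidate E"]},
    ],
}

# A risk-limiting audit samples such records and compares each one
# to a human reading of the corresponding paper ballot.
for contest in cast_vote_record["contests"]:
    print(contest["contest"], "->", ", ".join(contest["votes"]))
```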

Ballot Specifications

Another topic for a to-be-formed working group, this is about how to extend the existing .2 standard’s description of ballot items (contests, candidates, referenda, questions, etc.), which is currently limited to what’s needed for results reporting. The extensions could be limited to what’s needed to display online sample ballots, but could go much further. Some of us have a particular interest in supporting “interactive sample ballots,” which is, again, a larger issue, but more on that as the work unfolds.

Common Identifiers and OCD

Lastly, we also discussed more of the common identifier issue that I reported on from day one. It turns out that this is another instance, though slightly more complicated, of the issue facing a number of standards that I described above: semantic interoperability. In the .2 standard, we don’t want to bake in an incomplete list of every possible district, office, precinct, etc. — even though we need common identifiers for these if two datasets are to be interpreted as referring to the same things.

So, again, we have the issue of a separate canonical list. However, in this case the space is huge, the names (unlike event types) wouldn’t be self-identifying, and the things named could have multiple valid names. So there will no doubt be large directories of information about these political units, using common naming schemes. But to keep these from becoming a large muddle, we first have a smaller problem to solve: smaller canonical lists, for example, a list of the names of all the types of district used in each state. With that, we could use existing naming schemes in a canonical way.

The most promising naming scheme (by consensus of those working on standards, anyway) is that of the Open Civic Data project, which includes IDs of exactly this sort. The scope for OCD-IDs is broad: defining a handle for pretty much any government entity in any country, so that various organizations that have data on those entities can publish that data using a common identifier, enabling others to aggregate the data about those entities. It’s much broader than U.S. electoral districts, but it’s already in use, including for U.S. electoral districts. However, as I described above, the fly in the ointment is that plethora of types of electoral district; for a common unique name, you need to include the type of district, for example, the fire control district in CA’s San Mateo County that’s known as Fire District #3.
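
To make the district-type point concrete, here is a small illustrative Python sketch of how an OCD-style identifier might be constructed once a canonical list of district types exists; the fire_control type segment and the exact path layout are hypothetical, not an adopted part of the OCD scheme.

```python
def district_id(state: str, county: str, district_type: str, name: str) -> str:
    """Build an OCD-style identifier using a canonical district-type segment.
    The path layout below is illustrative, not an adopted naming scheme."""
    def slug(text: str) -> str:
        return text.lower().replace(" ", "_").replace("#", "")
    return (
        f"ocd-division/country:us/state:{slug(state)}"
        f"/county:{slug(county)}/{slug(district_type)}:{slug(name)}"
    )

# Without the type segment, "district 3" alone would be ambiguous: a school
# district 3, a water district 3, and a fire district 3 could all collide.
print(district_id("CA", "San Mateo", "fire_control", "Fire District #3"))
# ocd-division/country:us/state:ca/county:san_mateo/fire_control:fire_district_3
```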

OK, so what, who, how will this registry, or directory, or curated list — whatever you might call it — get created and managed? Still a good question, but at least we have some clarity on what needs to be done, and maybe a bit of the how, as well. Stay tuned.

If we had this missing link (a canonical scheme for names of U.S. electoral districts), then we could use OCD-IDs (or extensions of FIPS geo codes, for that matter) as an optional but commonly used and standards-based approach for constructing unique identifiers for electoral districts. Organizations that choose to use the naming scheme could issue VSSC.2 datasets that could be aggregated with others that also use the scheme. And then, people could have a much easier time aggregating those election result datasets to get large-scale election results. At the risk of foreshadowing, that’s actually a big deal to data-heads, public interest groups, and news organizations alike, as eloquently explained by a speaker at the next annual conference, which was this week in DC.

Coming Soon

That conference — the NIST/EAC Future of Voting Systems Symposium II — will be the topic of my next few reports!

— EJS

 

“Digital Voting”—Don’t believe everything you think

In our most recent blog post we examined David Plouffe’s recent Wall Street Journal forward-looking op-ed [paywall] and rebalanced his vision with some practical reality.

Now, let’s turn to Plouffe’s notion of “digital voting.”  Honestly, that phrase is confusing and vague.  We should know: it catalyzed our name change last year from Open Source Digital Voting Foundation (OSDV) to Open Source Election Technology Foundation (OSET).

Most Americans already use a “digital” machine to cast their ballots, if you mean by “digital” a computer-like device that counts votes electronically, and not by the old pre-2000 methods of punched cards or mechanical levers. What Plouffe probably meant is what elections professionals call iVoting, which is voting via the Internet—and increasingly that implies your mobile device.

Internet voting has not been approved anywhere in the United States for general public use, although Alaska is experimenting in a limited way with members of the military voting in this manner. Norway just stopped its Internet voting experiment. The challenges of iVoting are daunting.

Just think about it: many credit-card companies and several major online merchandisers have been hacked at some point, and all commercial and government web sites face intrusion attempts by the hour. The Department of Defense is continually bombarded by efforts to break in. And sometimes hackers manage to actually get in and steal stuff. Voting is too important to let it be vulnerable to hacking.

Security of online voting is not yet with us. Sure, a few vendors of online voting technologies will emphatically claim their systems have never been hacked (to their knowledge) and that they use so-called “military grade” security (whatever that actually means).  Members of our technical team have been deeply involved in cyber-security for decades. We can say with confidence that no security on the Internet is absolute, assured, or guaranteed.  So when it comes to moving cast ballots via the Internet, the security issues are real and cannot be hand-waved away.  And elections that are run, in any part, over the public Internet pose just too tempting an opportunity for some predator looking to disrupt or even derail a U.S. election.

But, that doesn’t mean elections technologies can’t be improved or be made more digital, and thereby more verifiable, more accurate, and more transparent. That’s exactly what the TrustTheVote Project is all about.

The open-source software and standards that we are developing and advocating will make online voter registration, digital poll books (used to check you in at your polling place) and (ultimately) casting and counting ballots better, faster, and more auditable.  And our software is designed to run on ordinary computer hardware – whether that is a tablet, a scanner, or laptop computer.  Adopting the TrustTheVote Project technology means there will no longer be a requirement for election administrators to acquire expensive, proprietary software or hardware with long-term costly service contracts.

Importantly, we believe there are many parts of elections administration that can benefit from digital innovations, which may or may not use the Internet in some way.  And we’re focusing on delivering those innovations.

However, for the foreseeable future, ballot casting and counting can be dramatically improved without needing to involve the Internet.

So, we should be cautious about the phrase “digital voting” in an age when all things digital tend to imply “Internet.”

All that observed, we really like how Plouffe ended his recent Wall Street Journal op-ed: “There are disrupters in every industry… the good ones won’t just apply the best practices of the private sector, but will also innovate and create on their own to meet their unique needs.”

The TrustTheVote Project intends to be one of those disrupters.  We add one tiny nuance: in our case, those “unique needs” are primarily those of our stakeholders—the state, county and city officials who run our elections. We won’t be running elections, they will, but we are thinking as far outside of the typical ballot box as we can when looking for opportunities to make voting easy, convenient, and ideally, a delight.

David Plouffe’s View of the Future of Voting — We Agree and Disagree

David Plouffe, President Obama’s top political and campaign strategist and the mastermind behind the winning 2008 and 2012 campaigns, wrote a forward-looking op-ed [paywall] in the Wall Street Journal recently about the politics of the future and how they might look.

He touched on how technology will continue to change the way campaigns are conducted – more use of mobile devices, even holograms, and more micro-targeting at individuals. But he also mentioned how people might cast their votes in the future, and that is what caught our eye here at the TrustTheVote Project.

Here’s what Plouffe wrote: “More states will inevitably move to online voter registration and perhaps digital voting. There will be resistance…but our voting system won’t remain disconnected forever from the way we are leading the rest of our lives.”

His last statement – that the voting system will come to resemble more our mobile-device-dependent world – is probably true in the long run.  But it’s going to take time, probably more time than we all would like.  Even though we can bank, buy coffee, and get a boarding pass for an airplane via our smart phones, voting by smart phone is more complicated—hugely more complicated.

When you’re banking online, the financial institution has to be able, absolutely, to identify and verify it is you who authorized (or didn’t authorize) a particular transaction (such as a purchase with your bank card at Amazon.com).  But in the world of elections, the election administrator has to be sure, absolutely, that they can never identify you as the person who cast a particular ballot. It’s completely opposite of online banking because of the sacred assurances of voter anonymity and the secret ballot.

Sure, elections officials should verify you as the individual who is checking in to cast a ballot, but once you have been authenticated, the connection with a particular ballot must cease to exist.  And doing that by your smart phone (or any other digital device connected to the Internet) is beyond non-trivial; it’s downright near impossible.

So, there’s a privacy and technology challenge there.  In other words, we need security of the ballot, but we also need privacy of the voter.  And in the digital world there is an opposite (we call it “inverse”) relationship between security and privacy.

Think about an airport and TSA check points.  If you want absolute privacy, you should be able to walk straight to your gate uninhibited.  If you want absolute security, you should not be able to do so until everything about you has been identified and verified as that exact person with an authorized ticket  to board a plane.

If you think about how awful it would be if your online bank account got hacked, imagine if your state’s online voting system was compromised. Not only could the result be suspect, the fact that an election was hacked would undermine voters’ confidence in our democracy.

So smartphone voting might be a ways off. But in the here and now and very near future, the TrustTheVote Project is already delivering on some of Plouffe’s other visions.

Online voter registration, for example, is already being implemented in many states and through third party organizations. The TrustTheVote Project helped Rock The Vote develop its “Rocky” core software, which operates that group’s nationwide online registration. TrustTheVote helped Virginia implement its online voter registration and our technology powers the search part of the Virginia site, which lets you know if you’re already registered, are at the right polling place, and that your address is up to date. This was all developed with TrustTheVote Project open-source technology that all states and localities can adopt and adapt.

And we’re underway on other innovations—like apps to help you figure out the best time to go to your polling place, and apps to help you “check in” to vote, just like the ones you use to download and print a boarding pass for your flight.

So, to David Plouffe: yes, elections and campaigns will change in the future.  But that change will come step by step, not by a big bang of smartphone voting.

Money Shot: What Does a $40M Bet on Scytl Mean?

…not much we think.

Yesterday’s news of Microsoft co-founder billionaire Paul Allen’s investing $40M in the Spanish election technology company Scytl is validation that elections remain a backwater of innovation in the digital age.

But it is not validation that there is a viable commercial market for voting systems of the size typically attracting venture capitalists; the market is dysfunctional and small and governments continue to be without budget.

And building a user-friendly, secure online voting system that simultaneously protects the anonymity of the ballot is an interesting problem that only an investor of the stature of Mr. Allen can tackle.

We think this illuminates a larger question:

To what extent should the core technology of the most vital aspect of our Democracy be proprietary and black box, rather than publicly owned and transparent?

To us, that is a threshold public policy question, commercial investment viability issues notwithstanding.

To be sure, it is encouraging to see Vulcan Capital and a visionary like Paul Allen invest in voting technology. The challenges facing a successful elections ecosystem are complex and evolving and we will need the collective genius of the tech industry’s brightest to deliver fundamental innovation.

We at the TrustTheVote Project believe voting is a vital component of our nation’s democracy infrastructure and that American voters expect and deserve a voting experience that’s verifiable, accurate, secure and transparent.  Will Scytl be the way to do so?

The Main Thing

The one thing that stood out to us in the various articles on the investment was Scytl’s comments and assertions about their security, with international patents on cryptographic protocols.  We’ve been around the INFOSEC space for a long time and know a lot of really smart people in the crypto field.  So, we’re curious to learn more about their IP innovations.  And yet that assertion is actually a red herring to us.

Here’s the main thing: transacting ballots over the public packet-switched network is not simply about security.  It’s also about privacy; that is, the secrecy of the ballot.  Here is an immutable maxim about the digital world of security and privacy: there is an inverse relationship, which holds that as security is increased, privacy must be decreased, and vice versa.  Just consider any airport security experience.  If you want maximum security then you must surrender a bunch of privacy.  This is the main challenge of transacting ballots across the Internet, and why that transaction is so very different from banking online or looking at your medical record.

And then there is the entire issue of infrastructure.  We continue to harp on this, and still wait for a good answer.  If by their own admissions, the Department of Defense, Google, Target, and dozens of others have challenges securifying their own data centers, how exactly can we be certain that a vendor on a cloud-based service model or an in-house data center of a county or State has any better chance of doing so? Security is an arms race.  Consider the news today about Heartbleed alone.

Oh, and please, for the sake of credibility, can the marketing machinery stop using the phrase “military grade security”?  There is no such thing.  And it has nothing to do with bumping up key sizes, say from standard 128-bit symmetric encryption keys to 256 bits, or from 1024-bit RSA keys to 2048 or 4096.  Standard key lengths are fine, and there is nothing military about them (other than that the military uses them too).  Here is an interesting article from some years ago on the sufficiency of current crypto and the related marketing arms race.  Saying “military grade” is meaningless hype.  Besides, the security issues run far beyond the transit of data between machines.

In short, there is much the public should demand to understand from anyone’s security assertions, international patents notwithstanding.  And that goes for us too.

The Bottom Line

While we laud Mr. Allen’s investment in what surely is an interesting problem, no one should think for a moment that this signals some sort of commercial viability or tremendous growth market opportunity.  Nor should anyone assume that throwing money at a problem will necessarily fix it (or deliver us from the backwaters of Government elections I.T.).  Nor should we assume that this somehow validates Scytl’s “model” for “security.”

Perhaps more importantly, while we need lots of attention, research, development and experimentation, the bottom line to us is whether the outcome should be a commercial proprietary black-box result or an open transparent publicly owned result… where the “result” as used here refers to the core technology of casting and counting ballots, and not the viable and necessary commercial business of delivering, deploying and servicing that technology.

A Northern Exposed iVoting Adventure

Alaska’s extension to its iVoting venture may have raised the interest of at least one journalist at one highly visible publication.  When we were asked for our “take” on this form of iVoting, we thought that we should also comment here on this “northern exposed adventure” (apologies to fans of the mid-90s wacky TV series of a similar name).

Alaska has been among the states that allow military and overseas voters to return marked absentee ballots digitally, starting with fax, then eMail, and then adding a web upload as a 3rd option.  Focusing specifically on the web-upload option, the question was: “How is Alaska doing this, and how do their efforts square with common concerns about security, accessibility, Federal standards, testing, certification, and accreditation?”

In most cases, any voting system has to run that whole gauntlet through to accreditation by a state, in order for the voting system to be used in that state. To date, none of the iVoting products have even tried to run that gauntlet.

So, what Alaska is doing, with respect to security, certification, and host of other things is essentially: flying solo.

Their system has not gone through any certification program (State, Federal, or otherwise, that we can tell); hasn’t been tested by an accredited voting system test lab; and nobody knows how it does or doesn’t meet federal requirements for security, accessibility, and other (voluntary) specifications and guidelines for voting systems.

In Alaska, they’ve “rolled their own” system.  It’s their right as a State to do so.

In Alaska, military voters have several options, and only one of them is the ability to go to a web site, indicate their choices for vote, and have their votes recorded electronically — no actual paper ballot involved, no absentee ballot affidavit or signature needed. In contrast to the sign/scan/email method of return of absentee ballot and affidavit (used in Alaska and 20 other states), this is straight-up iVoting.

So what does their experience say about all the often-quoted challenges of iVoting?  Well, of course in Alaska those challenges apply the same as anywhere else, and they are facing them all:

  1. insider threats;
  2. outsider hacking threats;
  3. physical security;
  4. personnel security; and
  5. data integrity (including that of the keys that underlie any use of cryptography)

In short, the Alaska iVoting solution faces all the challenges of digital banking and online commerce that every financial services industry titan and eCommerce giant spends big $ on every year (capital and expense), and yet still routinely suffers attacks and breaches.

Compared to those technology titans of industry (Banking, Finance, Technology services, or even the Department of Defense), how well are Alaskan election administrators doing on their (by comparison) shoestring budget?

Good question.  It’s not subject to annual review (like banks’ IT operations audits for SAS-70), so we don’t know.  That also is their right as a U.S. state.  However, the fact that we don’t know does not debunk any of the common claims about these challenges.  Rather, it simply says that in Alaska they took on the challenges (which are large) and the general public doesn’t know much about how they’re doing.

To get a feeling for the risks involved, just consider one point: think about the handful of IT geeks who manage the iVoting servers where the votes are recorded and stored as bits on a disk.  They are not election officials, and they are no more entitled to stick their hands into paper ballot boxes than anybody else outside a county elections office.  Yet, they have the ability (though not the authorization) to access those bits.

  • Who are they?
  • Does anybody really oversee their actions?
  • Do they have remote access to the voting servers from anywhere on the planet?
  • Using passwords that could be guessed?
  • Who knows?

They’re probably competent, responsible people, but we don’t know.  Not knowing any of that, every vote on those voting servers is actually a question mark — and that’s simply being intellectually honest.

Lastly, to get a feeling for the possible significance of this lack of knowledge, consider a situation in which Alaska’s electoral college votes swing an election, or where Alaska’s Senate race swings control of Congress (not far-fetched given Murkowski‘s close call back in 2010.)

When the margin of victory in Alaska, for an election result that affects the entire nation, is a low 4-digit number of votes, and the number of digital votes cast is similar, what does that mean?

It’s quite possible that that many digital votes could be cast in the next Alaska Senate race.  If the contest is that close again, think about the scrutiny those IT folks will get.  Will they be evaluated any better than every banking data center investigated after a data breach?  Any better than Target?  Any better than Google or Adobe’s IT management after having trade secrets stolen?  Or any better than the operators of military unclassified systems that for years were penetrated through intrusion from hackers located in China who may likely have been supported by the Chinese Army or Intelligence groups?

Probably not.

Instead, they’ll be lucky (we hope), like the Estonian iVoting administrators when the OSCE visited back in 2011 to have a look at the Estonian system.  Things didn’t go so well.  The OSCE found that one guy could have undermined the whole system.  Good news: it didn’t happen.  Cold comfort: that one guy didn’t seem to have the opportunity — most likely because he and his colleagues were busier than a one-armed paper hanger during the election, worrying about Russian hackers attacking again after they had previously shut down the whole country’s Internet-connected government systems.

But so far, the current threat is remote, and it is still early days even for small scale usage of Alaska’s iVoting option.  But while the threat is still remote, it might be good for the public to see some more about what’s “under the hood” and who’s in charge of the engine — that would be our idea of more transparency.

<soapbox>

Wandering off the Main Point for a Few Paragraphs
So, in closing I’m going to run the risk of being a little preachy here (signaled by that faux HTML tag above); again, probably due to the surge in media inquiries recently about how the Millennial generation intends to cast their ballots one day.  Lock and load.

I (and all of us here) are all for advancing the hallmarks of the Millennial mandates of the digital age: ease and convenience.  I am also keenly aware there are wing-nuts looking for their Andy Warhol moment.  And whether enticed by some anarchist rhetoric, their own reality distortion field, or most insidious: the evangelism of a terrorist agenda (domestic or foreign) …said wing nut(s) — perhaps just for grins and giggles — might see an opportunity to derail an election (see my point above about a close race that swings control of Congress or worse).

Here’s the deep concern: I’m one of those who believes that the horrific attacks of 9.11 had little to do with body count or the implosions of western icons of financial might.  The real underlying agenda was to determine whether it might be possible to cause a temblor of sufficient magnitude to take world financial markets seriously off-line, and whether doing so might cause a rippling effect of chaos in world markets, and what disruption and destruction that might wreak.  If we believe that, then consider the opportunity for disruption of the operational continuity of our democracy.

It’s not that we are Internet haters: we’re not — several of us came from Netscape and other technology companies that helped pioneer the commercialization of that amazing government and academic experiment we call the Internet.  It’s just that THIS Internet and its current architecture simply were not designed to be inherently secure or to ensure anyone’s absolute privacy (and strengthening one necessarily means weakening the other).

So, while we’re all focused on ease and convenience, and we live in an increasingly distributed democracy, and the Internet cloud is darkening the doorstep of literally every aspect of society (and now government too), great care must be taken as legislatures rush to enact new laws and regulations to enable studies, or build so-called pilots, or simply advance the Millennial agenda to make voting a smartphone experience.  We must be very careful and considerably vigilant, because it’s not beyond the realm of reality that some wing-nut is watching, cracking their knuckles in front of their screen and keyboard, mumbling, “Oh please. Oh please.”

Alaska has the right to venture down its own path in the northern territory, but in doing so it exposes an attack surface.  They need not (indeed, cannot) see this enemy from their back porch (I really can’t say of others).  But just because the enemy cannot be identified at the moment doesn’t mean it isn’t there.

</soapbox>

One other small point:  As a research and education non-profit, we’re asked why we shouldn’t be “working on making Internet voting possible.”  Answer: Perhaps in due time.  We do believe that, on the horizon, responsible research must be undertaken to determine how we can offer an additional, digital means of casting a ballot alongside the absentee and polling place experiences.  And that “digital means” might be over the public packet-switched network.  Or maybe some other type of network.  We’ll get there.  But candidly, our charge for the next couple of years is to update an outdated architecture of existing voting machinery and elections systems and bring about substantial, but still incremental, innovation that jurisdictions can afford to adopt, adapt, and deploy.  We’re taking one thing at a time and first things first; or as our former CEO at Netscape used to say, we’re going to “keep the main thing, the main thing.”

Onward
GAM|out

PCEA Report Finally Out: The Real Opportunity for Innovation Inside

This week the PCEA finally released its long-awaited report to the President.  It’s loaded with good recommendations.  Over the next several days or posts we’ll give you our take on some of them.  For the moment, we want to call your attention to a couple of underpinning elements now that it’s done.

The Resource Behind the Resources

Early in the formation of what initially was referred to as the “Bauer-Ginsberg Commission” we were asked to visit the co-chairs in Washington D.C. to chat about technology experts and resources.  We have a Board member who knows them both and when asked we were honored to respond.

Early on we advised the Co-Chairs that their research would be incomplete without speaking with several election technology experts, and of course they agreed.  The question was how to create a means to do so and not bog down the progress governed by layers of necessary administrative regulations.

I take a paragraph here to observe that I was very impressed in our initial meeting with Bob Bauer and Ben Ginsberg.  Despite being polar political opposites they demonstrated how Washington should work: they were respectful, collegial, sought compromise to advance the common agenda and seemed to be intent on checking politics at the door in order to get work done.  It was refreshing and restored my faith that somewhere in the District there remains a potential for government to actually work for the people.  I digress.

We advised them that looking to the CalTech-MIT Voting Project would definitely be one resource they could benefit from having.

We offered our own organization, but with our tax exempt status still pending, it would be difficult politically and otherwise to rely on us much in a visible manner.

So the Chairs asked us if we could pull together a list — not an official subcommittee, mind you, but a list of the top “go to” minds in the elections technology domain.  We agreed and began a several-week process of vetting a list that needed to be winnowed down to about 20 for manageability.  These experts would be brought in individually as desired, or collectively — it was to be figured out later which would be most administratively expedient.  Several of our readers, supporters, and those who know us were aware of this confidential effort.  The challenge was lack of time to run the entire process of public recruiting and selection.  So, they asked us to help expedite that, having determined we could gather the best in short order.

And that was fine because anyone was entitled to contact the Commission, submit letters and comments and come testify or speak at the several public hearings to be held.

So we did that.  And several of that group were in fact utilized.  Not everyone though, and that was kind of disappointing, but a function of the timing constraints.

The next major resource we advised they had to include, besides CalTech-MIT and a tech advisory group, was Rock The Vote.  And that was because (notwithstanding their being a technology partner of ours) Rock The Vote has its ear to the rails of new and young voters, starting with their registration experience and initial opportunity to cast their ballot.

Finally, we noted that there were a couple of other resources they really could not afford to overlook, including the Verified Voting Foundation, L.A. County’s VSAP Project, and Travis County’s STAR-Vote Project.

The outcome of all of that brings me to the meat of this post about the PCEA Report and our real contribution.  Sure, we had some behind the scenes involvement as I describe above.  No big deal.  We hope it helped.

The Real Opportunity for Innovation

But the real opportunity to contribute came in the creation of the PCEA Web Site and its resource toolkit pages.

On that site, the PCEA took our advice and chose to utilize Rock The Vote’s open source voter registration tools and, specifically, the foundational elements the TrustTheVote Project has built for a state’s Voter Information Services Portal.

Together, Rock The Vote and the TrustTheVote Project are able to showcase the open source software that any State can adopt, adapt, and deploy–for free (at least the adoption part) and without having to reinvent the wheel by paying for a ground-up custom build of their own online voter registration and information services portal.

We submit that this resource on their PCEA web site represents an important ingredient to injecting innovation into a stagnant technology environment of today’s elections and voting systems world.

For the first time, there is production-ready open source software available for an important part of an elections official’s administrative responsibilities that can lower costs, accelerate deployment and catalyze innovation.

To be sure, it’s only a start — it’s the lower-hanging fruit of an election technology platform that doesn’t require any sort of certification. With our exempt status in place, and lots of things happening that we’ll soon share, there is more, much more, to come.  But this is a start.

There are 112 pages of goodness in the PCEA report.  And there are some elements in there that deserve further discussion.  But we humbly assert that it’s the availability of some open source software on their resource web site that represents a quiet breakthrough in elections technology innovation.

The news has been considerable.  So, yep, we admit it.  We’re oozing pride today.
And we owe it to your continued support of our cause.
Thank you!

GAM | out

Comments Prepared for Tonight’s Elections Technology Roundtable

This evening at 5:00pm members of the TrustTheVote Project have been invited to attend an elections technology round table discussion in advance of a public hearing in Sacramento, CA scheduled for tomorrow at 2:00pm PST on new regulations governing Voting System Certification to be contained in Division 7 of Title 2 of the California Code of Regulations.

Due to the level of activity, only our CTO, John Sebes, is able to participate.

We were asked if John could be prepared to make some brief remarks regarding our view of the impact of SB-360 and its potential to catalyze innovation in voting systems.  These types of events are always dynamic and fluid, and so we decided to publish our remarks below just in advance of this meeting.

Roundtable Meeting Remarks from the OSDV Foundation | TrustTheVote Project

We appreciate an opportunity to participate in this important discussion.  We want to take about 2 minutes to comment on 3 considerations from our point of view at the TrustTheVote Project.

1. Certification

For SB-360 to succeed, we believe any effort to create a high-integrity certification process requires re-thinking how certification has been done to this point.  Current federal certification, for example, takes a monolithic approach; that is, a voting system is certified based on a complete all-inclusive single closed system model.  This is a very 20th century approach that makes assumptions about software, hardware, and systems that are out of touch with today’s dynamic technology environment, where the lifetime of commodity hardware is months.

We are collaborating with NIST on a way to update this outdated model with a “component-ized” approach; that is, a unit-level testing method, such that if a component needs to be changed, the only re-certification required would be of that discrete element, and not the entire system.  There are enormous potential benefits including lowering costs, speeding certification, and removing a bar to innovation.

We’re glad to talk more about this proposed updated certification model, as it might inform any certification processes to be implemented in California.  Regardless, elections officials should consider that in order to reap the benefits of SB-360, the non-profit TrustTheVote Project believes a new certification process, component-ized as we describe it, is essential.

2. Standards

Second, there is a prerequisite for component-level certification that until recently wasn’t available: common open data format standards that enable components to communicate with one another; for example, a format for a ballot counter’s output of vote tally data that also serves as input to a tabulator component.  Without common data formats, elections officials have to acquire a whole integrated product suite that communicates in a proprietary manner.  With common data formats, you can mix and match; and perhaps more importantly, incrementally replace units over time, rather than doing what we like to refer to as “forklift upgrades” or “fleet replacements.”
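
As a toy illustration of that mix-and-match point, here is a Python sketch in which two hypothetical precinct counters emit tallies in one common format and a tabulator component from a different supplier consumes them directly; the format itself is invented for the example, not the actual standard.

```python
import json
from collections import Counter

# Output from two hypothetical precinct counters, in one common (invented) format.
counter_output_a = json.dumps({"precinct": "12", "contest": "Mayor",
                               "tallies": {"Candidate A": 310, "Candidate B": 288}})
counter_output_b = json.dumps({"precinct": "14", "contest": "Mayor",
                               "tallies": {"Candidate A": 205, "Candidate B": 240}})

def tabulate(counter_outputs):
    """A tabulator that needs only the common format, not any vendor's
    proprietary export, to combine precinct-level tallies."""
    totals = Counter()
    for blob in counter_outputs:
        record = json.loads(blob)
        totals.update(record["tallies"])
    return dict(totals)

print(tabulate([counter_output_a, counter_output_b]))
# {'Candidate A': 515, 'Candidate B': 528}
```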

The good news is the scope for ballot casting and counting is sufficiently focused to avoid distraction from the many other standards elements of the entire elections ecosystem.  And there is more goodness because standards bodies are working on this right now, with participation by several state and local election officials, as well as vendors present today, and non-profit projects like TrustTheVote.  They deserve congratulations for reaching this imperative state of data standards détente.  It’s not finished, but the effort and momentum is there.

So, elections officials should bear in mind that benefits of SB-360 also rest on the existence of common open elections data standards.

3. Commercial Revitalization

Finally, this may be the opportunity to realize a vision we have that open data standards, a new certification process, and lowered bars to innovation through open sourcing, will reinvigorate a stagnant voting technology industry.  Because the passage of SB-360 can fortify these three developments, there can (and should) be renewed commercial enthusiasm for innovation.  Such should bring about new vendors, new solutions, and new empowerment of elections officials themselves to choose how they want to raise their voting systems to a higher grade of performance, reliability, fault tolerance, and integrity.

One compelling example is the potential for commodity commercial off-the-shelf hardware to fully meet the needs of voting and elections machinery.  To that point, let us offer an important clarification and dispel a misconception about rolling your own.  This does not mean that elections officials are about to be left to self-vend.  And by that we mean self-construct and support their open, standard, commodity voting system components.  A few jurisdictions may consider it, but in the vast majority of cases, the Foundation forecasts that this will simply introduce more choice rather than forcing you to become a do-it-yourself type.  Some may choose to contract with a systems integrator to deploy a new system integrating commodity hardware and open source software.  Others may choose vendors who offer out-of-the-box open source solutions in pre-packaged hardware.

Choice is good: it’s an awesome self-correcting market regulator and it ensures opportunity for innovation.  To the latter point, we believe initiatives underway like STAR-vote in Travis County, TX, and the TrustTheVote Project will catalyze that innovation in an open source manner, thereby lowering costs, improving transparency, and ultimately improving the quality of what we consider critical democracy infrastructure.

In short, we think SB-360 can help inject new vitality in voting systems technology (at least in the State of California), so long as we can realize the benefits of open standards and drive the modernization of certification.

 

EDITORIAL NOTES:
There was chatter earlier this Fall about the extent to which SB-360 allegedly makes unverified, non-certified voting systems a possibility in California.  We don’t read SB-360 that way at all.  We encourage you to read the text of the legislation as passed into law for yourself, and start with this meeting notice digest.  In fact, to realize the kind of vision that leading jurisdictions imagine, we cannot, and should not, do away with certification, and we think charges that this is what will happen are misinformed.  We simply need to modernize how certification works to enable this kind of innovation.  We think our comments today bear that out.

Moreover, have a look at the Agenda for tomorrow’s hearing on implementation of SB-360.  In sum and substance the agenda is to discuss:

  1. Establishing the specifications for voting machines, voting devices, vote tabulating devices, and any software used for each, including the programs and procedures for vote tabulating and testing. (The proposed regulations would implement, interpret and make specific Section 19205 of the California Elections Code.);
  2. Clarifying the requirements imposed by recently chaptered Senate Bill 360, Chapter 602, Statutes 2013, which amended California Elections Code Division 19 regarding the certification of voting systems; and
  3. Clarifying the newly defined voting system certification process, as prescribed in Senate Bill 360.

Finally, there has been an additional charge that SB-360 is intended to “empower” LA County, such that LA County (or someone on their behalf) will sell whatever voting system they may build to other jurisdictions.  We think this allegation is also misinformed, for two reasons: [1] assuming LA County builds their system on open source, there is a question as to what specifically would or could be offered for sale; and [2] notwithstanding offering open source for sale (which technically can be done… technically), it seems to us that if such a system is built with public dollars, then it is, in fact, publicly owned.  From what we understand, a government agency cannot offer for sale their assets developed with public dollars, but they can give it away.  And indeed, this is what we’ve witnessed over the years in other jurisdictions.

Onward.

Crowd Sourcing Polling Place Wait Times

Long lines at the polling place are becoming a thorn in our democracy.

We realized a few months ago that our elections technology framework data layer could provide information that, when combined with community-based information gathering, might lessen the discomfort of that thorn.  Actually, that realization happened while hearing friends extol the virtues of Waze.  Simply enough, the idea was crowd-sourcing wait information to at least gain some insight on how busy a polling place might be at the time one wants to go cast their ballot.

Well, to be sure, lots of people are noodling around lots of good ideas and there is certainly no shortage of discussion on the topic of polling place performance.  And, we’re all aware that the President has taken issue with it and after a couple of mentions in speeches, created the Bauer-Ginsberg Commission.  So, it seems reasonable to assume this idea of engaging some self-reporting isn’t entirely novel.

After all, it’s kewl to imagine being able to tell – in real time – what the current wait time at the polling place is, so a voter can avoid the crowds, or a news organization can track the hot spots of long lines.  We do some “ideating” below, but first I offer three observations from our noodling:

  • It really is a good idea; but
  • There’s a large lemon in it; yet
  • We have the recipe for some decent lemonade.

Here’s the Ideation Part

Wouldn’t it be great if everybody could use an app on their smarty phone to say, “Hi All, it’s me, I just arrived at my polling place, the line looks a bit long,” and then later, “Me again, OK, just finished voting, and geesh, like 90 minutes from start to finish… not so good,” or “Me again, I’m bailing.  Need to get to the airport.”

And wouldn’t it be great if all that input from every voter was gathered in the cloud somehow, so I could look up my polling place, see the wait time, the trend line of wait times, the percentage of my precinct’s non-absentee voters who already voted, and other helpful stuff?  And wouldn’t it be interesting if the news media could show a real-time view across a whole county or State?
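
Here is a back-of-the-envelope Python sketch of the aggregation side, with invented field names and, deliberately, no abuse protections yet (that is the lemon discussed next): each self-report pairs an arrival time with a finish time, and the service averages the reports per polling place.

```python
from collections import defaultdict
from datetime import datetime
from statistics import mean

# Hypothetical self-reports: (polling place, arrived, finished voting).
reports = [
    ("Lincoln Elementary", datetime(2014, 11, 4, 8, 0),  datetime(2014, 11, 4, 9, 30)),
    ("Lincoln Elementary", datetime(2014, 11, 4, 8, 40), datetime(2014, 11, 4, 9, 55)),
    ("Fire Station 3",     datetime(2014, 11, 4, 8, 5),  datetime(2014, 11, 4, 8, 20)),
]

def average_wait_minutes(reports):
    """Average reported start-to-finish time per polling place, in minutes."""
    waits = defaultdict(list)
    for place, arrived, finished in reports:
        waits[place].append((finished - arrived).total_seconds() / 60)
    return {place: round(mean(times)) for place, times in waits.items()}

print(average_wait_minutes(reports))
# {'Lincoln Elementary': 82, 'Fire Station 3': 15}
```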

Well, if you’re reading this, I bet you agree, “Yes, yes it would.”  Sure.  Except for one thing.  To be really useful it would have to be accurate.  And if there is a question about accuracy (ah shoot, ya know where this is going, don-cha?) Yes, there is always that Grinch called “abuse.”

Sigh. We know from recent big elections that apparently, partisan organizations are sometimes willing to spend lots of money on billboard ads, spam campaigns, robo-calls, and so on, to actually try to discourage people from going to the polls, within targeted locales and/or demographics. So, we could expect this great idea, in some cases, to fall afoul of similar abuse.  And that’s the fat lemon.

But, please read on.

Now, we can imagine some frequent readers spinning up to accuse us of wanting everything to be perfectly secure, of letting the best be the enemy of the good, and noting that nothing will ever be accomplished if first every objection must be overcome. On other days, they might be right, but not so much today.

We don’t believe this polling place traffic monitoring service idea requires the invention of some new security, or integrity, or privacy stuff.  On the other hand, relying on the honor system is probably not right either.  Instead, we think that in real life something like this would have a much better chance of launch and sustained benefit if it were based on some existing model of voters doing mobile computing in a responsible way, one that’s not as trivial to abuse as the honor system.

And that leads us to the good news – you see, we have such an existing model, in real life. That’s the new ingredient, along with that lemon above and a little innovative sugar, for the lemonade that I mentioned.

Stay tuned for Part 2, and while waiting you might glance at this.