
Election Results Reporting – Assumptions About Standards and Converters (concluded)

Last time, I explained how our VoteStream work depends on the 3rd of 3 assumptions: loosely, that there might be a good way to get election results data (and other related data) out of their current hiding places, and into some useful software, connected by an election data standard that encompasses results data. But what are we actually doing about it?

Answer: we are building prototypes of that connection, and the lynchpin is an election data standard that can express everything about the information that VoteStream needs. We’ve found that the VIP format is an existing, widely adopted standard that provides a good starting point. More details on that later, but for now the key words are “converters” and “connectors”. We’re developing technology that proves the concept that anyone with basic data modeling and software development skills can create a connector, or data converter, that transforms election data (including but most certainly not limited to vote counts) from one of a variety of existing formats, to the format of the election data standard.

And this is the central concept to prove — because as we’ve been saying in various ways for some time, the data exists but is locked up in a variety of legacy and/or proprietary formats. These existing formats differ from one another quite a bit, and contain varying amounts of information beyond basic vote counts. There is good reason to be skeptical, to suppose that it is a hard problem to take these different shapes and sizes of square data pegs (and pentagonal, octahedral, and many other shaped pegs!) and fit them into a single round hole.

But what we’re learning (and the jury is still out, promising as our experience is so far) is that all these existing data sets have basically similar elements that correspond to a single standard, and that it’s not hard to develop prototype software that uses those correspondences to convert to a single format. We’ll get a better understanding of the tricky bits as we go along making 3 or 4 prototype converters.
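To make the converter idea concrete, here is a minimal sketch of what such a connector might look like. This is illustrative only: the column names, the input CSV layout, and the output structure are hypothetical stand-ins, not the actual VIP or VoteStream formats.

```python
import csv

# Hypothetical example: field names and output shape are illustrative,
# not the real VIP/VoteStream schemas.
def convert_county_csv(csv_path):
    """Convert one county's results CSV (one row per candidate per
    precinct) into a single standard structure keyed by contest."""
    contests = {}
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            contest = contests.setdefault(row["Race"], {
                "contest_name": row["Race"],
                "results": [],
            })
            contest["results"].append({
                "candidate": row["Candidate"],
                "precinct": row["Precinct"],
                "votes": int(row["Votes"]),
            })
    return {"contests": list(contests.values())}
```

The point of the sketch is that each legacy format needs only its own small reader like this one; everything downstream consumes the single standard shape.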

Much of this feasibility rests on a structuring principle that we’ve adopted, which runs parallel to the existing data standard that we’ve adopted. Much more on that principle, the standard, its evolution, and so on … yet to come. As we get more experience with data-wrangling and converter-creation, there will certainly be a lot more to say.

— EJS

Comments Prepared for Tonight’s Elections Technology Roundtable

This evening at 5:00pm, members of the TrustTheVote Project have been invited to attend an elections technology roundtable discussion in advance of a public hearing in Sacramento, CA, scheduled for tomorrow at 2:00pm PST, on new regulations governing Voting System Certification to be contained in Division 7 of Title 2 of the California Code of Regulations.

Due to the level of activity, only our CTO, John Sebes, is able to participate.

We were asked if John could be prepared to make some brief remarks regarding our view of the impact of SB-360 and its potential to catalyze innovation in voting systems.  These types of events are always dynamic and fluid, and so we decided to publish our remarks below just in advance of this meeting.

Roundtable Meeting Remarks from the OSDV Foundation | TrustTheVote Project

We appreciate an opportunity to participate in this important discussion.  We want to take about 2 minutes to comment on 3 considerations from our point of view at the TrustTheVote Project.

1. Certification

For SB-360 to succeed, we believe any effort to create a high-integrity certification process requires re-thinking how certification has been done to this point.  Current federal certification, for example, takes a monolithic approach; that is, a voting system is certified based on a complete all-inclusive single closed system model.  This is a very 20th century approach that makes assumptions about software, hardware, and systems that are out of touch with today’s dynamic technology environment, where the lifetime of commodity hardware is months.

We are collaborating with NIST on a way to update this outdated model with a “component-ized” approach; that is, a unit-level testing method, such that if a component needs to be changed, the only re-certification required would be of that discrete element, and not the entire system.  There are enormous potential benefits including lowering costs, speeding certification, and removing a bar to innovation.

We’re glad to talk more about this proposed updated certification model, as it might inform any certification processes to be implemented in California.  Regardless, elections officials should consider that in order to reap the benefits of SB-360, the non-profit TrustTheVote Project believes a new certification process, component-ized as we describe it, is essential.

2. Standards

Second, there is a prerequisite for component-level certification that until recently wasn’t available: common open data format standards that enable components to communicate with one another; for example, a format for a ballot-counter’s output of vote tally data, that also serves as input to a tabulator component.  Without common data formats, elections officials have to acquire a whole integrated product suite that communicates in a proprietary manner.  With common data formats, you can mix and match; and perhaps more importantly, incrementally replace units over time, rather than doing what we like to refer to as “forklift upgrades” or “fleet replacements.”

The good news is the scope for ballot casting and counting is sufficiently focused to avoid distraction from the many other standards elements of the entire elections ecosystem.  And there is more goodness because standards bodies are working on this right now, with participation by several state and local election officials, as well as vendors present today, and non-profit projects like TrustTheVote.  They deserve congratulations for reaching this imperative state of data standards détente.  It’s not finished, but the effort and momentum are there.

So, elections officials should bear in mind that benefits of SB-360 also rest on the existence of common open elections data standards.

3. Commercial Revitalization

Finally, this may be the opportunity to realize a vision we have: that open data standards, a new certification process, and lowered bars to innovation through open sourcing will reinvigorate a stagnant voting technology industry.  Because the passage of SB-360 can fortify these three developments, there can (and should) be renewed commercial enthusiasm for innovation.  That should bring about new vendors, new solutions, and new empowerment of elections officials themselves to choose how they want to raise their voting systems to a higher grade of performance, reliability, fault tolerance, and integrity.

One compelling example is the potential for commodity commercial off-the-shelf hardware to fully meet the needs of voting and elections machinery.  To that point, let us offer an important clarification and dispel a misconception about “rolling your own.”  This does not mean that elections officials are about to be left to self-vend; by that we mean self-construct and support their open, standard, commodity voting system components.  A few jurisdictions may consider it, but in the vast majority of cases, the Foundation forecasts that this will simply introduce more choice rather than forcing you to become a do-it-yourself type.  Some may choose to contract with a systems integrator to deploy a new system integrating commodity hardware and open source software.  Others may choose vendors who offer out-of-the-box open source solutions in pre-packaged hardware.

Choice is good: it’s an awesome self-correcting market regulator and it ensures opportunity for innovation.  To the latter point, we believe initiatives underway like STAR-vote in Travis County, TX, and the TrustTheVote Project will catalyze that innovation in an open source manner, thereby lowering costs, improving transparency, and ultimately improving the quality of what we consider critical democracy infrastructure.

In short, we think SB-360 can help inject new vitality in voting systems technology (at least in the State of California), so long as we can realize the benefits of open standards and drive the modernization of certification.

 

EDITORIAL NOTES:
There was chatter earlier this Fall about the extent to which SB-360 allegedly makes unverified non-certified voting systems a possibility in California.  We don’t read SB-360 that way at all.  We encourage you to read the text of the legislation as passed into law for yourself, and start with this meeting notice digest.  In fact, to realize the kind of vision that leading jurisdictions imagine, we cannot, and should not, do away with certification, and we think charges that this is what will happen are misinformed.  We simply need to modernize how certification works to enable this kind of innovation.  We think our comments today bear that out.

Moreover, have a look at the Agenda for tomorrow’s hearing on implementation of SB-360.  In sum and substance the agenda is to discuss:

  1. Establishing the specifications for voting machines, voting devices, vote tabulating devices, and any software used for each, including the programs and procedures for vote tabulating and testing. (The proposed regulations would implement, interpret and make specific Section 19205 of the California Elections Code.);
  2. Clarifying the requirements imposed by recently chaptered Senate Bill 360, Chapter 602, Statutes 2013, which amended California Elections Code Division 19 regarding the certification of voting systems; and
  3. Clarifying the newly defined voting system certification process, as prescribed in Senate Bill 360.

Finally, there has been an additional charge that SB-360 is intended to “empower” LA County, such that whatever LA County may build, they (or someone on their behalf) will sell the resulting voting systems to other jurisdictions.  We think this allegation is also misinformed for two reasons: [1] assuming LA County builds their system on open source, there is a question as to what specifically would/could be offered for sale; and [2] notwithstanding offering open source for sale (which technically can be done… technically), it seems to us that if such a system is built with public dollars, then it is, in fact, publicly owned.  From what we understand, a government agency cannot offer for sale their assets developed with public dollars, but they can give it away.  And indeed, this is what we’ve witnessed over the years in other jurisdictions.

Onward.

Election Results Reload: the Time is Right

In my last post, I said that the time is right for breaking the logjam in election results reporting, enabling a big reload on technology for reporting, and a big increase in public transparency. Now, let me explain why, starting with the biggest of several reasons.

Elections data standards are needed to define common data formats into which a variety of results data can be converted.

Those standards are emerging now, and previously the lack of them was a real problem.

  • We can’t reasonably expect a local elections office to take additional efforts to publish the data, or otherwise serve the public with election results services, if the result will be just one voice in a Babel of dozens of different data languages and dialects.
  • We can’t reasonably expect a 3rd party organization to make use of the data from many sources, unless it’s available in a single standard format, or they have the wherewithal to do huge amounts of work on data conversion, repeatedly.

The good news is that election data standards have come a long way in the last couple of years, due to:

  • Significant support from the U.S. Government’s standards body — the National Institute of Standards and Technology (NIST);
  • Sustained effort from the volunteers working in standards committees in the international standards body — the IEEE 1622 Working Group; and
  • Practical experience with evolving de facto standards, particularly with the data formats and services of the Pew Voting Information Project (VIP), and the several elections organizations that participate in providing VIP data.

There are other reasons why the time is right, but they are more widely understood:

  • We now have technologies that perennially understaffed and underfunded elections organizations can feasibly adopt quickly and cheaply, including powerful web application frameworks, supported by cloud hosting operations, within a growing ecosystem of web services that enable many organizations to access a variety of data and apps.
  • “Open government,” “open data,” and even “big data” are buzz phrases now commonly understood, which describe a powerful and maturing set of technologies and IT practices.  This new language of government IT innovation facilitates actionable conversations about the opportunity to provide the public with far more robust information on elections and their participation and performance.

It’s a “promised land” of government IT and the so-called Gov 2.0 movement (arguably more like Gov 3.0, we think: 2.0 was all about collaboration, while 3.0 is becoming all about the “utility web” of real apps available on demand, a direction some of these services will inevitably take).  However, for election technology in the near term, we first have to cross the river by learning how to “get the data out” (and that is more like Gov 2.0).  More next time on our assumptions about how that river can be crossed, and our experiences to date on doing that crossing.

— EJS

Towards Standardized Election Results Data Reporting

Now that we are a ways into our “Election Night Reporting System” project, we want to start sharing some of what we are learning.  We had talked about a dedicated Wiki or some such, but our time was better spent digging into the assignment graciously supported by the Knight Foundation Prototype Fund.  Perhaps the best place to start is a summary of what we’ve been saying within the ENRS team, about what we’re trying to accomplish.

First, we’re toying with this silly internal project code name, “ENRS,” and we don’t expect it to hang around forever. Our biggest gripe is that what we’re trying to do extends way beyond the night of elections, but more about that later.

Our ENRS project is based on a few assumptions, or perhaps one could say some hypotheses that we hope to prove. “Prove” is probably a strong word. It might be better to say that we expect that our assumptions will be valid, but with practical limitations that we’ll discover.

The assumptions are fundamentally about three related topics:

  1. The nature and detail of election results data;
  2. The types of software and services that one could build to leverage that data for public transparency; and
  3. Perhaps most critically, the ability for data and software to interact in a standard way that could be adopted broadly.

As we go along in the project, we hope to say more about the assumptions in each of these areas.

But it is the goal of feasible broad adoption of standards that is really the most important part. There’s a huge amount of latent value (in terms of transparency and accountability) to be had from aggregating and analyzing a huge amount of election results data. But most of that data is effectively locked up, at present, in thousands of little lockboxes of proprietary and/or legacy data formats.

It’s not as though most local election officials — the folks who are the source of election results data, as they conduct elections and the process of tallying ballots — want to keep the data locked up, nor to impede others’ activities in aggregating results data across counties and states, and analyzing it. Rather, most local election officials just don’t have the means to “get the data out” in a way that supports such activities.

We believe that the time is right to create the technology to do just that, and enable election officials to use the technology quickly and easily. And this prototype phase of ENRS is the beginning.

Lastly, we have many people to thank, starting with Chris Barr and the Knight Foundation for its grant to support this prototype project. Further, the current work is based on a previous design phase. Our thanks to our interactive design team led by DDO, and the Travis County, TX Elections Team who provided valuable input and feedback during that earlier phase of work, without which the current project wouldn’t be possible.

— EJS

Why is There a Voting Tech Logjam in Washington?

“Why is There a Voting Tech Logjam?” — that’s a good question! A full answer has several aspects, but one of them is the activity (or inactivity) at the Federal level that leads to very limited options in election tech. For a nice pithy explanation of that aspect, check out the current issue of the newsletter of the National Conference of State Legislators, on page 4.

One really important theme addressed here is the opportunity for state lawmakers to make their decisions about what standards to use, in order to enable the state’s local election officials to make their decisions about what technology to make or adopt — including purchase, in-house build, and (of course) adoption and adaptation of open-source election technology.

— EJS

Sequel to A.G. Holder “Fix That” — Can Do!

In my last post, I said that we might be onto something, an idea for many of the benefits of universal automatic permanent voter registration, without the need for Federal-plus-50-states overhaul of policy, election law, and election technology that would be required for actual UAP VR. Here is a sketch of what that might be. I think it’s interesting not because of being complex or clever — which it is not — but because it is sufficiently simple and simple-minded that it might feasibly be used by real election officials who don’t have the luxury to spend money to make significant changes to their election administration systems. (By the way, if you’re not into tales of information processing systems, feel free to skip to the punchline in the last paragraph.)

Furthermore — and this is critical — this idea is simple enough that a proof of concept system could be put into place quite quickly and cheaply. And in election tech today, that’s critical. To paraphrase the “show me” that we hear often: don’t just tell me ideas for election tech improvements; show me something I can see, touch, and try, that shows that it would work in my current circumstances. With input from some election officials about what they’d need, and what that “show me” would be, here is the basic idea …

The co-ordination of existing databases that A.G. Holder called for would actually be a new system, a “federated database” that does not try to coordinate every VR status change of every person, but instead enables a best-efforts distribution of advisory information from various government organizations, to participating election officials who work on those two important principles that I explained in my last post. This is not a clearing-house, not a records matching system, but just something that distributes info about events.

Before I explain what the events could be and how the sharing happens, let me bracket the issue of privacy. Of course all of this should be done in a privacy-protecting way with anonymized data, and of course that’s possible. But whenever I say “a person with a DOB of X” or something like that, remember that I am really talking about some DOB that is one-way-hashed for privacy. Secondly, for the sake of simple explanation, I’m assuming that SSN and DOB can be used as a good-enough nearly-unique identifier for these purposes, but the scheme works pretty much the same with other choices of identifying information. (By the way, I say nearly-unique because it is not uncommon for a VR database to have two people with the same SSN because of data-entry typos, hand-writing issues, and so forth.)
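A minimal sketch of what that one-way hashing could look like follows. This is purely illustrative: the salt handling and the choice of SHA-256 are assumptions for the example; a real deployment would need a properly vetted scheme (e.g. keyed hashing with agreed key management among participants).

```python
import hashlib

# Illustrative only: real anonymization would need a vetted, keyed scheme.
def anonymized_key(ssn, dob, salt):
    """One-way hash of SSN + date of birth, so participating systems can
    match 'same person' notifications without exchanging raw identifiers."""
    material = f"{salt}|{ssn}|{dob}".encode("utf-8")
    return hashlib.sha256(material).hexdigest()
```

The useful property is that two agencies using the same salt derive the same key for the same person, but nobody can recover the SSN or DOB from the key itself.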

To explain this system, I’ll call it “Holder” both because of the A.G. and because I like the idea that everything in this system is a placeholder for possible VR changes, rather than anything authoritative. And because this is a Federal policy goal, I’ll tell a story that involves Federal activity to share information with states — and also because right now that’s one of the sources of info that states don’t actually have today!

Now, suppose that every time a Federal agency — say the IRS or HHS — did a transaction with a person, and that involved the person’s address, that agency posts a notification into “Holder” that says that on date D, a person with SSN and DOB of X and Y claimed a current address of Z. This is just a statement of what the agency said the person said, and isn’t trying to be a change-of-address. And it might, but needn’t always, include an indication of what type of transaction occurred. The non-authoritative part is important. Suppose there’s a record where the X and Y match a registered voter Claire Cornucopia of 1000 Chapel St., New Haven CT, but the address is not in CT. The notification might indicate a change of address, but it might be a mistake too. Just today I got mail from a government organization that had initially sent it to a friend of mine in another state. Stuff happens.
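The notification record itself could be very simple. Here is a hypothetical sketch of its shape; every field name is an assumption for illustration, and `person_key` stands in for the one-way-hashed SSN/DOB pair (X and Y) discussed above.

```python
from dataclasses import dataclass, asdict
from datetime import date

# Hypothetical record shape for a "Holder"-style notification.
@dataclass
class AddressNotification:
    posted_on: date        # date D of the agency transaction
    person_key: str        # one-way hash of SSN + DOB (the X and Y)
    claimed_address: str   # address Z the person gave the agency
    source_agency: str     # e.g. "IRS", "HHS"
    transaction_type: str = ""  # optional hint about the transaction

def post_notification(holder_store, note):
    """Append-only: Holder distributes events; it never asserts authority
    over voter records, so there is nothing here but a log of claims."""
    holder_store.append(asdict(note))
    return note
```

The append-only design choice mirrors the non-authoritative principle: Holder never decides anything, it just lets state VR operators read a stream of claims.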

State VR operators could access “Holder” to examine this stream of notifications to find cases where it seems to be about a voter that is currently registered in that state, or isn’t but possibly should be. If there is a notification that looks like a new address for an existing voter, then they can reach out to the voter — for example, email, postal mail to the current address on file, postal mail to the possibly new address. In keeping with current U.S. practice:

  • it is up to the voter to maintain their voter record;
  • election officials must update a record when a voter sends a change;
  • without info from a voter, election officials can change a record only in specific ways authorized by state election law.

The point here is to make it easier for election officials to find out that a person might need to take some action, and to help that person do so. The helping part is a separate matter, including online voter services, but conceivably, this type of system would work (albeit with a lower participation rate) in a system limited to postal mail to voters asking them to fill out a paper form and mail it back.

Next, let’s imagine the scenarios that this system might enable, in terms of the kinds of outreach that a voter could receive, not limited to change of address as I described above.

  1. “Hey, it looks like you changed your mailing address – does that mean that you changed your residence too? If so, here is how you should update your voter record …”
  2. “Hey, it looks like you now live in the state of XX but aren’t registered to vote – if so, here is what you should do to find out if you’re eligible to vote … …”
  3. “Hey, it looks like you just signed up for selective service – so you are probably eligible to vote too, and here is what you should do …”

Number 3 — and other variations I am sure you can think of — is especially important as a way to approximate the “automatic” part of A.G. Holder’s policy recommendation, while number 1 is the “permanent” part, and number 2 is part of both.

With just a little trial-ballooning to date, I am fairly confident that this “Holder” idea would complement existing VR database maintenance work, and has the potential to connect election officials with a larger number of people than they currently connect with. And I know for sure that this does not require election officials to change the existing way that they manage voter records. But how about technical feasibility, cost, and so on? Could it pass the “show me” test?

Absolutely, yes. We’ve done some preliminary work on this, and it is the work of a few weeks to set up the federated database, and the demo systems that show how Federal and state organizations would interact with it. But I don’t mean that it would be a sketchy demo. In fact, because the basic concept is so simple, it would be a nearly complete software implementation of the federated database and all the interactions with it. Hypothetically, if there were a Federal organization that would operate “Holder”, and enough states that agreed that its interface met their needs for getting started, a real “Holder” system could be set up as quickly as that organization could amend a services agreement with one of its existing I.T. service provider organizations, and set up MOUs with other Federal agencies.

Which is, of course, not exactly “quick,” but the point is that the show-me demonstrates that the enabling technology exists in an immediately usable (and budgetable) form, to justify embarking on the other 99% of the work that is not technology work. Indeed, you almost have to have the tech part finished before you can even consider the rest of it; an idea by itself will not do.

Lastly, is this reasonable or are we dreaming again? Well, let’s charitably say that we are dreaming the same voting rights dream that A.G. Holder has, and we’re here to say from the standpoint of election technology, that we could do the tech part nearly overnight, in a way that enables adoption that requires much administrative activity, but not legal or legislative activity. For techies, that’s not much of a punchline, but for policy folks who want to “fix that” quickly, it may be a very pleasant surprise.

— EJS

A.G. Holder Wants to “Fix That” for Voter Registration

In a public speech yesterday, U.S. Attorney General Eric Holder called for universal, automatic voter registration, and stated that current technology can accomplish that, despite the fact that the current system is complex and error-prone. As Reuters reported on Holder’s remarks:

By coordinating existing databases, the government could register “every eligible voter in America” and ensure that registration did not lapse during a move.

That’s easy to say, but it requires some careful thought to make it easy to do. After some discussion with election officials recently, I’ve concluded that it is in fact easy to do in tech terms, but not in a way that you might think. To explain, let me first say that one thing that’s not going to happen anytime soon is a “federal government takeover” of voter registration. VR will remain a state responsibility for the medium term, I predict.

Second, something that might happen, but would be a bad idea, is the combination of inter-state record matching and automatic registration. Why? Because we’ve already seen that in some states, recent practice includes automatic de-registration: if a computer’s matching algorithm says that you moved from one state to another, you get un-registered in the first state. (Though not registered in the second!) Of course that’s a problem if the match is incorrect — and we’ve seen plenty of examples of dodgy databases yielding false positive matches — but it also can be a problem even if it is correct.

Ironically, the most recent instance of that story I’ve heard personally was from a Yale political science professor who specializes in election observation in other countries, and is keenly aware of voter registration issues as a bar to voting. While retaining her residence in CT, the prof did something that looked to some computer like taking up residence at another address — result: CT’s VR system de-registered her. Not the right way of doing universal, automatic, permanent.

One state election official explained the higher-level issue to me recently with two main points.

1. The current system places responsibility on the citizen to apprise the appropriate government. So when it appears that there has been a change of address, the state VR operators should reach out to the voter in question to get the real story from them. That includes making it easier for voters to quickly find out their VR status and get help on what they can do next. (Which is what we’re doing with online VR technology this year.)

2. When deciding what to do about a reported VR change, the responsibility is the election official’s, not some computer’s. Technology can help suggest to an election official that a voter’s record may be out of date, but that should not mean that the voter record should be invalidated, either automatically or with a pro-forma confirmation by a person who has no more information than the computer did. What should the election official do instead? See point #1 above!

In other words, a simple interpretation of Holder’s words about database co-ordination can lead to data-mining and matching that is error prone not just because the databases have imperfect information, but also because some of the most important information — voter’s intent — is not in the database, for example “I did a postal address forwarding from my CT home to a DC address not because I moved but because I’m visiting for several weeks and don’t want to miss my mail.”

So that got me thinking about functional requirements – surprise, techie thinks about requirements not policies! – and we came up with a way to use those two principles to deliver many of the benefits of universal automatic permanent registration, without actually changing election laws and overhauling existing voter database systems. What’s required is an inter-government information sharing system:

  • that can notify state VR system operators about events that are possibly relevant to VR, without having to be authoritative about the event or even the person involved;
  • that can enable state VR system operators to take further steps to determine whether there’s been a change in voter eligibility;
  • is sufficiently flexible for a wide variety and number of government organizations to participate with ease.

In addition, not required, but darned useful to residents of the 21st century, this system would be complemented by online assistance to members of the public to help them quickly and accurately respond to inquiries from election officials.

The latter we are, as I have said, already working on, and well into it. But that inter-government information sharing system, what is that? It would clearly have to be not complicated, not expensive, and not requiring changes in election law or policy. Is that possible?

I think so. Stay tuned, we may be on to something.

– EJS

Vote Adding Machine Problems in FL — We Have to Fix That!

Despite today’s blog docket being for RockTheVote, I just can’t resist pointing out a recurring type of technology-triggered election dysfunction that is happening again, and is 100% preventable using election technology that we have already developed.

Here’s the scoop: in St. Lucie County, Florida, the LEOs are having trouble coming up with a county-wide grand total of votes, because their adding machine (for totting up the subtotals from dozens of voting machines) leaves plenty of room for human error. The full details are a bit complex in terms of handling of data sticks and error messages, but I’ve been told that in early voting in 94 precincts, 40 precincts weren’t counted at all, and 54 were counted twice. Thank goodness someone noticed afterwards! (Well, 108 precincts totaled out of 94 might have been a tip-off.) Sure, human error was involved, but it is not a great situation where software allows this human error to get through.

We’re only talking about software that adds up columns of numbers here! A much better solution would be one where the software refuses to add in any sub-total more than once, and refuses to identify as a finished total anything where there is a sub-total missing. Of course! And I am sure that the vendor of St. Lucie’s GEMS system has a fix for this problem in some later version of the software or some successor product. But that’s just not relevant if an election official doesn’t have the time, budget, support contract, or procurement authority to test the better upgrade, and buy it if it works satisfactorily!
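The cross-checks described above are genuinely simple; here is a sketch of what they amount to. The data shapes (precinct IDs paired with subtotal counts) are hypothetical, but the two rules are exactly the ones argued for: never add a subtotal twice, and never declare a final total while a subtotal is missing.

```python
# Sketch of a cross-checking tabulator: refuse duplicate subtotals,
# and refuse to finish while any expected subtotal is missing.
def tally(expected_precincts, subtotals):
    """subtotals: iterable of (precinct_id, vote_count) pairs."""
    seen = {}
    for precinct, count in subtotals:
        if precinct in seen:
            raise ValueError(f"duplicate subtotal for precinct {precinct}")
        seen[precinct] = count
    missing = set(expected_precincts) - set(seen)
    if missing:
        raise ValueError(f"missing subtotals for: {sorted(missing)}")
    return sum(seen.values())
```

With checks like these, the St. Lucie scenario (40 precincts uncounted, 54 double-counted) simply cannot produce a "finished" total; the software halts and names the offending precincts instead.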

What’s sad is that it is completely preventable by using an alternative adding machine like the one we developed last year (OK, shameless plug) — which of course does all these cross-checks. The LEOs would need to translate that vendor-proprietary subtotal data into a standard format — and I know some volunteer programmers who I bet would do that for them. They’d need to use an ordinary PC to run the open source tabulation software — and I know people who would set it up for them as a public service.  And they’d have to spend less than half an hour using the system to get their totals, and comparing them to the totals that their GEMS system provided.
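A converter of the kind those volunteer programmers would write can be quite small. The sketch below assumes a hypothetical vendor CSV export with columns CONTEST, CHOICE, PRECINCT, and VOTES; a real converter would of course target the actual proprietary export and the actual standard’s schema:

```python
# Illustrative converter sketch: vendor-style subtotal CSV to a nested
# structure (contest -> choice -> precinct -> vote count) that a
# standards-based tabulator could consume. Column names are invented.
import csv
import io

def convert_subtotals(vendor_csv_text):
    """Parse a vendor subtotal export into nested vote counts."""
    results = {}
    for row in csv.DictReader(io.StringIO(vendor_csv_text)):
        contest = results.setdefault(row["CONTEST"], {})
        choice = contest.setdefault(row["CHOICE"], {})
        choice[row["PRECINCT"]] = int(row["VOTES"])
    return results
```

The point is not this particular code, but that the translation step is routine data-munging work, well within reach of a volunteer effort.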

And maybe, in order for it to be kosher, it would have to be a “pilot effort” with oversight by the EAC; we’ve already discussed that with them and understand that the resource requirements are modest.  I bet we could find a FL philanthropist who would underwrite the costs without a 2nd thought other than how small the cost was compared to the public benefit of the result – that is, avoiding one more day of delay in a series that’s causing a State to not be done with the election, more than a week after election day.

It’s just one example of the many possible election integrity benefits that can be demonstrated using technology that, so far at any rate, only non-commercial technologists have been willing to develop for governments to use to do their job correctly — in this case, producing timely and accurate election results.

— EJS

Recapping The OSCON O’Reilly Radar Conversation

A couple of weeks ago I presented at OSCON, and during the conference had an opportunity to sit down with Mac Slocum, Managing Editor for the O’Reilly Radar.  We had about a half-hour conversation, roughly 20 minutes of it on camera.  You can find it here if you want to watch me jaw.  But perhaps simpler, below I’ve listened to the tape and captured the essence of my answers to Mac’s questions about what the Foundation is about and working on and the like.  I promised Matt Douglass, our Public Relations Director, that I’d get this up for interested followers; apologies it took me a couple of weeks.

So, here it is; again, not an official transcript, but a compilation of my answers after watching and listening to the video interview about a dozen times (so you don’t have to), combined with my recollection of my remarks as expressed and intended.

O’Reilly: How are voting systems in the U.S. currently handled?  In other words, where do they come from; procurement process; who decides/buys; etc.?

Miller: Voting systems are currently developed and delivered by proprietary systems vendors, and procured by local election jurisdictions such as counties and townships. The States’ role is to approve specific products for procurement, often requiring products to have completed a Federal certification process overseen by the EAC.  However, the counties and local elections jurisdictions make the vast majority of elections equipment acquisition decisions across the country.

O’Reilly: So how many vendors are there?  Or maybe more to the point, what’s the state of the industry; who are the players; and what’s the innovation opportunity, etc.?

Miller: Most of the U.S. market is currently served by just 3 vendors.  You know, as we sit here today, just two vendors control some 88% of America’s voting systems infrastructure, and one of them has a white-knuckled grip on 75% of that.  Election Systems and Services is the largest, after having acquired Premier Systems from its parent company, Diebold.  The DoJ interceded in that acquisition under a mandatory Hart-Scott-Rodino Act review to consider potential anti-trust issues.  In their settlement with ES&S, the company dealt off a portion of their technology (and presumably customers) to the Canadian firm Dominion Systems.  Dominion was a small player in the U.S. until recently, when it acquired those technology assets of Premier (as part of the DoJ settlement) and acquired the other former market force, Sequoia.  That resulted in consolidating approximately 12% of the U.S. market. Most of the remaining U.S. market is served by Hart-Intercivic Systems.

On the one hand, I’d argue that the voting systems marketplace is so dysfunctional and malformed that there is no incentive to innovate, and at worst a perverse disincentive to innovate, and therefore really not much opportunity.  At least that’s what we really believed when we started the Foundation in November 2006.  Seriously, for the most part any discussion about innovation in this market today amounts to a discussion of ensuring spare parts for what’s out there.  But really, what catalyzed us was the belief that we could inject a new level of opportunity… a new infusion of innovation.  So, we believe part of the innovation opportunity is demonstrated by the demise of Premier and Sequoia: the U.S. elections market is no longer large or uniform enough to support a healthy eco-system of competition and innovation.  So the innovation opportunity is to abandon the proprietary product model, develop new election technology in a public benefits project, and work directly with election officials to determine their actual needs.

O’Reilly: So what is the TrustTheVote Project, and how does that relate to the Foundation?

Miller:  The Open Source Digital Voting Foundation is the enabling 501.c.3 public benefits corporation that funds and manages projects to develop innovative, publicly owned open source elections and voting technology.  The TrustTheVote Project is the flagship effort of the Foundation to design and develop an entirely new ballot eco-system.

What we’re making is an elections technology framework built on breakthrough innovations in elections administration and management and ballot casting and counting that can restore trust in how America votes.  Our design goal is to truly deliver on the four legs of integrity in elections: accuracy, transparency, trust, and security.

The reason we’re doing this is simple: this is the stuff of critical democracy infrastructure – something far too much of a public asset to privatize.  We need to deliver what the market has so far failed to deliver.  And we want to re-invent that industry – based on a new category of entrants – systems integrators who can take the open source framework, integrate it with qualified commodity hardware, and stand it up for counties and elections jurisdictions across the country.

We’re doing this with a small full time team of very senior technologists and technology business executives, as well as contractors, academia, and volunteer developers.

We’re 4 years into an 8-year undertaking – we believe the full framework will be complete and should be achieving widespread adoption, adaptation, and deployment by the close of 2016 – done right, it can impact the national election cycle that year.  That said, we’re under some real pressure to expedite this, because it turns out that a large number of jurisdictions will be looking to replace their current proprietary systems over the next 4 years as well.

O’Reilly:  How can open source really improve the voting system?

Miller:  Well, open source is not a panacea, but we think it’s an important enabler to any solution for the problems of innovation, transparency, and cost that burden today’s elections.  Innovation is enabled by the departure from the proprietary product model, including the use of open-source licensing of software developed in a public benefits project. Transparency, or open-government features and capabilities of voting systems, are largely absent and require innovation that the current market does not support. Cost reduction can be enabled by an open-source-based delivery model in which procurements allow system integrators to compete to deliver license-free voting systems, coupled with technical support that lacks the vendor lock-in of current procurements. Open source software doesn’t guarantee any of these benefits, but it does enable them.

I should point out, too, that one of our deepest commitments is to elections verification and auditability.  And our framework, based on an open-standards common data format using an XML-based markup language called EML (Election Markup Language), is the foundation on which we can deliver that.  Likewise, I should point out our framework is predicated on a durable paper ballot of record… although we haven’t talked about the pieces of the framework yet.

O’Reilly: Well, our time is limited, but you must know I can’t resist this last question, which is probably controversial but one our audience is really curious about.  Will online voting ever be viable?

Miller: Well, to be intellectually honest, there are two parts to that loaded question.  Let me leave my personal opinion and the position of the Foundation out of it at first, so I just address the question in a sterile light.

First, online voting is already viable in other countries that have these 3 policy features: [1] a national ID system, [2] uniform standards for nationwide elections, and [3] a history of encouraging remote voting by mail rather than in-person voting. These countries also fund the sophisticated centralized IT infrastructure required for online voting, and have accepted the risks of malware and other Internet threats as acceptable parts of nationwide online voting.   For a similar approach to be viable in the U.S., those same 3 policy features would likely require some huge political innovations, at the 50-plus state level, if not the Federal level.   There really isn’t the political stomach for any of that: not for a national ID (although arguably we already have one), not for creating national elections and voting standards, let alone building a national elections system infrastructure.  In fact, the National Association of Secretaries of State recently passed (actually re-upped) an earlier resolution to work to sunset the federal Election Assistance Commission.  In other words, there is a real Federalist sense about elections.  So, on this first point of socio-political requirements alone, I don’t see it viable any time soon.

But letting our opinion slip into this, the Foundation believes there is a more important barrier from a technical standpoint.  There are flat-out technical barriers that have to be cleared, involving critical security and privacy issues at the edge and at the core of a packet-switched solution. Furthermore, building the kind of hardened data center required to transact voting data is far beyond the financial reach of the vast majority of jurisdictions in the country.  Another really important point is that online elections are difficult if not impossible to audit or verify.  And finally, there is a current lack of sophisticated IT resources in most of the thousands of local elections offices that run elections in the U.S.

So, while elections remain a fundamentally local operation for the foreseeable future, and while funding for elections remains at current levels, and until the technical problems of security and privacy are resolved, nationwide online voting seems unlikely in the U.S.

That said, we should be mindful that the Internet cloud has darkened the doorstep of nearly every aspect of society as we’ve moved from the 2nd age of industrialism to the 3rd age of digitalism.  And it seems a bit foolish to assume that the Internet will not impact the conduct of elections in years to come.  We know there is a generation out there now that is maturing having never known any way to communicate, find information, or shop other than online.  They live in an always-on society, and they expect to be able to do everything they need to interact with their government online.  Whether that’s a reasonable expectation I don’t think is the issue.

But I think it will be important for someone to figure out what’s possible in the future – we can’t run and hide from it, but I believe we’re nowhere near being able to securely and verifiably use the Net for elections.  There is some very limited use in military and overseas settings, but it needs to be restricted to venues like that until the integrity issues can be ironed out.

So, we’re not supporters of widespread use of the Internet for voting and we don’t believe it will be viable in the near future on a widespread basis.  And honestly, we have too much to do in just improving upon ballot casting and counting devices in a polling place setting to spend too many cycles thinking about how to do this across the Internet.

-GAM|out

UOCAVA Remote Voting Workshop Makes a Strong Finish

24 hours ago I, along with some others, was actually considering asking for a refund.  We had come to the EAC, NIST, and FVAP co-hosted UOCAVA Remote Voting Systems 2-day workshop expecting to feast on some fine discussions about the technical details and nuances of building remote voting systems for overseas voters that could meet the demands of security and privacy.  And instead we had witnessed an intellectual food fight of ideology.

That all changed in a big way today.

The producers and moderators of the event, I suspect sensing the potential side effects of yesterday’s outcome, came together, somehow collectively made some adjustments (in moderation techniques, approach, and topic tweaking), and pulled off an excellent, informative day full of the kind of discourse I willingly laid down money (the Foundation’s money, no less) to attend in the first place.

My hat is off; NIST and the EAC on the whole did a great job with a comeback performance today that nearly excused all of what we witnessed yesterday.  Today, they exhibited self-deprecating humor, and even had elections officials playing up their drunk-driver characterization from the day before.

Let me share below what we covered; it was substantive.  It was detailed.  And it was tiring, but in a good way.  Here it is:

Breakout Session – Voter Authentication and Privacy

–Identified voter authentication and privacy characteristics and risks of the current UOCAVA voting process.

–Identified potential risks related to voter authentication and privacy of remote electronic absentee voting systems. For example, the group considered:

  • Ballot secrecy
  • Coercion and/or vote selling
  • Voter registration databases and voter lists
  • Strength of authentication mechanisms
  • Susceptibility to phishing/social engineering
  • Usability and accessibility of authentication mechanisms
  • Voter autonomy
  • Other potential risks

–Considered measures and/or criteria for assessing and quantifying identified risks and their potential impacts.

  • How do these compare to those of the current UOCAVA voting processes?

–Identified properties or characteristics of remote digital absentee voting systems that could provide comparable authentication mechanisms and privacy protections as the current UOCAVA voting process.

–Considered currently available technologies that can mitigate the identified risks. How do the properties or characteristics of these technologies compare to those of the current UOCAVA voting process?

–Started to identify and discuss emerging or future research areas that hold promise for improving voter authentication and/or privacy.  For example:

  • Biometrics (e.g., speaker voice identification)
  • Novel authentication methods

–Chatted about cryptographic voting protocols and other cryptographic technologies

Breakout Session – Network and Host Security

–Identified problems and risks associated with the transmission of blank and voted ballots through the mail in the current UOCAVA voting process.

–Identified risks associated with electronic transmission or processing of blank and voted ballots.  For example, the breakout group considered:

  • Reliability and timeliness of transmission
  • Availability of voting system data and functions
  • Client-side risks to election integrity
  • Server-side risks to election integrity
  • Threats from nation-states
  • Other potential risks

–Considered and discussed measures and/or criteria for assessing and quantifying identified risks and their potential impacts.

  • How do these compare to those of the current UOCAVA voting process?

–Identified properties or characteristics of remote digital absentee voting systems that could provide for the transmission of blank and voted ballots at least as reliably and securely as the current UOCAVA voting process.

–Discussed currently available technologies that can mitigate the identified risks and potential impact.

  • How do the properties and characteristics of these technologies compare to those of the current UOCAVA voting process?

–Identified and discussed emerging or future research areas that hold promise for improving network and host security.  For example:

  • Trusted computer and trusted platform models
  • End point security posture checking
  • Cloud computing
  • Virtualization
  • Semi-controlled platforms (e.g., tablets, smart phones, etc.)
  • Use of a trusted device (e.g., smart card, smart phone, etc.)

As you can see, there was a considerable amount of information covered in each 4 hour session, and then the general assembly reconvened to report on outcomes of each breakout group.

Did we solve any problems today?  Not so much.  Did we make real progress in challenge identification, guiding principles development, and framing the issues that require more research and solution formulation? Absolutely.

Most importantly, John Sebes, our CTO, and I gained a great deal of knowledge we can incorporate into the work of the TrustTheVote Project, had some badly needed clarifying discussions with several attendees, and feel we are moving in the right direction.

We clarified where we stand on use of the Internet in elections (it’s not time for anything beyond tightly controlled experimentation, and there is a lack of understanding of the magnitude of resources required to stand up sufficiently hardened data centers to make it work, let alone figuring out problems at the edge).

And we feel like we made some small contributions to helping the EAC and NIST figure out the kind of test pilot they wish to stand up as a guiding-principles reference model sometime over the next 2 years.

Easily a day’s work for the 50-60 people in attendance over the two days.

Back to the west coast (around 3am for my Pacific colleagues 😉)

It’s a wrap
GAM|out