
Kudos to EAC for Exploring Critical Nature of Election Infrastructure

Kudos to EAC for this week’s public hearing on election infrastructure as critical infrastructure! After the 2016 election cycle, I think there is very little disagreement that election infrastructure (EI) is critical, in the sense of: vital, super-important, a matter of national security, etc. But this hearing is a bit of a turning point. I’ll explain why in terms of the discussion before the hearing, then the aftermath, and then I will make my one most important point about action going forward. I’ll close with specific recommended steps forward.

Prior Negativity

Prior to this hearing, I heard and read a lot of negativity about the idea that EI is “critical infrastructure” (CI) in the specific sense of homeland security policy. Yes, late last year, DHS did designate EI as CI, specifically as a sub-sector of the existing CI sector for government systems. And that caused alarm and the negativity I referred to, ranging from honest policy disagreement (what are the public policy ramifications of designation?) to par-for-the-course political rhetoric (an unprecedented Federal takeover of elections, trampling states’ rights, etc.), and just plain “fake news” (DHS hackers breaking Federal laws to infiltrate state-managed election systems).

The fracas has been especially painful to me, as someone with years of experience in the disparate areas of cyber-security technology (since the ‘80s), critical infrastructure policy and practice (since before 9/11), DHS cyber-security research (nearly since its inception), and election technology (merely the last decade or so).

Turning Point in Dialog

That’s why the dialogue during the EAC hearing, and the reflections in online discussion since, have been so encouraging. I hear fewer competing monologues and more dialogue about what EI=CI means, what official designation actually does, and how it can or can’t help us as a community respond to the threat environment. The response includes a truly essential and fundamental shift to creating, delivering, and operating EI as critical national assets, like the power grid, local water and other public utilities, air traffic control, financial transaction networks, and so on. Being so uplifted by the change in tenor, I’ll drop a little concept here to blow up some of this new dialogue:

Official CI designation is irrelevant to the way forward.

The way forward has essential steps that were possible before the official designation, and that remain possible if the designation is rescinded. These steps are urgent. Fussing over official designation is a distraction from the work at hand, and it needs to stop. EAC’s hearing was a good first step. My blog today is my little contribution to the dialogue about next steps.

Outlining the Way Forward

To those who haven’t been marinating in cyber CI for years, it may seem odd to say that this official announcement of criticality is actually a no-op, especially given its news coverage. But thanks to changes in cyber-security law and policy over the years, the essential first steps no longer require official designation. There may be benefits over the longer term, but the immediate tasks can and should be done now, without concern for Federal policy wonkery.

Here is a short and incomplete list of essential tasks, each of which, I admit, deserves loads more unpacking and explaining to non-CI-dweeb people than I can possibly do in a blog. But regardless of DHS policy, and definitely in light of the 2016 election disruption experience, the EI community can and should:

  • Start the formation of one or more information-sharing communities (like ISAOs or similar) that are the bread and butter of other CI sectors.
  • If needed, take voluntary action to get DoJ and DHS assistance in the legal side of such formation.
  • Use the information sharing organizations to privately share and discuss what really happened in 2016 to prepare, detect, and respond to attacks on EI.
  • Likewise use the organizations to jointly consider available assistance, and to assess:
    • the range of types of CI-related assistance available to election officials – both cyber and otherwise;
    • the costs and benefits of using them; and
    • the experiences of those participants who have already used such assistance (from DHS or elsewhere), or who voluntarily choose to, in order to inform all EI/CI operators who choose to participate.
  • Begin to form sector-specific CI guidelines specifically about changes required to operate EI assets as CI.

And all that is just to get started, to enable several further steps, including: informing the election tech market of what it needs to respond to; and helping the thousands of local election offices begin to learn how their responsibilities evolve as EI is transformed into a true part of CI in practice.

— EJS

Cancellation of Federal Assistance to US Elections — The Good, The Bad, and The Geeky

Recently I wrote about Congress dismantling the only Federal agency that helps states and their local election officials ensure that the elections that they conduct are verifiable, accurate, and secure — and transparently so, to strengthen public trust in election results. Put that way, it may sound like dismantling the U.S. Election Assistance Commission (EAC) is both a bad idea, and also poorly timed after a highly contentious election in which election security, accuracy, and integrity were disparaged or doubted vocally and vigorously.

As I explained previously, there might be a sensible case for shutdown with a hearty “mission accomplished” — but only with a narrow view of the original mission of the EAC. I also explained that since its creation, EAC’s evolving role has come to include duties that are uniquely imperative at this point in U.S. election history. What I want to explain today is that evolved role, and why it is so important now.

Suppose that you are a county election official in the process of buying a new voting system. How do you know that what you’re buying is a legit system that does everything it should do, and reliably? It’s a bit like a county hospital administrator considering adding new medications to their formulary — how do you know that they are safe and effective? In the case of medications, the FDA runs a regulatory testing program and approves medications as safe and effective for particular purposes.

In the case of voting systems, the EAC (with support from NIST) has an analogous role: defining the requirements for voting systems, accrediting test labs, defining requirements for how labs should test products, reviewing test labs’ work, and certifying those products that pass muster. This function is voluntary for states, who can choose whether and how to build their certification program on the basis of federal certification. The process is not exactly voluntary for vendors, but since they understandably want to have products that can work in every state, they build products to meet the requirements and pass Federal certification. The result is that each locality’s election office has a state-managed approved product list that typically includes only products that are Federally certified.

Thus far the story is pretty geeky. Nobody gets passionate about standards, test labs, and the like. It’s clear that the goals are sound and the intentions are good. But does that mean that eliminating the EAC’s role in certification is bad? Not necessarily, because there is a wide range of opinion on the EAC’s effectiveness in running the certification process. However, recent changes have shown that the stakes are much higher, and the role of requirements, standards, testing, and certification is more important than ever. The details of those changes will be in the next installment, but here is the gist: we are in the middle of a nationwide replacement of aging voting machines and related election tech, and in an escalating threat environment of global adversaries targeting U.S. elections. More of the same-old-same-old isn’t nearly good enough. But how would election officials gain confidence in new election tech that’s not only safe and effective, but robust against whole new categories of threat?

— EJS

Accurate Election Results in Michigan and Wisconsin Are Not a Partisan Issue


Courtesy, Alex Halderman Medium Article

In the last few days, we’ve been getting several questions that are variations on:

Should there be recounts in Michigan in order to make sure that the election results are accurate?

For the word “accurate” people also use any of:

  • “not hacked”
  • “not subject to voting machine malfunction”
  • “not the result of tampered voting machines”
  • “not poorly operated voting machines” or
  • “not falling-apart, unreliable voting machines”

The short answer to the question is:

Maybe a recount, but absolutely there should be an audit because audits can do nearly anything a recount can do.

Before explaining that key point, a nod to University of Michigan computer scientists for pointing out why we don’t yet have full confidence in the election results in their State’s close presidential election, and possibly in other States as well. A good summary is here, and an even better explanation is here.

A Basic Democracy Issue, not Partisan

The not-at-all partisan or even political issue is election assurance – giving the public every assurance that the election results are the correct results, despite the fact that bug-prone computers and human error are part of the process. Today, we don’t know what we don’t know, in part because the current voting technology not only fails to meet the three (3) most basic technical security requirements, but really doesn’t support election assurance very well. And we need to solve that! (More on the solution below.)

A recount, however, is a political process and a legal process that’s hard to see as anything other than partisan. A recount can happen when one candidate or party looks for election assurance and does not find it. So it is really up to the legal process to determine whether to do a recount.

While that process plays out let’s focus instead on what’s needed to get the election assurance that we don’t have yet, whether it comes via a recount or from audits — and indeed, what can be done, right now.

Three Basic Steps

Leaving aside a future in which the basic technical security requirements can be met, right now, today, there is a plain pathway to election assurance of the recent election. This path has three basic steps that election officials can take.

  1. Standardized Uniform Election Audit Process
  2. State-Level Review of All Counties’ Audit Records
  3. State Public Release of All Counties Audit Records Once Finalized

The first step is the essential auditing process that should happen in every election in every county. Whether we are talking about the initial count or a recount, it is essential that humans do the required cross-check of the computers’ work, to detect and correct any malfunction, regardless of origin. That cross-check is a ballot-polling audit, where humans manually count a batch of the paper ballots that the computers counted, to see if the human results and machine results match. It has to be a truly random sample, and it needs to be statistically significant, but even in a close election it is far less work than a recount. And it works regardless of how a machine malfunction was caused: hacking, manipulation, software bugs, hardware glitches, or anything else.
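For the technically curious, the cross-check just described can be sketched in a few lines of code. This is a minimal illustration of the idea, not any jurisdiction’s actual procedure; real audits (such as risk-limiting audits) add statistical rules for choosing, and if necessary expanding, the sample size:

```python
import random

def draw_audit_sample(ballot_ids, sample_size, seed):
    """Select a reproducible, truly random sample of ballots to hand-count.

    The seed would come from a public ceremony (e.g. dice rolls), so any
    observer can re-run the selection and confirm it wasn't cherry-picked.
    """
    rng = random.Random(seed)
    return rng.sample(ballot_ids, sample_size)

def compare_counts(hand_counts, machine_counts):
    """Return the IDs of sampled ballots where the human interpretation
    disagrees with the machine interpretation."""
    return [bid for bid in hand_counts
            if hand_counts[bid] != machine_counts[bid]]

# Toy illustration: 10,000 ballots cast, audit a 200-ballot sample.
ballots = list(range(10_000))
sample = draw_audit_sample(ballots, 200, seed="publicly-rolled-dice")

# In a real audit, humans would now examine the paper for each sampled
# ballot; here we pretend both interpretations agree on every ballot.
machine = {bid: "candidate-A" for bid in sample}
hand = dict(machine)
print(compare_counts(hand, machine))  # [] -> no discrepancies found
```

The publicly verifiable random seed is the key property: it is what makes the sample “truly random” in a way observers can check for themselves.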

This first step should already have been taken by each county in Michigan, but at this point it is hard to be certain. Though less work than a recount, a routine ballot polling audit is still real work, and made harder by the current voting technology not aiding the process very well. (Did I mention we need to solve that?)

The second step should be a state-level review of all the records of the counties’ audits. The public needs assurance that every county did its audit correctly, and further, documented the process and its findings. If a county can’t produce detailed documentation and findings that pass muster at the State level, then alas the county will need to re-do the audit. The same would apply if the documentation turned up an error in the audit process, or a significant anomaly in a difference between the human count and the machine count.

That second step is not common everywhere, but the third step would be unusual but very beneficial and a model for the future: when a State is satisfied that all counties’ election results have been properly validated by ballot polling audit, the State elections body could publicly release all the records of all the counties’ audit process. Then anyone could independently come to the same conclusion as the State did, but especially election scientists, data scientists, and election tech experts. I know that Michigan has diligent and hardworking State election officials who are capable of doing all this, and indeed do much of it as part of the process toward the State election certification.

This Needs to Be Solved – and We Are

The fundamental objective for any election is public assurance in the result.  And where the election technology is getting in the way of that happening, it needs to be replaced with something better. That’s what we’re working toward at the OSET Institute and through the TrustTheVote Project.

No one wants the next few years to be dogged by uncertainty about whether the right person is in the Oval Office or the Senate. That will be hard for this election, because of failing voting machines that were not designed for high assurance. But America must say never again, so that two short years from now, and four years from now, we have election infrastructure in place that was designed from the ground up and purpose-built to make it far easier for election officials to deliver election results and election assurance.

There are several matters to address:

  • Meeting the three basic security requirements;
  • Publicly demonstrating the absence of the vulnerabilities in current voting technology;
  • Supporting evidence-based audits that maximize confidence and minimize election officials’ efforts; and
  • Making it easy to publish detailed data in standard formats, that enable anyone to drill down as far as needed to independently assess whether audits really did the job right.

All that and more!

The good news (in a shameless plug for our digital public works project) is that’s what we’re building in ElectOS. It is the first openly public and freely available set of election technology; an “operating system” of sorts for the next generation of voting systems, in the same way that Android is the basis for much of today’s mobile communication and computing.

— John Sebes

Money Shot: What Does a $40M Bet on Scytl Mean?

…not much we think.

Yesterday’s news of Microsoft co-founder billionaire Paul Allen’s investing $40M in the Spanish election technology company Scytl is validation that elections remain a backwater of innovation in the digital age.

But it is not validation that there is a viable commercial market for voting systems of the size that typically attracts venture capital; the market is dysfunctional and small, and governments continue to be without budget.

And the challenge of building a user-friendly, secure online voting system that simultaneously protects the anonymity of the ballot is an interesting problem that only an investor of the stature of Mr. Allen can tackle.

We think this illuminates a larger question:

To what extent should the core technology of the most vital aspect of our Democracy be proprietary and black box, rather than publicly owned and transparent?

To us, that is a threshold public policy question, commercial investment viability issues notwithstanding.

To be sure, it is encouraging to see Vulcan Capital and a visionary like Paul Allen invest in voting technology. The challenges facing a successful elections ecosystem are complex and evolving and we will need the collective genius of the tech industry’s brightest to deliver fundamental innovation.

We at the TrustTheVote Project believe voting is a vital component of our nation’s democracy infrastructure and that American voters expect and deserve a voting experience that’s verifiable, accurate, secure and transparent.  Will Scytl be the way to do so?

The Main Thing

The one thing that stood out to us in the various articles on the investment was Scytl’s comments and assertions about their security, backed by international patents on cryptographic protocols.  We’ve been around the INFOSEC space for a long time and know a lot of really smart people in the crypto field.  So we’re curious to learn more about their IP innovations.  And yet that assertion is actually a red herring to us.

Here’s the main thing: transacting ballots over the public packet-switched network is not simply about security.  It’s also about privacy; that is, the secrecy of the ballot.  Here is an immutable maxim about the digital world of security and privacy: there is an inverse relationship, which holds that as security is increased, privacy must be decreased, and vice versa.  Just consider any airport security experience.  If you want maximum security, then you must surrender a bunch of privacy.  This is the main challenge of transacting ballots across the Internet, and why that transaction is so very different from banking online or looking at your medical record.

And then there is the entire issue of infrastructure.  We continue to harp on this, and still wait for a good answer.  If, by their own admission, the Department of Defense, Google, Target, and dozens of others have challenges securing their own data centers, how exactly can we be certain that a vendor on a cloud-based service model, or an in-house data center of a county or State, has any better chance of doing so?  Security is an arms race.  Consider the news today about Heartbleed alone.

Oh, and please, for the sake of credibility, can the marketing machinery stop using the phrase “military grade security”?  There is no such thing.  And it has nothing to do with increasing key lengths, say from 128-bit symmetric keys to 512- or 1024-bit RSA keys; 128-bit symmetric keys are fine, and there is nothing military about them (other than that the military uses them).  Here is an interesting article from some years ago on the sufficiency of current crypto and the related marketing arms race.  Saying “military grade” is meaningless hype.  Besides, the security issues run far beyond the transit of data between machines.

In short, there is much the public should demand to understand from anyone’s security assertions, international patents notwithstanding.  And that goes for us too.

The Bottom Line

While we laud Mr. Allen’s investment in what surely is an interesting problem, no one should think for a moment that this signals some sort of commercial viability or tremendous growth market opportunity.  Nor should anyone assume that throwing money at a problem will necessarily fix it (or deliver us from the backwaters of Government elections I.T.).  Nor should we assume that this somehow validates Scytl’s “model” for “security.”

Perhaps more importantly, while we need lots of attention, research, development and experimentation, the bottom line to us is whether the outcome should be a commercial proprietary black-box result or an open transparent publicly owned result… where the “result” as used here refers to the core technology of casting and counting ballots, and not the viable and necessary commercial business of delivering, deploying and servicing that technology.

Crowd Sourcing Polling Place Wait Times

Long lines at the polling place are becoming a thorn in our democracy.

We realized a few months ago that our elections technology framework data layer could provide information that when combined with community-based information gathering might lessen the discomfort of that thorn.  Actually, that realization happened while hearing friends extol the virtues of Waze.  Simply enough, the idea was crowd-sourcing wait information to at least gain some insight on how busy a polling place might be at the time one wants to go cast their ballot.

Well, to be sure, lots of people are noodling around lots of good ideas, and there is certainly no shortage of discussion on the topic of polling place performance.  And we’re all aware that the President has taken issue with it and, after a couple of mentions in speeches, created the Bauer-Ginsberg Commission.  So it seems reasonable to assume this idea of engaging some self-reporting isn’t entirely novel.

After all, it’s kewl to imagine being able to tell – in real time – what the current wait time at the polling place is, so a voter can avoid the crowds, or a news organization can track the hot spots of long lines.  We do some “ideating” below, but first I offer three observations from our noodling:

  • It really is a good idea; but
  • There’s a large lemon in it; yet
  • We have the recipe for some decent lemonade.

Here’s the Ideation Part

Wouldn’t it be great if everybody could use an app on their smarty phone to say, “Hi all, it’s me, I just arrived at my polling place, the line looks a bit long,” and then later, “Me again, OK, just finished voting, and geesh, like 90 minutes from start to finish… not so good,” or “Me again, I’m bailing.  Need to get to the airport.”

And wouldn’t it be great if all that input from every voter was gathered in the cloud somehow, so I could look up my polling place and see the wait time, the trend line of wait times, the percentage of my precinct’s non-absentee voters who have already voted, and other helpful stuff?  And wouldn’t it be interesting if the news media could show a real-time view across a whole county or State?
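For flavor, here is a minimal sketch of the cloud-side aggregation imagined above (all names are hypothetical, invented for illustration).  Using the median of reported waits, rather than the mean, even buys a little natural resistance to a handful of bogus reports:

```python
from collections import defaultdict
from statistics import median

class WaitTimeBoard:
    """Hypothetical aggregator for self-reported polling-place wait times.

    Each report pairs an arrival and a departure time (in minutes since
    polls opened) for one anonymous voter at one polling place.
    """

    def __init__(self):
        # polling place id -> list of observed wait durations (minutes)
        self._waits = defaultdict(list)

    def report(self, place_id, arrived_min, finished_min):
        """Record one voter's start-to-finish experience."""
        if finished_min < arrived_min:
            raise ValueError("departure precedes arrival")
        self._waits[place_id].append(finished_min - arrived_min)

    def current_wait(self, place_id):
        """Median of reported waits, or None if no reports yet.

        The median shrugs off a few outlier (or dishonest) reports in a
        way a simple average would not.
        """
        waits = self._waits.get(place_id)
        return median(waits) if waits else None

board = WaitTimeBoard()
board.report("precinct-12", arrived_min=60, finished_min=150)  # 90-min wait
board.report("precinct-12", arrived_min=70, finished_min=100)  # 30-min wait
print(board.current_wait("precinct-12"))  # 60.0 -> median of [90, 30]
```

Of course, a robust statistic alone doesn’t solve the abuse problem discussed next; it just raises the bar a little.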

Well, if you’re reading this, I bet you agree, “Yes, yes it would.”  Sure.  Except for one thing.  To be really useful it would have to be accurate.  And if there is a question about accuracy (ah shoot, ya know where this is going, don-cha?) Yes, there is always that Grinch called “abuse.”

Sigh. We know from recent big elections that apparently, partisan organizations are sometimes willing to spend lots of money on billboard ads, spam campaigns, robo-calls, and so on, to actually try to discourage people from going to the polls, within targeted locales and/or demographics. So, we could expect this great idea, in some cases, to fall afoul of similar abuse.  And that’s the fat lemon.

But, please read on.

Now, we can imagine some frequent readers spinning up to accuse us of wanting everything to be perfectly secure, of letting the best be the enemy of the good, and noting that nothing will ever be accomplished if first every objection must be overcome. On other days, they might be right, but not so much today.

We don’t believe this polling place traffic monitoring service idea requires the invention of some new security, integrity, or privacy stuff.  On the other hand, relying on the honor system is probably not right either.  Instead, we think that in real life something like this would have a much better chance of launch and sustained benefit if it were based on some existing model of voters doing mobile computing in a responsible way that’s not trivial to abuse the way the honor system is.

And that leads us to the good news – you see, we have such an existing model, in real life. That’s the new ingredient, along with that lemon above, and a little innovative sugar, for the lemonade that I mentioned.

Stay tuned for Part 2, and while waiting you might glance at this.

Do Trade Secrets Hinder Verifiable Elections? (Duh)

Slate Magazine posted an article this week which, in sum and substance, suggests that trade secret law makes it impossible to independently verify that voting machines are working correctly.  In short, we say, “Really? And is this a recent revelation?”

Of course, those who have followed the TrustTheVote Project know that we’ve been suggesting this in so many words for years.  I appreciate that author David Levine refers to elections technology as “critical infrastructure.”  We’ve been suggesting the concept of “critical democracy infrastructure” for years.

To be sure, I’m gratified to see this article appear, particularly as we head into what appears to be the closest presidential election since 2000.  The article is totally worth a read, but here is an excerpt worth highlighting from Levine’s essay:

The risk of the theft (known in trade secret parlance as misappropriation) of trade secrets—generally defined as information that derives economic value from not being known by competitors, like the formula for Coca-Cola—is a serious issue. But should the “special sauce” found in voting machines really be treated the same way as Coca-Cola’s recipe? Do we want the source code that tells the machine how to register, count, and tabulate votes to be a trade secret such that the public cannot verify that an election has been conducted accurately and fairly without resorting to (ironically) paper verification? Can we trust the private vendors when they assure us that the votes will be assigned to the right candidate and won’t be double-counted or simply disappear, and that the machines can’t be hacked?

Well, we all know (as he concludes) that all of the above have either been demonstrated to be a risk or have actually transpired.  The challenge is that the otherwise legitimate use of trade secret law ensures that the public has no way to independently verify that voting machinery is properly functioning, as was discussed in this Scientific American article from last January (also cited by Levine.)

Of course, what Levine is apparently not aware of (probably our bad) is that there is an alternative approach on the horizon,  regardless of whether the government ever determines a way to “change the rules” for commercial vendors of proprietary voting technology with regard to ensuring independent verifiability.

As a recovering IP lawyer, I’ll add one more thing we’ve discussed within the TrustTheVote Project and the Foundation for years: this is a reason that patents — including business method patents — are arguably helpful.  Patents are about disclosure and publication; trade secrets, by definition, are not.  Of course, a patent alone would not be sufficient, because within the intricacies of a patent prosecution there is an allowance that requires only partial disclosure of software source code.  That “partial disclosure” must meet a test of sufficiency for one “reasonably skilled in the art” to “independently produce the subject matter of the invention.”  And therein lie the wonderful mushy grounds on which to argue a host of issues if put to the test.  But ironically, the intention of partial code disclosure is to protect trade secrets while still facilitating a patent prosecution.

That aside, I also note that in the face of all the nonsense floating about in the blogosphere and mainstream media, whether charges that Romney’s ownership interest in voting machinery companies is a pathway to stealing an election, or suggestions of a Soros-connected, Spanish-based voting technology company’s conspiracy to deliver tampered tallies, Levine’s article is a breath of fresh air deserving the attention ridiculously lavished on these latest urban myths.

Strap in… T-minus 12 days.  I fear a nail-biter from all viewpoints.

GAM|out

Public Benefit from Online Voter Registration?

Some feedback on a couple recent blogs showed that I didn’t do such a great job on defining how our OVR work creates public benefit. So let me try again, with thanks to a canny reader who pointed out the subtlety involved.

But first, let me restate what our OVR work is: online voter registration assistance technology for NGOs like RockTheVote and government organizations like state and local boards of election. Through our work with RockTheVote, a large and expanding number of good government groups and other NGOs can quickly get an OVR system of their own, without deploying software or operating computers; and some can take advantage of options to largely re-work the appearance of the OVR web application, and/or integrate it with mobile clients and social media. We’re also helping drive registrants to government organizations, for those states with strong online voter registration systems that have requested that the Rocky OVR system give users the option of registering with the state board of elections. Then, out at the bleeding edge, it is even possible for local or state election officials to piggyback on the OVR system to have their own 100% election-official-managed online voter registration assistance system, with the same look and feel as other county or state web sites, and all without any procurement or deployment.

So, fair enough, we’re the technology provider in a mix of many organizations that either want to help people register to vote (NGOs) or have a basic mission of helping people register — county registrars and state election officials. So where is the public benefit? And where is the subtlety that I mentioned? Many people would say that, in a broad way, the public as a whole benefits when more eligible voters are registered and participate in elections — but not all. In fact, that is a political issue that we at OSDV want to steer clear of, especially given the political conflicts between some, who wish to aggressively register people in droves and are more concerned about participation than eligibility, and others, who are concerned about possible fraud and are more concerned about eligibility than participation. The debate about voter registration practices goes from one extreme, where an election is tainted if it seems that a single eligible voter was barred from participation, to the other extreme, where an election is tainted if there is a suspicion of a single ineligible person having cast a ballot.

So where do public benefits arise, separately from these political issues? In a word: access, from a citizen perspective; and duty, from an election official perspective. Every eligible citizen deserves and is entitled to access to elections. It is the duty of election officials to provide that access to the eligible citizens who demand it, and to fairly and expeditiously assess every request for eligibility. Whether or not one is a fan of voter registration drives, or of voter roll purging, there is this shared value: eligible citizens who are trying to participate in elections should not have that access blocked by election officials. Yet in many cases that does occur, because well-meaning public officials simply lack the resources, staff, or budget to be responsive to citizen needs. In OSDV’s wheelhouse, the lack that we address is the lack of election technology, or the lack of an effective way to acquire and deploy relevant technology.

And the technology angle is particularly important for younger citizens, who have been using computers and smart phones for practically everything their whole lives. Network and mobile technology is in fact appropriate for registration and all manner of other voter services — unlike voting, which has unique anonymity and integrity requirements — and so people expect it. Many election officials use technology to help them more effectively carry out their duties, meeting those expectations, including those relating to voter registration. But for other election officials, there is a gap between what they need and what they are actually able to do within the limitations of budget, procurement, and staff, or with products that simply don’t provide the functions appropriate to their jurisdiction. So the gap has multiple dimensions, but across them all, government officials are doing less than they could in performance of their duty to provide election access to those who are actively seeking it and are eligible.

So when we or anyone else helps to fill that gap with new or better or more available technology, then we have enabled public benefit: election officials can do more in spite of having fewer resources every year; entitled voters can vote; and thirdly and often overlooked, good government groups and watchdog agencies have more visibility to assess how well the election officials really are doing their job. And that third factor is quite important. Just look at the horror-show of suspicion, vituperation, conspiracy theory, litigation, and Internet-speed dis- or mis-information that spun up recently in Tallahassee and Memphis and elsewhere, over removal of people from voter rolls. It may be that nefarious people really were rigging the poll books, or it may be that the electronic voter records are in significant disarray, or it may be that voter record databases are antique and prone to administrative error. But we’ll never really know. Resource-constrained election organizations, that run old election technology with demonstrated flaws and little or no self-record-keeping, find it extremely difficult to demonstrate to interested and entitled observers exactly what is going on inside the computers, when one of these election year firestorms brews up.

And when the firestorm is big enough, it essentially prevents election officials from delivering on a fundamental duty: performing accurate and trustworthy elections. In other words, those firestorms are also a detriment to public confidence in elections. We, in addition to helping election officials perform their duties, are also passionate about delivering technology that can help with the transparency that’s part of firestorm prevention, and that reduces the damage firestorms do to public confidence.

And lastly that brings me to a related point for another day: how the technology that we’re developing now can help deliver that transparency, along with the improvement in the technical infrastructure for U.S. elections. The next chunk is still in the oven, but I really look forward to sharing it here, when it is fully baked.

— EJS

Spokane County Ballot Copying — Problem?

Here is some interesting news from Spokane WA, where ballot counting has been seriously delayed because election officials are hand copying tens of thousands of ballots. It’s an interesting lesson in how vote-by-mail (Spokane is an all-VBM county in WA) creates higher operational requirements for accountability, transparency, and election integrity.

Some readers may not be familiar with the practice of hand-copying VBM ballots, and may ask: what’s going on? The situation is that for various reasons (read the news article for speculation on why), thousands of Spokane voters did not follow the instructions for marking their ballot, for example, putting a check mark over a bubble rather than filling in the bubble. If a paper ballot has even one of these mistakes anywhere, the ballot can’t be machine counted — the optical counting device kicks the ballot back out. And because this is vote-by-mail, where the voter is not present during counting, there is no voter to ask to re-do the ballot. Instead, local election officials (LEOs) have to simply guess what the voter meant.

This is called “interpreting the voter’s intent” in order to count every vote that the LEOs think that the voter cast on the ballot. After making such an interpretation of a ballot, an LEO marks a new blank ballot, copying all the voter’s marks to tidy filled-in bubbles that the scanners will count. After all the uncountable ballots have been copied to countable ballot-copies, the vote counting can finally proceed.

I’ve said many times that election technology should provide (and as our efforts at TTV bear fruit, will provide) support for such interpretation, and do so with as much logging and transparency as possible. I think that most people would agree that confidence in an election result depends in part on knowing how many votes were created by LEOs on behalf of a voter, rather than from a voter’s mark so unambiguous that a machine can recognize it. Such automation might also reduce the need for laborious copying, preserving for all to see an image of the original ballot together with the interpretation provided by LEOs during the counting process.
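To make the logging idea concrete, here is a minimal sketch of what one audit-log entry for voter-intent interpretation might look like. This is purely illustrative — the record fields, names, and `summarize` helper are my assumptions, not TTV’s actual design:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class AdjudicationRecord:
    """One hypothetical audit-log entry linking a voter's original ballot
    to the LEO-transcribed copy that the scanner actually counts."""
    ballot_id: str              # anonymous serial; never linked to a voter identity
    original_image_ref: str     # archived scan of the voter's marked ballot
    transcribed_ballot_id: str  # serial of the clean copy marked by the LEO
    interpreted_marks: dict     # contest -> choice, as the LEO read them
    adjudicators: list          # LEO plus independent observers present
    timestamp: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat())

def summarize(log):
    """Report how many counted votes originated from LEO interpretation,
    for publication alongside the election results."""
    return {"transcribed_ballots": len(log),
            "interpreted_votes": sum(len(r.interpreted_marks) for r in log)}
```

With records like these, observers could see at a glance how many counted votes passed through human interpretation, and trace any one of them back to the original ballot image.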

But the scale of the Spokane operation really has me squirming. Tens of thousands! I mean, sure, I believe that the process is being done diligently, with intense scrutiny by people independent of the LEOs (members of the public, good government groups, political party people). But over days and days of effort, under pressure to get the election results out, I fear that exhaustion and human error may take a toll. And unless the public (or at least auditors) have access to each ballot in all 3 forms (what the voter provided, what an LEO transcribed, what the scanner counted), it is going to be very hard to determine whether this large-scale transcription process introduced errors. If this process were happening, for example, in New York with several very close contests, I could see people pushing for a hand re-count. Let’s hope that in WA the margins of victory are larger than the errors that could have been introduced by transcription.

And in the meantime, I wish the best to Spokane LEOs plowing through this mound of uncountable paper, and I continue to squirm, wishing we had already finished the TTV central-count technology that could really help today.

— EJS

Dust Settles on Election Results, But Not Voting System Troubles

It should come as no surprise that this month’s election activities included claims of voting machine malfunction and related investigation and litigation. In many parts of the U.S., the voting systems used this month are the same flakey systems that in the past have created controversy and legal wrangling. (I promise to define “flakey”.)

But are new lessons learned? Or is this more of the same underwhelming voting technology experience that observers have come to expect? I think that, yes, there are new lessons learned. North Carolina is the source of one set of teachable remarks, shown in two statements made in the context of North Carolina’s voting machine controversy in this election.

The background is that in some parts of NC there were numerous reports of touch-screen voting machines apparently malfunctioning, swapping voter selections from what the voter intended to selections that they hadn’t made. (Some people call this “vote flipping” but I find it to be a misleading term that doesn’t cover the extensive range of odd touch-screen behavior.) The NC GOP claimed that these glitches seemed to favor Democratic candidates over Republican candidates, and started some interesting litigation.

The first notable statement was from NC GOP chair Tom Fetzer in the context of starting the litigation:

We cannot have an election where voters in counties where the machines are used have less confidence that their votes are being accurately counted than in counties where optical scan ballots are used …

The second is from Johnnie McLean, deputy director of the State Board of Elections, at the conclusion of the litigation:

I hope this is the end of the issue. We have every confidence in the voting systems North Carolina has and I’ve seen no evidence that we should feel differently.

I really find these to be curious statements that nevertheless cast some new light on the existing decades-old touch screen systems. With respect to Mr. Fetzer, I don’t think that one kind of voting machine is inherently more reliable than another — though people may have a more confident feeling about one over the other. Both the optical scanners and the touch-screen DREs are computers running software with bugs, and it’s possible that either could be mis-counting votes. Both can and should be cross-checked in the same way with statistical audits using hand-counting of either the scanned paper ballots, or the paper record produced by the DRE.
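To illustrate what such a cross-check looks like in practice, here is a simplified sketch of the core of a statistical audit: publicly sampling ballot batches and comparing hand counts against machine counts. The function names and the fixed sample size are my illustrative assumptions; real statistical audits derive the sample size from the reported margin and a chosen risk limit:

```python
import random

def audit_sample(batch_ids, sample_size, seed):
    """Choose a reproducible random sample of ballot batches for
    hand-count comparison. The seed should come from a public
    ceremony (e.g., dice rolls) so observers can verify the draw."""
    rng = random.Random(seed)
    return sorted(rng.sample(batch_ids, sample_size))

def discrepancies(machine_counts, hand_counts):
    """Compare machine and hand tallies per sampled batch.
    Any mismatch escalates toward a fuller hand count."""
    return {batch: (machine_counts[batch], hand_counts[batch])
            for batch in hand_counts
            if machine_counts[batch] != hand_counts[batch]}
```

The point is that this procedure is identical whether the batches came from an optical scanner or from the paper trail of a touch-screen DRE — the audit checks the computer’s work either way.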

Old news: every kind of voting machine is a computer that should not be blindly trusted to operate correctly. New news: that fact is not altered if some people think that one system is more flakey than others. With respect to Ms. McLean, people will unavoidably “feel differently” if they see touch screens mis-behaving.

Next time … “flakey” defined, and a full response to Ms. McLean.

— EJS

Recapping The OSCON O’Reilly Radar Conversation

A couple of weeks ago I presented at OSCON, and during the conference had an opportunity to sit down with Mac Slocum, Managing Editor for the O’Reilly Radar.  We had about a half-hour conversation, roughly 20 minutes of which was on camera.  You can find it here if you want to watch me jaw.  But perhaps simpler, below I’ve listened to the tape and captured the essence of my answers to Mac’s questions about what the Foundation is about, what it is working on, and the like.  I promised Matt Douglass, our Public Relations Director, that I’d get this up for interested followers; apologies that it took me a couple of weeks.

So, here it is; again, not an official transcript, but a compilation of my answers after watching and listening to the video interview about a dozen times (so you don’t have to), combined with my recollection of my remarks as closely as I can recall them – expressed and intended.

O’Reilly: How are voting systems in the U.S. currently handled?  In other words, where do they come from; procurement process; who decides/buys; etc.?

Miller: Voting systems are currently developed and delivered by proprietary systems vendors, and procured by local election jurisdictions such as counties and townships. The States’ role is to approve specific products for procurement, often requiring products to have completed a Federal certification process overseen by the EAC.  However, the counties and local elections jurisdictions make the vast majority of elections equipment acquisition decisions across the country.

O’Reilly: So how many vendors are there?  Or maybe more to the point, what’s the state of the industry; who are the players; and what’s the innovation opportunity, etc.?

Miller: Most of the U.S. market is currently served by just 3 vendors.  You know, as we sit here today, just two vendors control some 88% of America’s voting systems infrastructure, and one of them has a white-knuckled grip on 75% of that.  Election Systems & Software is the largest, after having acquired Premier from its parent company, Diebold.  The DoJ interceded on that acquisition under a mandatory Hart-Scott-Rodino Act review to consider potential anti-trust issues.  In their settlement with ES&S, the Company dealt off a portion of their technology (and presumably customers) to the Canadian firm Dominion Systems.  Dominion was a small player in the U.S. until recently, when it acquired those technology assets of Premier (as part of the DoJ settlement) and acquired the other former market force, Sequoia.  And that resulted in consolidating approximately 12% of the U.S. market. Most of the remaining U.S. market is served by Hart-Intercivic Systems.

On the one hand, I’d argue that the voting systems marketplace is so dysfunctional and malformed that there is no incentive to innovate, and at worst a perverse disincentive to innovate, and therefore really not much opportunity.  At least that’s what we really believed when we started the Foundation in November 2006.  Seriously, for the most part any discussion about innovation in this market today amounts to a discussion of ensuring spare parts for what’s out there.  But what really catalyzed us was the belief that we could inject a new level of opportunity… a new infusion of innovation.  So, we believe part of the innovation opportunity is demonstrated by the demise of Premier and Sequoia: the U.S. elections market is not large or uniform enough to support a healthy eco-system of competition and innovation.  So the innovation opportunity is to abandon the proprietary product model, develop new election technology in a public benefits project, and work directly with election officials to determine their actual needs.

O’Reilly: So what is the TrustTheVote Project, and how does that relate to the Foundation?

Miller:  The Open Source Digital Voting Foundation is the enabling 501(c)(3) public benefits corporation that funds and manages projects to develop innovative, publicly owned open source elections and voting technology.  The TrustTheVote Project is the flagship effort of the Foundation to design and develop an entirely new ballot eco-system.

What we’re making is an elections technology framework built on breakthrough innovations in elections administration and management and ballot casting and counting that can restore trust in how America votes.  Our design goal is to truly deliver on the four legs of integrity in elections: accuracy, transparency, trust, and security.

The reason we’re doing this is simple: this is the stuff of critical democracy infrastructure – something far too much of a public asset to privatize.  We need to deliver what the market has so far failed to deliver.  And we want to re-invent that industry – based on a new category of entrants – systems integrators who can take the open source framework, integrate it with qualified commodity hardware, and stand it up for counties and elections jurisdictions across the country.

We’re doing this with a small full time team of very senior technologists and technology business executives, as well as contractors, academia, and volunteer developers.

We’re 4 years into an 8 year undertaking – we believe the full framework will be complete and should be achieving widespread adoption, adaptation, and deployment by the close of 2016 – done right, it can impact the national election cycle that year.  That said, we’re under some real pressure to expedite this because it turns out that a large number of jurisdictions will be looking to replace their current proprietary systems over the next 4 years as well.

O’Reilly:  How can open source really improve the voting system?

Miller:  Well, open source is not a panacea, but we think it’s an important enabler to any solution for the problems of innovation, transparency, and cost that burden today’s elections.  Innovation is enabled by the departure from the proprietary product model, including the use of open-source licensing of software developed in a public benefits project. Transparency, or open-government features and capabilities of voting systems, are largely absent and require innovation that the current market does not support. Cost reduction can be enabled by an open-source-based delivery model in which procurements allow system integrators to compete to deliver license-free voting systems, coupled with technical support that lacks the vendor lock-in of current procurements. Open source software doesn’t guarantee any of these benefits, but it does enable them.

I should point out, too, that one of our deepest commitments is to elections verification and auditability.  And our framework, based on an open standards common data format using an XML-based markup language called EML, is the foundation on which we can deliver that.  Likewise, I should point out our framework is predicated on a durable paper ballot of record… although we haven’t talked about the pieces of the framework yet.
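The value of a common data format is that every counted ballot can be expressed as a structured, machine-checkable record that auditors can compare against the paper ballot of record. As a rough illustration only — the element names below are simplified placeholders of my own, not the actual OASIS EML schema — such a record could be generated like this:

```python
import xml.etree.ElementTree as ET

def cast_vote_record(ballot_serial, contests):
    """Build an illustrative, EML-style XML record for one counted ballot.
    Element and attribute names here are hypothetical simplifications,
    not the real EML vocabulary."""
    root = ET.Element("CastVoteRecord", BallotSerial=ballot_serial)
    for contest, selection in contests.items():
        c = ET.SubElement(root, "Contest", Name=contest)
        ET.SubElement(c, "Selection").text = selection
    return ET.tostring(root, encoding="unicode")

# A record like this can be validated against a published schema and
# cross-checked against the durable paper ballot carrying the same serial.
record = cast_vote_record("B-0001", {"Mayor": "Smith"})
```

Because the format is open and standard, any observer’s tooling — not just the vendor’s — can parse and verify the records.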

O’Reilly: Well, our time is limited, but you must know I can’t resist this last question, which is probably controversial but which our audience is really curious about.  Will online voting ever be viable?

Miller: Well, to be intellectually honest, there are two parts to that loaded question.  Let me leave my personal opinion and the position of the Foundation out of it at first, so I just address the question in a sterile light.

First, online voting is already viable in other countries that have these 3 policy features: [1] a national ID system, [2] uniform standards for nationwide elections, and [3] a history of encouraging remote voting by mail rather than in-person voting. These countries also fund the sophisticated centralized IT infrastructure required for online voting, and have accepted the risks of malware and other Internet threats as acceptable parts of nationwide online voting.   For a similar approach to be viable in the U.S., those same 3 policy features would likely require some huge political innovations, at the 50-plus state level, if not the Federal level.   There really isn’t the political stomach for any of that: particularly a national ID (although arguably we already have one) or creating national elections and voting standards, let alone building a national elections system infrastructure.  In fact, the National Association of Secretaries of State recently passed (actually, re-upped) an earlier resolution to work to sunset the Federal Election Assistance Commission.  In other words, there is a real Federalist sense about elections.  So, on this first point of socio-political requirements alone, I don’t see it viable any time soon.

But letting our opinion slip into this, the Foundation believes there is a more important barrier from a technical standpoint.  There are flat-out technical barriers that have to be cleared, involving critical security and privacy issues at the edge and at the core of a packet-switched solution. Furthermore, building the kind of hardened data center required to transact voting data is far beyond the financial reach of the vast majority of jurisdictions in the country.  Another really important point is that online elections are difficult if not impossible to audit or verify.  And finally, there is a current lack of sophisticated IT resources in most of the thousands of local elections offices that run elections in the U.S.

So, while elections remain a fundamentally local operation for the foreseeable future, and while funding for elections remains at current levels, and until the technical problems of security and privacy are resolved, nationwide online voting seems unlikely in the U.S.

That said, we should be mindful that the Internet cloud has darkened the doorstep of nearly every aspect of society as we’ve moved from the 2nd age of industrialism to the 3rd age of digitalism.  And it seems a bit foolish to assume that the Internet will not impact the conduct of elections in years to come.  We know there is a generation out there now that is maturing having never known any way to communicate, find information, shop, or do anything else other than online.  They live in an always-on society and expect to be able to do everything they need to interact with their government online.  Whether that’s a reasonable expectation I don’t think is the issue.

But I think it will be important for someone to figure out what’s possible in the future – we can’t run and hide from it, but I believe we’re nowhere near being able to securely and verifiably use the Net for elections.  There is some very limited use in military and overseas settings, but it needs to be restricted to venues like that until the integrity issues can be ironed out.

So, we’re not supporters of widespread use of the Internet for voting and we don’t believe it will be viable in the near future on a widespread basis.  And honestly, we have too much to do in just improving upon ballot casting and counting devices in a polling place setting to spend too many cycles thinking about how to do this across the Internet.

-GAM|out