By Gregory Miller

PCEA Report Finally Out: The Real Opportunity for Innovation Inside

This week the PCEA finally released its long-awaited report to the President.  It’s loaded with good recommendations.  Over the next several days or posts we’ll give you our take on some of them.  For the moment, we want to call your attention to a couple of underpinning elements now that it’s done.

The Resource Behind the Resources

Early in the formation of what initially was referred to as the “Bauer-Ginsberg Commission” we were asked to visit the co-chairs in Washington D.C. to chat about technology experts and resources.  We have a Board member who knows them both and when asked we were honored to respond.

Early on we advised the Co-Chairs that their research would be incomplete without speaking with several election technology experts, and of course they agreed.  The question was how to create a means to do so and not bog down the progress governed by layers of necessary administrative regulations.

I take a paragraph here to observe that I was very impressed in our initial meeting with Bob Bauer and Ben Ginsberg.  Despite being polar political opposites, they demonstrated how Washington should work: they were respectful, collegial, sought compromise to advance the common agenda, and seemed intent on checking politics at the door in order to get work done.  It was refreshing and restored my faith that somewhere in the District there remains a potential for government to actually work for the people.  I digress.

We advised them that the CalTech-MIT Voting Project would definitely be one resource they could benefit from.

We offered our own organization, but with our tax exempt status still pending, it would be difficult politically and otherwise to rely on us much in a visible manner.

So the Chairs asked us if we could pull together a list — not an official subcommittee mind you, but a list of the top “go to” minds in the elections technology domain.  We agreed and began a several-week process of vetting a list that needed to be winnowed down to about 20 for manageability.  These experts would be brought in individually or collectively as desired — it was to be figured out later which would be most administratively expedient.  Several of our readers, supporters, and those who know us were aware of this confidential effort.  The challenge was lack of time to run the entire process of public recruiting and selection.  So, they asked us to help expedite that, having determined we could gather the best in short order.

And that was fine because anyone was entitled to contact the Commission, submit letters and comments and come testify or speak at the several public hearings to be held.

So we did that.  And several of that group were in fact utilized.  Not everyone though, and that was kind of disappointing, but a function of the timing constraints.

The next major resource we advised they had to include besides CalTech-MIT and a tech advisory group was Rock The Vote.  And that was because (notwithstanding their being a technology partner of ours) Rock The Vote has its ear to the rails of new and young voters, starting with their registration experience and initial opportunity to cast their ballot.

Finally we noted that there were a couple of other resources they really could not afford to overlook, including the Verified Voting Foundation, L.A. County’s VSAP Project, and Travis County’s StarVote Project.

The outcome of all of that brings me to the meat of this post about the PCEA Report and our real contribution.  Sure, we had some behind the scenes involvement as I describe above.  No big deal.  We hope it helped.

The Real Opportunity for Innovation

But the real opportunity to contribute came in the creation of the PCEA Web Site and its resource toolkit pages.

On that site, the PCEA took our advice and chose to utilize Rock The Vote’s open source voter registration tools and specifically the foundational elements the TrustTheVote Project has built for a State’s Voter Information Services Portal.

Together, Rock The Vote and the TrustTheVote Project are able to showcase the open source software that any State can adopt, adapt, and deploy — for free (at least the adoption part) and without having to reinvent the wheel by paying for a ground-up custom build of their own online voter registration and information services portal.

We submit that this resource on their PCEA web site represents an important ingredient to injecting innovation into a stagnant technology environment of today’s elections and voting systems world.

For the first time, there is production-ready open source software available for an important part of an elections official’s administrative responsibilities that can lower costs, accelerate deployment and catalyze innovation.

To be sure, it’s only a start — it’s the lower-hanging fruit of an election technology platform that doesn’t require any sort of certification.  With our exempt status in place, and lots of things happening we’ll soon share, there is more, much more, to come.  But this is a start.

There are 112 pages of goodness in the PCEA report.  And there are some elements in there that deserve further discussion.  But we humbly assert it’s the availability of some open source software on their resource web site that we think represents a quiet breakthrough in elections technology innovation.

The news has been considerable.  So, yep, we admit it.  We’re oozing pride today.
And we owe it to your continued support of our cause.
Thank you!

GAM | out

Comments Prepared for Tonight’s Elections Technology Roundtable

This evening at 5:00pm, members of the TrustTheVote Project have been invited to attend an elections technology roundtable discussion in advance of a public hearing in Sacramento, CA, scheduled for tomorrow at 2:00pm PST, on new regulations governing Voting System Certification to be contained in Division 7 of Title 2 of the California Code of Regulations.

Due to the level of activity, only our CTO, John Sebes is able to participate.

We were asked if John could be prepared to make some brief remarks regarding our view of the impact of SB-360 and its potential to catalyze innovation in voting systems.  These types of events are always dynamic and fluid, and so we decided to publish our remarks below just in advance of this meeting.

Roundtable Meeting Remarks from the OSDV Foundation | TrustTheVote Project

We appreciate an opportunity to participate in this important discussion.  We want to take about 2 minutes to comment on 3 considerations from our point of view at the TrustTheVote Project.

1. Certification

For SB-360 to succeed, we believe any effort to create a high-integrity certification process requires re-thinking how certification has been done to this point.  Current federal certification, for example, takes a monolithic approach; that is, a voting system is certified based on a complete all-inclusive single closed system model.  This is a very 20th century approach that makes assumptions about software, hardware, and systems that are out of touch with today’s dynamic technology environment, where the lifetime of commodity hardware is months.

We are collaborating with NIST on a way to update this outdated model with a “component-ized” approach; that is, a unit-level testing method, such that if a component needs to be changed, the only re-certification required would be of that discrete element, and not the entire system.  There are enormous potential benefits including lowering costs, speeding certification, and removing a bar to innovation.

We’re glad to talk more about this proposed updated certification model, as it might inform any certification processes to be implemented in California.  Regardless, elections officials should consider that in order to reap the benefits of SB-360, the non-profit TrustTheVote Project believes a new certification process, component-ized as we describe it, is essential.

2. Standards

Second, there is a prerequisite for component-level certification that until recently wasn’t available: common open data format standards that enable components to communicate with one another; for example, a format for a ballot-counter’s output of vote tally data that also serves as input to a tabulator component.  Without common data formats, elections officials have to acquire a whole integrated product suite that communicates in a proprietary manner.  With common data formats, you can mix and match; and perhaps more importantly, incrementally replace units over time, rather than doing what we like to refer to as “forklift upgrades” or “fleet replacements.”
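To make the mix-and-match point concrete, here is a minimal sketch in Python.  The record layout (“format_version”, “precinct”, “tallies”) is entirely hypothetical; the real common formats are the ones the standards bodies are hammering out.  But it shows the idea: a counter emits tallies in a shared format, and any conforming tabulator component can consume them.

```python
import json

# Hypothetical illustration only: the actual common data formats are being
# defined by the standards bodies noted above.  This sketch just shows the
# mix-and-match idea: a ballot counter emits tallies in a shared format,
# and any tabulator that speaks the same format can consume them.

def counter_output(precinct_id, tallies):
    """A ballot counter serializes its per-precinct vote tallies."""
    return json.dumps({
        "format_version": "1.0",
        "precinct": precinct_id,
        "tallies": tallies,  # contest -> {candidate: votes}
    })

def tabulate(feeds):
    """Any tabulator component can aggregate counter feeds it understands."""
    totals = {}
    for feed in feeds:
        record = json.loads(feed)
        for contest, counts in record["tallies"].items():
            bucket = totals.setdefault(contest, {})
            for candidate, votes in counts.items():
                bucket[candidate] = bucket.get(candidate, 0) + votes
    return totals

feeds = [
    counter_output("P-101", {"Mayor": {"Ann": 120, "Bob": 98}}),
    counter_output("P-102", {"Mayor": {"Ann": 75, "Bob": 110}}),
]
print(tabulate(feeds))  # {'Mayor': {'Ann': 195, 'Bob': 208}}
```

Swap in a different counter or tabulator that speaks the same format, and nothing else has to change — which is the whole point of the standard.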

The good news is the scope for ballot casting and counting is sufficiently focused to avoid distraction from the many other standards elements of the entire elections ecosystem.  And there is more goodness because standards bodies are working on this right now, with participation by several state and local election officials, as well as vendors present today, and non-profit projects like TrustTheVote.  They deserve congratulations for reaching this imperative state of data standards détente.  It’s not finished, but the effort and momentum is there.

So, elections officials should bear in mind that benefits of SB-360 also rest on the existence of common open elections data standards.

3. Commercial Revitalization

Finally, this may be the opportunity to realize a vision we have that open data standards, a new certification process, and lowered bars to innovation through open sourcing, will reinvigorate a stagnant voting technology industry.  Because the passage of SB-360 can fortify these three developments, there can (and should) be renewed commercial enthusiasm for innovation.  Such should bring about new vendors, new solutions, and new empowerment of elections officials themselves to choose how they want to raise their voting systems to a higher grade of performance, reliability, fault tolerance, and integrity.

One compelling example is the potential for commodity commercial off-the-shelf hardware to fully meet the needs of voting and elections machinery.  To that point, let us offer an important clarification and dispel a misconception about rolling your own.  This does not mean that elections officials are about to be left to self-vend.  And by that we mean self-construct and support their open, standard, commodity voting system components.  A few jurisdictions may consider it, but in the vast majority of cases, the Foundation forecasts that this will simply introduce more choice rather than forcing you to become a do-it-yourself type.  Some may choose to contract with a systems integrator to deploy a new system integrating commodity hardware and open source software.  Others may choose vendors who offer out-of-the-box open source solutions in pre-packaged hardware.

Choice is good: it’s an awesome self-correcting market regulator and it ensures opportunity for innovation.  To the latter point, we believe initiatives underway like STAR-vote in Travis County, TX, and the TrustTheVote Project will catalyze that innovation in an open source manner, thereby lowering costs, improving transparency, and ultimately improving the quality of what we consider critical democracy infrastructure.

In short, we think SB-360 can help inject new vitality in voting systems technology (at least in the State of California), so long as we can realize the benefits of open standards and drive the modernization of certification.

 

EDITORIAL NOTES:
There was chatter earlier this Fall about the extent to which SB-360 allegedly makes unverified non-certified voting systems a possibility in California.  We don’t read SB-360 that way at all.  We encourage you to read the text of the legislation as passed into law for yourself, and start with this meeting notice digest.  In fact, to realize the kind of vision that leading jurisdictions imagine, we cannot, and should not, eliminate certification, and we think charges that this is what will happen are misinformed.  We simply need to modernize how certification works to enable this kind of innovation.  We think our comments today bear that out.

Moreover, have a look at the Agenda for tomorrow’s hearing on implementation of SB-360.  In sum and substance the agenda is to discuss:

  1. Establishing the specifications for voting machines, voting devices, vote tabulating devices, and any software used for each, including the programs and procedures for vote tabulating and testing. (The proposed regulations would implement, interpret and make specific Section 19205 of the California Elections Code.);
  2. Clarifying the requirements imposed by recently chaptered Senate Bill 360, Chapter 602, Statutes 2013, which amended California Elections Code Division 19 regarding the certification of voting systems; and
  3. Clarifying the newly defined voting system certification process, as prescribed in Senate Bill 360.

Finally, there has been an additional charge that SB-360 is intended to “empower” LA County, such that what LA County may build they (or someone on their behalf) will sell the resulting voting systems to other jurisdictions.  We think this allegation is also misinformed for two reasons: [1] assuming LA County builds their system on open source, there is a question as to what specifically would/could be offered for sale; and [2] notwithstanding offering open source for sale (which technically can be done… technically) it seems to us that if such a system is built with public dollars, then it is in fact, publicly owned.  From what we understand, a government agency cannot offer for sale their assets developed with public dollars, but they can give it away.  And indeed, this is what we’ve witnessed over the years in other jurisdictions.

Onward.

The Gift Has Arrived: Our Exempt Status After 6 Years

Today is a bit of a historical point for us: we can publicly announce the news of the IRS finally granting our tax exempt status.

The digital age is wreaking havoc, however, on the PR and news processes.  In fact, we knew about this nearly 2 weeks ago, but due to a number of legal and procedural issues and a story we were being interviewed for, we were on hold in making this important announcement.  And we’re still struggling to get this out on the wires (mostly due to a change of our PR Agency at the most inopportune moment).

The result: WIRED Magazine actually got the jump on us (oh, the power of digital publishing), and now hours after their article posting, we’re finally getting our own press release to the world.

I have to observe that, notwithstanding a paper-chase of near epic proportions with the IRS in granting us what we know our charter deserves in order to foster the good work we intend, at the end of the day, 501(c)(3) status is a gift from the government.  And we cannot lose sight of that.

So, for the ultimate outcome we are deeply grateful, please make no mistake about that.  The ways and means of getting there were exhausting… emotionally, financially, and intellectually.  And I notice that the WIRED article makes a showcase of a remark I made in one of the many interviews and exchanges leading up to that story about being “angry.”

I am (or was) angry at the process, because 6 years of asking and re-asking us many of the same questions, and performing what I humbly believe at some point amounted to intellectual navel gazing, was crazy.  I can’t help but feel like we were being bled.  I fear there are many other valuable public benefit efforts, which involve intangible assets, striving for the ability to raise public funds to do public good, who are caught up in this same struggle.

What’s sad is that it took the guidance and expertise (and lots of money that could have been spent on delivering on our mission) of high-powered Washington D.C. lawyers to negotiate this to a successful conclusion.  That’s sad because the vast majority of projects cannot afford to do that.  Had we not been so resolute in our determination, and willing to risk our own financial stability to see this through, the TrustTheVote Project would have withered and died in prosecution of our tax exempt status over 6 years and 4 months.

Specifically, it took the expertise and experience of Caplin Drysdale lawyers Michael Durham and Marc Owen himself (who actually ran the IRS Tax Exempt Division for 10 years).  If you can find a way to afford them, you can do no better.

There is so much that could be shared about what it took and what we learned from issues of technology licensing, to nuances of what constitutes public benefit in terms of IRS regulations — not just what seems obvious.  Perhaps we’ll do so another time.  I note for instance that attorney Michael Durham was a computer science major and software engineer before becoming a tax lawyer.  I too have a very similar combination background of computer science and intellectual property law, and it turned out to be hugely helpful to have this interdisciplinary view — just odd that such would be critical to a tax exempt determination case.

However, in summary, I was taught at a very young age and through several life lessons that only patience and perseverance empower prevailing.  I guess it’s just the way I, and all of us on this project, are wired.

Cheers
GAM | out

If it Walks Like a Duck, and Quacks Like a Duck…

So in the midst of participating in an amazing conference at MIT’s Media Lab, produced in conjunction with the Knight Foundation (thanks Chris Barr, you rock!), today we learned of additional conduct by the IRS with regard to their Exempt Division’s handling of 1023 filings (for 501(c)(3) determination).  In particular, this revelation appeared in an NY Times article today:

But groups with no political inclinations were also examined.  “Open source software” organizations seeking nonprofit status “are usually for-profit business or for-profit support technicians of the software,” a lookout list warns.  “If you see a case, elevate it to your manager.”

Please let us go on record once again, here and now:

  1. The OSDV Foundation’s 1023 application (filed in February 2007) states, with sworn affidavits under penalties of perjury, that our charter’s Article II(A) defines the OSDV Foundation as an organization organized as a nonprofit public benefit corporation, not for the private gain of any person, and that it is organized under California Nonprofit Public Benefit Corporation Law for public and charitable purposes, with by-laws consistent with that charter.
  2. We are not a “for-profit business”; rather, we are a non-profit project, conducting our activities as such, and we never intend to be a commercial operation or to convert into a commercial entity.
  3. We are genuinely seeking, through (continued) philanthropic support of larger grantor organizations, as well as through the generous support of individual citizens, to provide education, research, and reference development of blueprints for trustworthy elections technology, on an open source basis for the benefit of elections jurisdictions nationwide because the current commercial market place is woefully falling short of that capability.
  4. We are willing to ensure our reference designs and implementations, some reduced to software source code, remain up-to-date with as-then-published Federal and/or State-specific criteria or specifications, but we will not do so as a commercial venture.
  5. We reiterate here that we are not “for-profit support technicians” for any software that results from the efforts of this California public benefit non-profit corporation.

As you might imagine, these statements have been backed by over 6 years of considerable supporting documentation, sworn to be accurate and correct to the best of our knowledge, under penalties of perjury and backed by sworn, signed affidavits.

And to be sure, we take our sworn signed statements and veracity very seriously.

We remain hopeful that the IRS will ultimately acknowledge these points and grant us our exempt determination.

TrustTheVote Project Earns Backing from Knight Foundation Prototype Fund

Greetings All-

Apologies for the extended radio silence.  I promise to one day be able to explain in some detail why that occasionally occurs, but for now I have to remain, um, silent on that point.  However, I am very happy to share with you that one of the additional reasons for being distracted from this forum has been work that resulted in today’s announcement.

Indeed, the OSDV Foundation’s TrustTheVote Project has earned a substantial grant from the Knight Foundation’s Prototype Fund.  That was a favorable consequence of being a near-bridesmaid in the Knight Foundation News Challenge, in which we competed earlier this Spring.  While we did not make the final cut for the major grant program, the Knight Foundation was sufficiently excited about our proposal for open data standards based election night reporting services that they awarded us a Prototype Grant.

You can learn more about our project here.  In a sentence, let me state the metes and bounds of this project.  We will share a little about what, how, why, and when in subsequent posts.

In a sentence our project is:
Building an open source election night results reporting service tying directly into local and State elections data feeds (for which the TrustTheVote Project has already helped establish the required standards), with a public-facing web app, and a robust API to enable anyone to access reporting data for further analysis and presentation.

Some Details
So, essentially the Knight Foundation’s Prototype Fund is designed to provide a “seed grant” to enable a prototype or “early Alpha” of an app, service, or system that advances the causes of civic media and citizen engagement with news and information.  Our Election Night Reporting Service is a perfect fit.  And this 6-month project is intended to finish the development and deployment stages for an evaluation/test run on the system.  I need to point out that it will definitely be a prototype and will not include some necessary components to put the system into production, but enough framework and scaffolding to conduct a robust “alpha test” in which 3 or 4 elections jurisdictions have agreed to participate.

We will announce those jurisdictions soon.  The test will utilize an early release of the Results Scoreboard — a web-based app/service to display elections results.  The alpha will also deliver an API and data feed service.
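While the prototype is still in the works, a toy sketch can show the shape of the thing.  Everything here (the field names, the county and candidate values, the query style) is invented for illustration; the real service will follow the open data standards our project helped establish.  The idea is simply that one data feed powers both a scoreboard display and an API that hands out any slice of the raw data:

```python
# Hypothetical sketch of an election results feed plus the two views on it:
# a raw-data slice (for the API) and an aggregated tally (for a scoreboard).
# All names and numbers are made up for illustration.

RESULTS_FEED = [
    {"county": "Travis", "precinct": "P-1", "contest": "Governor",
     "candidate": "Smith", "votes": 410, "reported_at": "20:15"},
    {"county": "Travis", "precinct": "P-2", "contest": "Governor",
     "candidate": "Smith", "votes": 388, "reported_at": "20:40"},
    {"county": "Bexar", "precinct": "P-9", "contest": "Governor",
     "candidate": "Smith", "votes": 512, "reported_at": "20:25"},
]

def query(feed, **filters):
    """Return the slice of raw reporting data matching the given filters."""
    return [row for row in feed
            if all(row.get(k) == v for k, v in filters.items())]

def totals(feed, contest):
    """Aggregate votes per candidate for one contest (scoreboard view)."""
    out = {}
    for row in query(feed, contest=contest):
        out[row["candidate"]] = out.get(row["candidate"], 0) + row["votes"]
    return out

print(len(query(RESULTS_FEED, county="Travis")))  # 2
print(totals(RESULTS_FEED, "Governor"))  # {'Smith': 1310}
```

The scoreboard is just one consumer of the feed; anyone hitting the API gets the same raw rows and can build their own view.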

In our next post, we will discuss some details about the project in terms of the what, how, and why.  But let me say quickly that the name has some legacy meaning, because it’s not just about election night — it’s about election reporting any time.  So, stay tuned!

I’d like to thank the OSDVF Board and TTV Project Advisers for their tremendous support in working closely with us on the Knight News Challenge application, and the Core team and our CTO John Sebes for hammering out sufficient details originating in our work with Travis County, TX a couple of years ago.  Without their contributions — many in the 11th hour and into the pre-dawn hours last March — this would not have been possible.

Onward!

Crowd Sourcing Polling Place Wait Times – Part 2

Last time, we wrote about the idea of a voter information service where people could crowd source the data about polling place wait times, so that other voters would benefit by not going when the lines are getting long, and so that news media and others could get a broad view of how well or poorly a county or state was doing in terms of voting time.

And as we observed, it would be a fine idea, but the results from that crowd-sourced reporting would be way better if the reporting were not on the “honor system.”  Without going on a security and privacy rampage, it would be better if this idea were implemented using some existing model for people to do mobile computing voter-stuff, in a way that is not trivial to abuse, unlike the honor system.

Now, back to the good news we mentioned previously: there is an existing model we could use to limit the opportunity for abuse.  You see, many U.S. voters, in a growing number of States, already have the ability to sit in a café and use their smart phone and a web site to identify themselves sufficiently to see their full voter record, and in some cases even update part of that voter record.

So, the idea is: why not extend that with a little extra record keeping of when a voter reports that they have arrived at the polls, and when they said they were done? In fact, it need not even be an extension of existing online voter services, and could be done in places that are currently without online voter services altogether.  It could even be the first online voter service in those places.

The key here is that voters “sufficiently identify themselves” through some existing process, and that identification has to be based on an existing voter record.  In complex online voter services (like paperless online voter registration), that involves a 3-way real-time connection between the voter’s digital device, the front-end web server that it talks to, and a privileged and controlled connection from the front-end to obtain specific voter data in the back-end.  But in a service like this, it could be even simpler, with a system that’s based on a copy of the voter record data, indeed, just that part that the voter needs to use to “identify themselves sufficiently”.
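Here is one way that simpler scheme could look, sketched in Python.  To be clear, the matching fields (name, birth date, ZIP) and the salted-digest approach are our assumptions for illustration only; a real deployment would follow each State’s election law and identity practices.  The appeal of this shape is that the standalone wait-time service never holds the full voter file, just digests of the identifying part:

```python
import hashlib

# Hypothetical sketch of "sufficient identification" against a copy of
# voter record data.  The chosen fields and the hashing scheme are
# illustrative assumptions, not a prescribed design.  Storing only salted
# digests means this standalone service never needs the full voter file.

SALT = b"per-deployment-random-salt"  # assumed secret, unique per deployment

def digest(name, dob, zip_code):
    """One-way digest of the identifying slice of a voter record."""
    material = f"{name.lower()}|{dob}|{zip_code}".encode()
    return hashlib.sha256(SALT + material).hexdigest()

# The extract taken from the official voter database holds only digests.
VOTER_DIGESTS = {digest("Pat Jones", "1980-02-14", "95814")}

def sufficiently_identified(name, dob, zip_code):
    """True if the claimed identity matches a known voter record digest."""
    return digest(name, dob, zip_code) in VOTER_DIGESTS

print(sufficiently_identified("Pat Jones", "1980-02-14", "95814"))  # True
print(sufficiently_identified("Imposter", "1999-01-01", "00000"))   # False
```

A check like this is what keeps the reporting off the honor system without requiring the 3-way real-time connection of the heavier online voter services.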

Well, let’s not get ahead of ourselves.  The fact is, State and/or local elections officials generally manage the voter database.  And our Stakeholders inform us it’s still likely these jurisdictions would want to operate this service in order to retain control of the data, and to control the ways and means of “sufficient identity” to be consistent with election laws, current usage practices, and other factors.  On the other hand, a polling place traffic monitor service can be a completely standalone system – a better solution we think, and more likely to be tried by everyone.

OK, that’s enough for the reasonably controlled and accurate crowd-source reporting of wait times. What about the benefits from it – the visibility on wait times?  As is almost always the case in transparent, open government computing these days, there are two parallel answers.

The first answer is that the same system that voters report into, could also provide the aggregated information to the public.  For example, using a web app, one could type in their street address (or get some help in selecting it, akin to our Digital Poll Book), and see the wait time info for a given precinct.  They could also view a list of the top-5 shortest current wait times and bottom-5 longest wait times of the precincts in their county, and see where their precinct sits in that ranking.  They could also study graphs of moving averages of wait times – well, you can ideate for yourself.  It’s really a question of what kind of information regular voters would actually value, and that local election officials would want to show.
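That first answer is simple enough to sketch.  The precinct names and wait times below are made up, but the ranking and “where does my precinct sit” logic is exactly the kind of thing the public-facing web app would compute:

```python
# Illustrative only: given current average wait estimates per precinct
# (made-up numbers), produce the county ranking views described above.

current_waits = {  # precinct -> current estimated wait (minutes)
    "P-1": 12, "P-2": 45, "P-3": 5, "P-4": 90, "P-5": 22,
    "P-6": 8, "P-7": 60, "P-8": 15, "P-9": 3, "P-10": 30,
}

def ranking(waits):
    """Precincts ordered shortest current wait first."""
    return sorted(waits, key=waits.get)

def summary(waits, my_precinct):
    """Top-5 / bottom-5 wait times, plus where my precinct sits."""
    order = ranking(waits)
    return {
        "shortest_5": order[:5],
        "longest_5": order[-5:],
        "my_rank": order.index(my_precinct) + 1,
        "my_wait": waits[my_precinct],
    }

print(summary(current_waits, "P-5"))
```

Moving averages and trend lines would hang off the same per-precinct data; as the post says, the real question is which views voters would actually value.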

The second answer is that this system must provide a web services API so that “other systems” can query this wait-time-reporting service.  These other systems should be able to get any slice of the raw data, or the whole thing, up to the minute.  Then they could do whatever visualization, reporting, or other services the clever people operating those other systems think up.

For me, I’d like an app on my phone that pings like my calendar reminders, that I set to ping myself after 9am (no voting without adequate caffeine!) but before 3pm (high school lets out and street traffic becomes a sh*t show ;-)); but oh, when the waiting time is <10 minutes.  I’d also like something that tells me if/when turn-out in my precinct (or my county, or some geographic slice) tips over 50% of non-absentee voters.  And you can imagine others.  But the main point is that we do not expect our State or local election officials to deliver that to us.  We do hope that they can deliver the basics, including that API so that others can do cool stuff with the data.
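That phone app is exactly the sort of thing a third party could build on the API.  Here is a purely illustrative version of my reminder rule; fetching the wait time from the (hypothetical) API is left out, so the function just takes the reported wait as an argument:

```python
from datetime import time

# Illustrative client-side rule for the reminder described above: ping me
# after 9am but before 3pm, and only once the reported wait at my precinct
# drops below 10 minutes.  The API call that fetches wait_minutes is
# hypothetical and omitted; we pass the value in directly.

def should_ping(now, wait_minutes,
                earliest=time(9, 0), latest=time(15, 0), max_wait=10):
    """True when it's a good moment to remind me to go vote."""
    in_window = earliest <= now < latest
    return in_window and wait_minutes < max_wait

print(should_ping(time(10, 30), 7))   # True: in window, short line
print(should_ping(time(10, 30), 25))  # False: line too long right now
print(should_ping(time(16, 0), 5))    # False: school's out, traffic's bad
```

The turnout-threshold alert would work the same way, just watching a different field from the API.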

Actually, it’s an important division of labor.

Government organizations have the data and need to “get the data out” both in raw form via an API, and in some form useful for individual voters’ practical needs on Election Day.  Then other organizations or individuals can use that API with their different vision and innovation to put that data to a range of additional good uses.  That’s our view.

So, in our situation at the TrustTheVote Project, it’s actually really possible.  We already have the pieces: [1] the whole technology framework for online voter services, based on existing legacy databases; [2] the web and mobile computing technology framework with web services and APIs; [3] existing voter services that are worked examples of how to use these frameworks; and [4] some leading election officials who are already committed to using all these pieces, in real life, to help real voters.  This “voting wait-time tracker” system we call PollMon is actually one of the simplest examples of this type of computing.

We’re ready to build one.  And we told the Knight News Challenge so.  We say, let’s do this.  Wanna help?  Let us know.  We’ve already had some rockin good ideas and some important suggestions.

GAM | out

Crowd Sourcing Polling Place Wait Times

Long lines at the polling place are becoming a thorn in our democracy.

We realized a few months ago that our elections technology framework data layer could provide information that when combined with community-based information gathering might lessen the discomfort of that thorn.  Actually, that realization happened while hearing friends extol the virtues of Waze.  Simply enough, the idea was crowd-sourcing wait information to at least gain some insight on how busy a polling place might be at the time one wants to go cast their ballot.

Well, to be sure, lots of people are noodling around lots of good ideas and there is certainly no shortage of discussion on the topic of polling place performance.  And, we’re all aware that the President has taken issue with it and after a couple of mentions in speeches, created the Bauer-Ginsberg Commission.  So, it seems reasonable to assume this idea of engaging some self-reporting isn’t entirely novel.

After all, it’s kewl to imagine being able to tell – in real time – what the current wait time at the polling place is, so a voter can avoid the crowds, or a news organization can track the hot spots of long lines.  We do some “ideating” below, but first I offer three observations from our noodling:

  • It really is a good idea; but
  • There’s a large lemon in it; yet
  • We have the recipe for some decent lemonade.

Here’s the Ideation Part

Wouldn’t it be great if everybody could use an app on their smarty phone to say, “Hi All, it’s me, I just arrived at my polling place, the line looks a bit long,” and then later, “Me again, OK, just finished voting, and geesh, like 90 minutes from start to finish… not so good,” or “Me again, I’m bailing.  Need to get to the airport.”

And wouldn’t it be great if all that input from every voter was gathered in the cloud somehow, so I could look-up my polling place, see the wait time, the trend line of wait times, the percentage of my precinct’s non-absentee voters who already voted, and other helpful stuff?  And wouldn’t it be interesting if the news media could show a real time view across a whole county or State?

Well, if you’re reading this, I bet you agree, “Yes, yes it would.”  Sure.  Except for one thing: to be really useful it would have to be accurate.  And if there is a question about accuracy (ah shoot, ya know where this is going, don-cha?)… yes, there is always that Grinch called “abuse.”

Sigh. We know from recent big elections that apparently, partisan organizations are sometimes willing to spend lots of money on billboard ads, spam campaigns, robo-calls, and so on, to actually try to discourage people from going to the polls, within targeted locales and/or demographics. So, we could expect this great idea, in some cases, to fall afoul of similar abuse.  And that’s the fat lemon.

But, please read on.

Now, we can imagine some frequent readers spinning up to accuse us of wanting everything to be perfectly secure, of letting the best be the enemy of the good, and noting that nothing will ever be accomplished if first every objection must be overcome. On other days, they might be right, but not so much today.

We don’t believe this polling place traffic monitoring service idea requires the invention of some new security, integrity, or privacy mechanism.  On the other hand, relying on the honor system is probably not right either.  Instead, we think that in real life something like this would have a much better chance of launch and sustained benefit if it were based on some existing model of voters doing mobile computing in a responsible way, one that, unlike the honor system, isn’t trivial to abuse.

And that leads us to the good news – you see, we have such an existing model, in real life.  That’s the new ingredient that, along with the lemon above and a little innovative sugar, makes the lemonade I mentioned.

Stay tuned for Part 2, and while waiting you might glance at this.

For (Digital) Poll Books — Custody Matters!

Today, I am presenting at the annual Elections Verification Conference in Atlanta, GA, and my panel is discussing the good, the bad, and the ugly about the digital poll book (often referred to as the “e-pollbook”).  For our casual readers, the digital poll book or “DPB” is—as you might assume—a digital relative of the paper poll book… that pile of print-outs containing the names of the voters registered to vote in a given precinct.

For our domain-savvy readers, today’s discussion concerns the application (sometimes the overloaded application) of DPBs and the related issues of reliability, security, and verifiability.  So as I head into this, I wanted to echo some thoughts here about DPBs as we are addressing them at the TrustTheVote Project.

We’ve been hearing much lately about State and local election officials’ appetite (or infatuation) for digital poll books.  We’ve been discussing various models and requirements (or objectives), while developing the core of the TrustTheVote Digital Poll Book.  But in several of these discussions, we’ve noticed that only two out of three basic purposes of poll books of any type (paper or digital, online or offline) seem to be well understood.  And we think the gap shows why physical custody is so important—especially so for digital poll books.

The first two obvious purposes of a poll book are [1] to check in a voter as a prerequisite to obtaining a ballot, and [2] to prevent a voter from having a second go at checking in and obtaining a ballot.  That’s fine for meeting the “Eligibility” and “Non-duplication” requirements for in-person voting.

But then there is the increasingly popular absentee voting, where the role of poll books seems less well understood.  In our humble opinion, those in-person polling-place poll books are also critical for absentee and provisional voting.  Bear in mind, those “delayed-cast” ballots can’t be evaluated until after the post-election poll-book-intake process is complete.

To explain why, let’s consider one fairly typical approach to absentee evaluation.  The poll book intake process results in an update to the voter record of every voter who voted in person.  Then, the voter record system is used as one part of absentee and provisional ballot processing.  Before each ballot may be separated from its affidavit, the reviewer must check the voter identity on the affidavit, and then find the corresponding voter record.  If the voter record indicates that the voter cast their ballot in person, then the absentee or provisional ballot must not be counted.
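The evaluation logic just described can be sketched in a few lines of Python.  The data shapes here (a voter-ID-to-flag map standing in for the voter record system, string dispositions) are hypothetical, purely to illustrate the check:

```python
# Hypothetical voter-record store: voter ID -> True if the poll book intake
# recorded an in-person ballot for that voter, False otherwise.
voted_in_person = {"V1001": True, "V1002": False, "V1003": False}

def evaluate_absentee(affidavit_voter_id: str) -> str:
    """Decide an absentee/provisional ballot's fate after poll book intake.

    The ballot stays attached to its affidavit until this check passes.
    """
    if affidavit_voter_id not in voted_in_person:
        return "reject: no matching voter record"
    if voted_in_person[affidavit_voter_id]:
        return "reject: voter already cast a ballot in person"
    return "accept: separate ballot from affidavit and count it"

print(evaluate_absentee("V1001"))  # voted in person, so the absentee is rejected
print(evaluate_absentee("V1002"))  # no in-person record, so the ballot counts
```

The sketch makes the dependency obvious: the decision is only as good as the intake data it consults, which is exactly why the poll book’s chain of custody matters.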

So far, that’s a story about poll books that should be fairly well understood, but there is an interesting twist when it comes to digital poll books (DPBs).

The general principle for DPB operation is that it should follow the process used with paper poll books (though other useful features may be added).  With paper poll books, both the medium (paper) and the message (who voted) are inseparable, and remain in the custody of election staff (LEOs and volunteers) throughout the entire life cycle of the poll book.

With the DPB, however, things are trickier.  The medium (e.g., a tablet computer) and the message (the data the tablet manages, representing who voted) can be separated, although they should not be.

Why not? Well, we can hope that the medium remains in the appropriate physical custody, just as paper poll books do. But if the message (the data) leaves the tablet, and/or becomes accessible to others, then we have potential problems with accuracy of the message.  It’s essential that the DPB data remain under the control of election staff, and that the data gathered during the DPB intake process is exactly the data that election staff recorded in the polling place.  Otherwise, double voting may be possible, or some valid absentee or provisional ballots may be erroneously rejected.  Similarly, the poll book data used in the polling place must be exactly as previously prepared, or legitimate voters might be barred.
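One way to make “exactly the data that election staff recorded” checkable at intake is a keyed integrity tag over the poll book data.  This is a sketch of the general technique (an HMAC over a canonical serialization), not a description of the TrustTheVote design; the key handling and data shape are assumptions for illustration:

```python
import hashlib
import hmac
import json

# Hypothetical: the election office holds KEY offline.  Each DPB's end-of-day
# data is sealed with an HMAC so the intake process can detect any change made
# while the data was outside election staff custody.
KEY = b"per-election secret held by the election office"

def seal(pollbook_data: dict) -> bytes:
    """Compute a tag over a canonical serialization of the poll book data."""
    blob = json.dumps(pollbook_data, sort_keys=True).encode()
    return hmac.new(KEY, blob, hashlib.sha256).digest()

def verify(pollbook_data: dict, tag: bytes) -> bool:
    """True only if the data is byte-for-byte what election staff sealed."""
    return hmac.compare_digest(seal(pollbook_data), tag)

data = {"precinct": "P-12", "checked_in": ["V1001", "V1003"]}
tag = seal(data)
assert verify(data, tag)              # untouched data passes intake
data["checked_in"].append("V9999")    # tampering in transit...
assert not verify(data, tag)          # ...is detected at intake
```

A tag like this detects tampering but does not prevent disclosure or substitution of an entire sealed data set, which is why it complements, rather than replaces, physical custody by election staff.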

That’s why digital poll books must be carefully designed for use by election staff in a way that doesn’t endanger the integrity of the data.  And this is an example of the devil in the details that’s so common for innovative election technology.

Those devilish details derail some nifty ideas, like one we heard of recently: a simple and inexpensive iPad app that provides the digital poll book UI based on poll book data downloaded (via 4G wireless network) from “cloud storage” where an election official previously put it in a simple CSV file; and where the end-of-day poll book data was put back into the cloud storage for later download by election officials.

Marvelous simplicity, right?  Oh heck, I’m sure some grant-funded project could build that right away.  But it turns out that approach is wholly unacceptable in terms of chain of custody of the data that accurate vote counts depend on.  You wouldn’t put the actual vote data in the cloud that way, and poll book data is no less critical to election integrity.

A Side Note:  This is also an example of the challenge we often face from well-intentioned innovators of the digital democracy movement who insist that we’re making a mountain out of a molehill in our efforts.  They argue that this stuff is way easier than we make it, and ripe for all of the “kewl” digital innovations at our fingertips today.  Sure, there are plenty of very well designed innovations and combinations of ubiquitous technology that have driven the social web and now the emerging utility web.  And we’re leveraging and designing around the elements that make sense here—for instance, the powerful new touch interfaces driving today’s mobile digital devices.  But there is far more to it than a sexy interface with a 4G connection.  Oops, I digress to a tangential gripe.

This nifty example of well-intentioned innovation illustrates why the majority of technology work in a digital poll book solution is actually in [1] the data integration (to and from the voter record system); [2] the data management (to and from each individual digital poll book); and [3] the data integrity (maintaining the same control present in paper poll books).

Without a doubt, the voter’s user experience, as well as that of the poll worker or election official, is very important—and we’re gathering plenty of requirements and feedback based on our current work.  But before the TTV Digital Poll Book is fully baked, we need to do equal justice to those devilish details, in ways that meet the varying requirements of various States and localities.

Thoughts? Your ball (er, ballot?)
GAM | out

The 2013 Annual Elections Verification Conference Opens Tonight

If it’s Wednesday 13.March, it must be Atlanta.  And that means the opening evening reception for the Elections Verification Network’s 2013 Annual Conference.  We’re high on this gathering of elections officials, experts, academics, and advocates because it represents a unique interdisciplinary collaboration of technologists, policy wonks, legal experts, and even politicians, all with a common goal: trustworthy elections.

The OSDV Foundation is proud to be a major sponsor of this event.  We support it because forums like this are precisely where discussions about innovation in HOW America votes take place, and they represent a rich opportunity for collaboration, debate, education, and sharing.  We always learn much, and we share our own research and development efforts as directed by our stakeholders — those State and local elections officials who are the beneficiaries of our charitable work to bring increased accuracy, transparency, verification, and security (i.e., the 4 pillars of trustworthiness) to elections technology reform through education, research, and development for elections technology innovation.

Below are my opening remarks, to be delivered this evening or tomorrow morning at the pleasure of the Planning Committee, depending on how they slot the major sponsors’ opportunities to address the attendees.  There are 3 points we want to get across in these opening remarks: [1] why we support the EVN; [2] why there is a growing energy around increased election verification efforts; and [3] how the EVN can drive that movement forward…

Greetings Attendees!

On behalf of the EVN Planning Committee and the Open Source Digital Voting Foundation I want to welcome everyone to the 2013 Elections Verification Network Annual Conference.  As a major conference supporter, the Planning Committee asked if I, on behalf of the OSDV Foundation, would take 3 minutes to share 3 things with you:

  • 1st, why the Foundation decided to help underwrite this Conference;
  • 2nd, why we believe there is a growing energy and excitement around election verification; and
  • 3rd, how the EVN can bring significant value to this growing movement

So, we decided to make a major commitment to underwriting and participating in this conference for two reasons:

  1. We want to strengthen the work of this diverse group of stakeholders and do all that we can to fortify this gathering to make it the premier event of its kind; and
  2. The work of the EVN is vital to our own mission because there are 4 pillars to trustworthy elections: Accuracy, Transparency, Verification, and Security, and the goals and objectives of these four elements require enormous input from all stakeholders.  The time to raise awareness, increase visibility, and catalyze participation is now, more than ever.  Which leads to my point about the movement.

We believe the new energy and excitement being felt around election verification is due primarily to 4 developments, which, when viewed in the aggregate, illustrate an emerging movement.  Let’s consider them quickly:

  1. First, we’re witnessing an increasing number of elections officials considering “forklift upgrades” of their elections systems, which are driving public-government partnerships to explore and ideate on real innovation – the Travis County STAR-Vote Project and LA County’s VSAP come to mind as two showcase examples, which are, in turn, catalyzing downstream activities in smaller jurisdictions;
  2. The FOCE conference in CA, backed by the James Irvine Foundation was a public coming out of sorts to convene technologists, policy experts, and advocates in a collaborative fashion;
  3. The recent NIST Conferences have also raised the profile as a convener of all stakeholders in an interdisciplinary fashion; and finally,
  4. The President’s recent SOTU speech and the resulting Bauer-Ginsberg Commission arguably will provide the highest level of visibility to date on the topic of improving access to voting.  And this plays into EVN’s goals and objectives for elections verification.  You see, while on its face the visible driver is fair access to the ballot, the underlying aspect soon to become visible is the reliability, security, and verifiability of the processes that make fair access possible.  And that leads to my final point this morning:

The EVN can bring significant value to this increased energy, excitement, and resulting movement if we can catalyze a cross-pollination of ideas and rapidly increase awareness across the country.  In fact, we spend lots of time talking amongst ourselves.  It’s time to spread the word.  This is critical because while elections are highly decentralized, there are common principles that must be woven into the fabric of every process in every jurisdiction.  That said, we think spreading the word requires 3 objectives:

  1. Maintaining intellectual honesty when discussing the complicated cocktail of technology, policy, and politics;
  2. Sustaining a balanced approach of guarded optimism with an embracing of the potential for innovation; and
  3. Encouraging a breadth of problem awareness, possible solutions, and pragmatism in their application, because one size will never fit all.

So, welcome again, and let’s make the 2013 EVN Conference a change agent for raising awareness, increasing knowledge, and catalyzing a nationwide movement to adopt the agenda of elections verification.

Thanks again, and best wishes for a productive couple of days.

Election Tech “R” Us – and Interesting Related IP News

Good Evening–

On this election night, I can’t resist pointing out the irony of the USPTO’s news of the day for Election Day earlier: “Patenting Your Vote,” a nice little article about patents on voting technology.  It’s also a nice complement to our recent posting on the other form of intellectual property protection on election technology — trade secrets.  In fact, there is some interesting news of the day about how intellectual property protections won’t (as some feared) inhibit the use of election technology in Florida.

For recent readers, let’s be clear again about what election technology is, and our mission. Election technology is any form of computing — “software ‘n’ stuff” — used by election officials to carry out their administrative duties (like voter registration databases), or by voters to cast a ballot (like an opscan machine for recording votes off of a paper ballot), or by election officials to prepare for an election (like defining ballots), or to conduct an election (like scanning absentee ballots), or to inform the public (like election results reporting). That covers a lot of ground for “election technology.”

With that definition, it’s reasonable to say that “Election Technology ‘R’ Us” is what the TrustTheVote Project is about, and why the OSDV Foundation exists to support it.  And about intellectual property protection?  I think we’re clear on the pros and cons:

  • CON: trade secrets and the software licenses that protect them.  These create “black box” for-profit election technology that seems to decrease rather than increase public confidence.
  • PRO: open source software licenses. These enable government organizations to [A] adopt election technology with a well-defined legal framework, without which the adoption cannot happen; and [B] enjoy the fruits of the perpetual harvest made possible by virtue of open source efforts.
  • PRO: patent applications on election technology.  As in today’s news, the USPTO can make clear which aspects of voting technology can or can’t be protected with patents that could inhibit election officials from using the technology, or require them to pay licensing fees.
  • ZERO SUM: granted patents on techniques or business processes (used in election administration or the conduct of elections) in favor of for-profit companies.  Downside: can increase costs of election technology adoption by governments. Upside: if the companies do have something innovative, they are entitled to I.P. protection, and it may motivate investment in innovation.  Downside: we haven’t actually seen much innovation by voting system product vendors, or contract software development organizations used by election administration organizations.
  • PRO: granted patents to non-profit organizations.  To the extent that there are innovations that non-profits come up with, patents can be used to protect the innovations so that for-profits can’t nab the I.P., and charge license fees back to governments running open source software that embodies the innovations.

All that stated, the practical upshot as of today seems to be this: there isn’t much innovation in election technology, and that may be why for-profits try to use trade secret protection rather than patents.

That underscores our practical view at the TrustTheVote Project: a lot of election technology isn’t actually hard, but rather simply detailed and burdensome to get right — a burden beyond the scope of all but a few do-it-ourself elections offices’ I.T. groups.

That’s why our “Election Technology ‘R’ Us” role is to understand what the real election officials actually need, and then to (please pardon me) “Git ‘er done.”

What we’re “getting done” is the derivation of blueprints and reference implementations of an elections technology framework that can be adopted, adapted, and deployed by any jurisdiction, with common open data formats, processes, and verification and accountability loops designed in from the get-go.  This derivation is based on the collective input of elections experts nationwide, from every jurisdiction and every political point of view.  And the real beauty: whereas no single jurisdiction could possibly ever afford (in terms of resources, time, or money) to achieve this on their own, by virtue of the collective effort they can, because everyone benefits — not just from the initial outcomes, but from the ongoing improvements and innovations contributed by all.

We believe (and so do the many who support this effort) that the public benefit is obvious and enormous: from every citizen who deserves their ballot counted as cast, to every local election official who must have an elections management service layer with complete fault tolerance in a transparent, secure, and verifiable manner.

From what we’ve been told, this certainly lifts a load of responsibility off the shoulders of elections officials and allows it to be more comfortably distributed.  But what’s more, regardless of how our efforts may lighten their burden, the enlightenment that comes from this clearinghouse effect is of enormous benefit to everyone by itself.

So, at the end of the day, what we all benefit from is a way forward for publicly owned critical democracy infrastructure.  That is, that “thing” in our process of democracy that causes long lines and insecurities, which the President noted we need to fix during his victory speech tonight.  Sure, it’s about a lot of process.  But where there will inevitably be technology involved, well, that would be the TrustTheVote Project.

GAM|out