
Old School, New Tech: What’s Really Behind Today’s Elections

Many thanks to Bloomberg’s Michaela Ross for her coverage of election tech and cybersecurity.

Given how much is at stake for this election, with its credibility rocked by claims of rigging, and how much more is at stake as we move ahead to replace and improve our election infrastructure, I’m rarely enthused to read yet again that some people think Internet voting is great while others think it is impossible.  However, Ms. Ross did a great job of following that discussion about how “Old School May Be Better,” with supporting remarks from many longtime friends and colleagues in the election administration and technology worlds.

Where I’d like to respond is in re-framing the “old” part of “old school,” and in rejecting one remark from a source that Ross quoted: “They’re pretending what we do today is secure … There’s not a mission critical process in the world that uses 150-year-old technology.” Three main points here:

  1. There is plenty of new technology in the so-called old school;
  2. No credible election expert pretends that our ballots are 100% secure, not even close; and
  3. That’s why we have several new and old protections on the election process, including some of that new technology.

Let me address that next in three parts, mostly about what’s old and what’s new; then circle back to the truth about security; and lastly offer a comment on iVoting that I’ll mostly defer to a later re-up on the iVoting scene.

Old and New

Here is what’s old: paper ballots. We use them because we recognize the terrible omission shared by the late-19th-century mechanical lever machines (hackable with toothpicks, tamper-able with screwdrivers, and retaining no record of any voter’s intent other than numbers on odometer dials) and many of today’s paperless touchscreens: hackable and tamper-able even more readily, and likewise with no actual ballot other than bits on a disk. We use paper ballots (or paper-added touchscreens as a stop-gap) because no machine can be trusted to accurately record every voter’s intent. We need paper ballots not just for disputes and recounts, but fundamentally as a way to cross-check the work of the machines.

Here’s what’s new: recently developed, scientifically grounded statistical methods to conduct a routine ballot audit for every election, cross-checking the machines’ work with far less effort and cost than today’s “5% manual count and compare” and the variant methods used in some states. It’s never been easier to use machines for rapid counts and quick unofficial results, and then (before final results) to detect and correct instances of machine inaccuracy, whether from bugs, tampering, physical failure, or other issues. It’s called a risk-limiting audit, or RLA.
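For a back-of-the-envelope feel for why an RLA beats a flat-percentage audit, here is a minimal sketch. The zero-discrepancy sample-size bound below is a simplified form of the kind used in ballot-comparison audits; the function name and the numbers are illustrative only, not any state’s actual procedure.

```python
import math

def comparison_audit_sample_size(diluted_margin: float, risk_limit: float) -> int:
    """Approximate number of ballots to hand-check in a ballot-comparison
    audit that finds no discrepancies: each clean ballot shrinks the
    plausibility of a wrong reported outcome by (1 - margin/2), and we
    stop once it falls below the chosen risk limit."""
    return math.ceil(math.log(risk_limit) / math.log(1 - diluted_margin / 2))

# A comfortable 10% margin at a 5% risk limit needs only a few dozen
# ballots, while a razor-thin 0.5% margin needs over a thousand.
print(comparison_audit_sample_size(0.10, 0.05))
print(comparison_audit_sample_size(0.005, 0.05))
```

Contrast that with a flat “5% manual count”: in a jurisdiction with a million ballots, that is 50,000 hand-counted ballots regardless of how wide the margin is.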

Here’s what’s new-ish: the now-standard approach is for paper ballots to be rapidly machine-counted using optical scanners and digital image processing software. There are a lot of old, clunky, and expensive (to buy, maintain, and store) op-scanners still in use, but this isn’t “150 years old,” any more than our modern ballots are like the 19th-century party-machine-politics balloting, rife with fraud, that created the desire for the old lever machines in the first place. These older scanners do, however, offer little to no support for RLAs.

Here’s what’s newer: many people have mobile computers in their pocket that can run optical-capture and digital image processing. It’s no longer a complicated job to make a small, inexpensive device that can read some paper, record what’s on it, and retain records that humans can cross check. There’s no reason why the op-scan method needs to be old and clunky. And with new systems, it is easy to keep the type of records (technically, a “cast vote record” for each ballot) needed for easy support for RLA.
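To make “cast vote record” concrete, here is a minimal sketch of what a per-ballot record might look like. The field names are illustrative assumptions for this post, not the actual CVR data format.

```python
import json
from dataclasses import dataclass, field, asdict

@dataclass
class CastVoteRecord:
    """One machine interpretation of one paper ballot, retained so that
    humans can later pull the matching paper and cross-check it."""
    ballot_id: str    # links the record back to the physical ballot
    scanner_id: str   # which device produced this record
    contests: dict = field(default_factory=dict)  # contest -> recorded choice

cvr = CastVoteRecord(
    ballot_id="batch-12/ballot-0042",
    scanner_id="scanner-03",
    contests={"Governor": "Candidate A", "Measure 7": "Yes"},
)
print(json.dumps(asdict(cvr), indent=2))
```

The key property is the `ballot_id` link: given any machine record, an auditor can retrieve the matching piece of paper and verify the machine’s interpretation.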

And finally, here’s the really good part: innovation is happening to make the process easier and stronger, both here at the OSET Institute and elsewhere ranging from local to state election officials, Federal organizations like EAC and NIST, universities, and other engines of tech innovation. The future looks more like this:

  • Polling place voting machines called “ballot marking devices” that use a familiar, inexpensive tablet to collect a voter’s ballot choices, and print them onto a simple “here’s all and only what you chose” ballot to be easily and independently verified by the voter, and cast for optical scanning.
  • Devices and ballots with professionally designed and scientifically tested usability and accessibility for the full range of voters’ needs.
  • Simple, inexpensive ballot scanners for these modern ballots.
  • Digital sample ballots, using the voter’s choice of computer, tablet, or phone, that let the voter take their own time navigating the ballot and create a “selections worksheet” that can be scanned into a ballot marking device (to confirm, correct if needed, and create the ballot cast in a polling place), or be used in a vote-by-mail process, without the need to wait for an official blank ballot to arrive in the mail.
  • And below that tip of the iceberg for the critical ballot-related operations, a range of other innovations to streamline voter registration, voter check-in, absentee ballot processing, and voter services, plus apps to navigate the whole process and avoid procedural hurdles or long lines, interactive election results exploration and analytics, and more;
  • all with the ability for election officials to provide open public data on the outcome of the whole election process, and every voter’s success in participation or lack thereof.

That’s a lot of new tech that’s in the pipeline or in use already, but still in the old school.

Finally, two last points to loop back to Michaela’s article.

Election Protection in the Real World

First, everyone engaged in elections knows that no method of casting and counting ballots is perfectly secure.

  • Vote by mail ballots go to election officials by mail passing through many hands, not all of which may seem trustworthy to the voters.
  • Email ballots and other digital ballots go to election officials via the Internet — again via many “virtual hands” that are definitely not trustworthy — and to computers that election officials may not fully control.
  • Polling place ballots in ballot boxes are transported by mere mortals who can make mistakes, encounter mishaps, and as in a very few recent historical cases, may be dishonest insiders.
  • Voting machines are easily tampered with by those with physical access, including temp workers and contractors in warehouses, transportation services, and pre-election preparations.
  • The “central brain” behind the voting machines is often an ordinary antique PC with no real protection in today’s daunting threat environment.
  • The beat goes on with voter records systems, electronic poll books, and more.

That’s why today’s election officials work so hard on the people and processes to contain these risks, and retain control over these vital assets throughout a complex process that — honestly, going forward — could be a lot simpler and easier with innovations designed to reduce the level of effort and complexity of these same type of protections.

The Truth About iVoting Today

Secondly, lastly, and mostly for another time: Internet voting. It’s desirable, it will likely happen someday, and it will require a solid R&D program to invent the tech that can do the job with all the protections — whether against fraud, coercion, manipulation, or accidental or intentional disenfranchisement — that we have today in our state-managed, locally operated, and (delightfully but often frustratingly) hodgepodge process of voting in 9,000+ jurisdictions across the US.  I repeat, all, no compromises; no waving the magic fairy wands of trust-me-it-works-because-it-is-cool, or blockchains, or so-called “military grade” encryption, or whatever the latest cool geek-cred item is.

In the meantime, we have to shore up the current creaky systems and processes, especially to address the issues of “rigging” and the crazy amount of work election professionals have to do to get the job done and maintain order and trust.

And then we have to replace the current systems in the existing process with innovations that also serve to increase trust and transparency. If we don’t fix the election process that we have now, and soon, we risk the hasty adoption of iVoting systems that are just as creaky, flawed, and poorly understood as the paperless voting machines adopted more than a decade ago.

We can do better, in the short term and the long, and we will.  A large and growing set of election and technology folks, in organizations of many kinds, are dedicated to making these improvements happen, especially as this election cycle has shown us all how vitally important it is.

— John Sebes

Future of Voting Systems: Future Requirements (Part 1)

For this first of several reports from the NIST/EAC Future of Voting Systems Symposium II, some readers of my recent report on standards work may heave a sigh of relief that I’m not doing a long post that’s a laundry list of topics. However, I will be doing a series of posts on one part of the conference: a session held by some of the EAC folks who run the voting systems certification program, which relies on a “guidelines” document that is actually a complex set of standards that voting systems have to meet in order to get certified.

The reason that I am doing a series of posts is that the session was on a broad topic: if you were able to write a whole new requirements document from scratch, oriented to future voting systems and not required to support the existing certification program backwards-compatibly, then … what would you put in your hypothetical standards for each of several topics? Not surprisingly, I and my colleagues at TrustTheVote (and like-minded folks in the election world more broadly) have some pretty clear views on many areas. As promised to the folks running this session, I’ll be using this blog to document more fully the recommendations we discussed, informed (with thanks) by the views of others at this conference. But I’ll be doing it in chunks over time, because I don’t think anybody wants a tome here. 🙂

The Fork in the Road

The zeroth recommendation — that is, before getting to any of the topics requested! — is about the overall scope of a future standard. In the decade or so since the current one was developed (and even more years since the earlier versions), things have changed a lot in the election tech world, and change is accelerating. We are no longer in the stage of “wow, that hanging chad fiasco was horrible, we need to replace them fast with computerized voting machines.” We’ve learned a lot. And one of the biggest learnings is that there is a huge fork in the road, which affects nearly all the requirements that one might make for voting systems. That’s what I want to explain today, in part because it was a good chunk of the discussions at the conference.

The fork in the road is this: you either have a voting system that supports evidence-based election results, or you don’t.

In this context, evidence-based means that the voting system produces evidence of its vote tallies that can be cross-checked by humans — and this is the important part — without having to trust or rely on software in any way. That’s important, because as we know, software is not and can never be perfect or trustworthy. In practice, what this means is that for each voter, there is a paper ballot that can be counted directly by people conducting a ballot audit. The typical practice is to take a statistically significant group of ballots for which we have machine count totals — typically a whole precinct in practice today — and manually count them to see if there is any significant variance between human and machine count that could indicate the machine count (or the human audit) had some errors. The process for resolving the rare variances is a larger topic, but the point here is that the process provides assurance of correct results without relying on computers working perfectly all the time.
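The precinct-level cross-check described above amounts to a simple comparison, sketched here. The names and the idea of a small tolerance are illustrative; real audits have more nuanced rules for resolving variances.

```python
def audit_precinct(machine_counts: dict, hand_counts: dict, tolerance: int = 0) -> list:
    """Compare machine tallies to a full hand count of the same precinct's
    paper ballots; return the candidates whose counts disagree by more than
    the tolerance (a nonzero tolerance allows for marginal-mark ballots
    that humans and scanners may interpret differently)."""
    return [
        candidate
        for candidate in machine_counts
        if abs(machine_counts[candidate] - hand_counts.get(candidate, 0)) > tolerance
    ]

machine = {"Candidate A": 412, "Candidate B": 388}
hand    = {"Candidate A": 412, "Candidate B": 387}
print(audit_precinct(machine, hand, tolerance=2))  # within tolerance: nothing flagged
print(audit_precinct(machine, hand, tolerance=0))  # strict: Candidate B flagged for review
```

Note that nothing in this check relies on trusting the scanner’s software: the hand count comes straight from the paper.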

That’s not the only way to build a voting system, and it’s not the only way to run an election. And in the U.S., our state and local election officials have choices. But many of them do want paper-based processes, to complement modern use of ballot marking devices for accessibility, ballot counters, ballot on demand, ballot readers for those with impaired vision, and a host of technical innovations emerging, including such things as on-boarding processes at polling places, interactive sample ballots for home use, and more. And for those election officials, the evidence-based voting systems have some important requirements.

The Harder Path

But let’s respect the other path as well, which includes a lot of paperless DRE voting machines still in use (and also some Internet-voting schemes that several elections organizations are experimenting with). A lot of voters use these older systems. But there is a big difference in the requirements. Indeed, the bulk and complexity of the early requirements standard (and its larger 10-year-old successor) is due to trying to encompass early DRE-based systems. Because these systems placed complete reliance on computers, the current requirements include an enormous amount of attention to security, risk management, software development practices, and more, all oriented to helping vendors build systems that would, to the extent possible, avoid creating a threat of “hacked elections.”

In fact, if you read it now, it looks like a time warp opened and a doc from 2004 or so dropped through; and it reads pretty well as good advice for the time on how to use then-current software and systems to — pardon the vernacular — “create a system that is not nearly as easily hacked as most stuff being made now.” (This was in Windows XP days, recall.)

I suppose that some updated version of these requirements will be appropriate for future non-evidence-based voting systems. It will take a while to develop; it will be a bit dated by the time it is approved; and its use in voting system development, independent testing, and certification will be about as burdensome as what we’ve seen in recent years. It has to be done, though, because the risks are greater now than ever, given that the expertise of cyber-adversaries continues to expand beyond the ability of the most sophisticated tech orgs to match.

The Road Not Taken?

So my 0th recommendation is: do not apply these existing standards to evidence-based voting systems. I’d almost like to see the new standard in two volumes: one for evidence-based systems and one for the others. It would just be a crazy waste of people’s time, effort, energy, and ingenuity to apply such burdensome requirements to evidence-based systems, and ironic too: evidence-based voting systems are specifically defined to entirely avoid many risks — in fact, the exact risks that the current requirements seek to mitigate! I would almost recommend further that the EAC start getting input on how to develop a new, streamlined set of voting system requirements specifically for evidence-based systems. I say “almost” because I saw exactly that starting to glimmer at the NIST/EAC conference this week. And that was super encouraging!

So, my specific recommendations will be entirely focused on what such new requirements should be for evidence-based voting systems. For the other fork in the road, the current standards set a pretty good direction. More soon …

— EJS

“Digital Voting”—Don’t believe everything you think

In our most recent blog post we examined David Plouffe’s recent Wall Street Journal forward-looking op-ed [paywall] and rebalanced his vision with some practical reality.

Now, let’s turn to Plouffe’s notion of “digital voting.”  Honestly, that phrase is confusing and vague.  We should know: it catalyzed our name change last year from Open Source Digital Voting Foundation (OSDV) to Open Source Election Technology Foundation (OSET).

Most Americans already use a “digital” machine to cast their ballots, if you mean by “digital” a computer-like device that counts votes electronically, and not by the old pre-2000 methods of punched cards or mechanical levers. What Plouffe probably meant is what elections professionals call iVoting, which is voting via the Internet—and increasingly that implies your mobile device.

Internet voting has not been approved anywhere in the United States for general public use, although Alaska is experimenting in a limited way with members of the military voting in this manner. Norway just stopped its Internet voting experiment. The challenges of iVoting are daunting.

Just think about it: many credit-card companies and several major online merchandisers have been hacked at some point, and all commercial and government web sites face intrusion attempts by the hour. The Department of Defense is continually bombarded by efforts to break in. And sometimes hackers manage to actually get in and steal stuff. Voting is too important to be left vulnerable to hacking.

Security of online voting is not yet with us. Sure, a few vendors of online voting technologies will emphatically claim their systems have never been hacked (to their knowledge) and that they use so-called “military grade” security (whatever that actually means).  Members of our technical team have been deeply involved in cyber-security for decades. We can say with confidence that no security on the Internet is absolute, assured, or guaranteed.  So when it comes to moving cast ballots via the Internet, the security issues are real and cannot be hand-waved away.  And elections that are run, in any part, over the public Internet pose just too tempting an opportunity for some predator looking to disrupt or even derail a U.S. election.

But, that doesn’t mean elections technologies can’t be improved or be made more digital, and thereby more verifiable, more accurate, and more transparent. That’s exactly what the TrustTheVote Project is all about.

The open-source software and standards that we are developing and advocating will make online voter registration, digital poll books (used to check you in at your polling place) and (ultimately) casting and counting ballots better, faster, and more auditable.  And our software is designed to run on ordinary computer hardware – whether that is a tablet, a scanner, or laptop computer.  Adopting the TrustTheVote Project technology means there will no longer be a requirement for election administrators to acquire expensive, proprietary software or hardware with long-term costly service contracts.

Importantly, we believe there are many parts of elections administration that can benefit from digital innovations, which may or may not use the Internet in some way.  And we’re focusing on delivering those innovations.

However, for the foreseeable future, ballot casting and counting can be dramatically improved without needing to involve the Internet.

So, we should be cautious about the phrase “digital voting” in an age when all things digital tend to imply “Internet.”

All that observed, we really like how Plouffe ended his recent Wall Street Journal op-ed: “There are disrupters in every industry… the good ones won’t just apply the best practices of the private sector, but will also innovate and create on their own to meet their unique needs.”

The TrustTheVote Project intends to be one of those disrupters.  We add one tiny nuance: in our case, those “unique needs” are primarily those of our stakeholders—the state, county and city officials who run our elections. We won’t be running elections, they will, but we are thinking as far outside of the typical ballot box as we can when looking for opportunities to make voting easy, convenient, and ideally, a delight.

Money Shot: What Does a $40M Bet on Scytl Mean?

…not much we think.

Yesterday’s news of Microsoft co-founder billionaire Paul Allen’s investing $40M in the Spanish election technology company Scytl is validation that elections remain a backwater of innovation in the digital age.

But it is not validation that there is a viable commercial market for voting systems of the size typically attracting venture capitalists; the market is dysfunctional and small and governments continue to be without budget.

And the challenge of building a user-friendly, secure online voting system that simultaneously protects the anonymity of the ballot is an interesting problem that perhaps only an investor of the stature of Mr. Allen can afford to tackle.

We think this illuminates a larger question:

To what extent should the core technology of the most vital aspect of our Democracy be proprietary and black box, rather than publicly owned and transparent?

To us, that is a threshold public policy question, commercial investment viability issues notwithstanding.

To be sure, it is encouraging to see Vulcan Capital and a visionary like Paul Allen invest in voting technology. The challenges facing a successful elections ecosystem are complex and evolving and we will need the collective genius of the tech industry’s brightest to deliver fundamental innovation.

We at the TrustTheVote Project believe voting is a vital component of our nation’s democracy infrastructure and that American voters expect and deserve a voting experience that’s verifiable, accurate, secure and transparent.  Will Scytl be the way to do so?

The Main Thing

The one thing that stood out to us in the various articles on the investment was Scytl’s assertion of its security, backed by international patents on cryptographic protocols.  We’ve been around the INFOSEC space for a long time and know a lot of really smart people in the crypto field.  So we’re curious to learn more about their IP innovations.  And yet that assertion is actually a red herring to us.

Here’s the main thing: transacting ballots over the public packet-switched network is not simply about security.   It’s also about privacy; that is, the secrecy of the ballot.  Here is an immutable maxim about the digital world of security and privacy: there is an inverse relationship, which holds that as security is increased, privacy must be decreased, and vice versa.  Just consider any airport security experience: if you want maximum security, then you must surrender a bunch of privacy.  This is the main challenge of transacting ballots across the Internet, and why that transaction is so very different from banking online or looking at your medical record.

And then there is the entire issue of infrastructure.  We continue to harp on this, and still await a good answer.  If, by their own admission, the Department of Defense, Google, Target, and dozens of others have challenges securing their own data centers, how exactly can we be certain that a vendor on a cloud-based service model, or the in-house data center of a county or state, has any better chance of doing so? Security is an arms race.  Consider the news today about Heartbleed alone.

Oh, and please, for the sake of credibility, can the marketing machinery stop using the phrase “military grade security”?  There is no such thing.  And it has nothing to do with key sizes: say, growing from the standard 128-bit symmetric keys to RSA moduli of 1024 or 2048 bits.  128-bit symmetric keys are fine, and there is nothing military about them (other than that the military uses them).  Here is an interesting article from some years ago on the sufficiency of current crypto and the related marketing arms race.  Saying “military grade” is meaningless hype.  Besides, the security issues run far beyond the transit of data between machines.
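A little arithmetic shows why 128-bit symmetric keys are plenty (the attacker capacity below is a deliberately absurd, generous assumption):

```python
# Brute-forcing a 128-bit key: grant the attacker a trillion machines,
# each trying a trillion keys per second (far beyond anything real),
# and compute the expected time to search half the keyspace.
keyspace = 2 ** 128
guesses_per_second = 1e12 * 1e12          # deliberately absurd assumption
seconds_per_year = 60 * 60 * 24 * 365
years = (keyspace / 2) / guesses_per_second / seconds_per_year
print(f"about {years:.1e} years")         # millions of years
```

Which is the point: key length is not where real systems fail; software, operations, and people are.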

In short, there is much the public should demand to understand from anyone’s security assertions, international patents notwithstanding.  And that goes for us too.

The Bottom Line

While we laud Mr. Allen’s investment in what surely is an interesting problem, no one should think for a moment that this signals some sort of commercial viability or tremendous growth market opportunity.  Nor should anyone assume that throwing money at a problem will necessarily fix it (or deliver us from the backwaters of Government elections I.T.).  Nor should we assume that this somehow validates Scytl’s “model” for “security.”

Perhaps more importantly, while we need lots of attention, research, development and experimentation, the bottom line to us is whether the outcome should be a commercial proprietary black-box result or an open transparent publicly owned result… where the “result” as used here refers to the core technology of casting and counting ballots, and not the viable and necessary commercial business of delivering, deploying and servicing that technology.

A Northern Exposed iVoting Adventure

Alaska’s extension to its iVoting venture may have raised the interest of at least one journalist at one highly visible publication.  When we were asked for our “take” on this form of iVoting, we thought that we should also comment here on this “northern exposed adventure.” (Apologies to fans of the wacky mid-90s TV series of a similar name.)

Alaska has been among the states that allow military and overseas voters to return marked absentee ballots digitally, starting with fax, then email, and then adding a web upload as a third option.  Focusing specifically on the web-upload option, the question was: “How is Alaska doing this, and how do their efforts square with common concerns about security, accessibility, Federal standards, testing, certification, and accreditation?”

In most cases, any voting system has to run that whole gauntlet through to accreditation by a state, in order for the voting system to be used in that state. To date, none of the iVoting products have even tried to run that gauntlet.

So, what Alaska is doing, with respect to security, certification, and host of other things is essentially: flying solo.

Their system has not gone through any certification program (state, Federal, or otherwise, that we can tell); it hasn’t been tested by an accredited voting system test lab; and nobody knows how it does or doesn’t meet federal requirements for security, accessibility, and the other (voluntary) specifications and guidelines for voting systems.

In Alaska, they’ve “rolled their own” system.  It’s their right as a State to do so.

In Alaska, military voters have several options, and only one of them is the ability to go to a web site, indicate their vote choices, and have their votes recorded electronically — no actual paper ballot involved, no absentee ballot affidavit or signature needed. In contrast to the sign/scan/email method of returning an absentee ballot and affidavit (used in Alaska and 20 other states), this is straight-up iVoting.

So what does their experience say about all the often-quoted challenges of iVoting?  Well, of course in Alaska those challenges apply the same as anywhere else, and they are facing them all:

  1. insider threats;
  2. outsider hacking threats;
  3. physical security;
  4. personnel security; and
  5. data integrity (including that of the keys that underlie any use of cryptography)

In short, the Alaska iVoting solution faces all the challenges of digital banking and online commerce — the ones every financial services industry titan and eCommerce giant spends big $ on every year (capital and expense), and yet still routinely suffers attacks and breaches.

Compared to those technology titans of industry (banking, finance, technology services, or even the Department of Defense), how well are Alaskan election administrators doing on their (by comparison) shoestring budget?

Good question.  It’s not subject to annual review (like a bank’s SAS 70 IT operations audit), so we don’t know.  That also is their right as a U.S. state.  However, the fact that we don’t know does not debunk any of the common claims about these challenges.  Rather, it simply says that in Alaska they took on the challenges (which are large), and the general public doesn’t know much about how they’re doing.

To get a feeling for the risks involved, consider just one point: think about the handful of IT geeks who manage the iVoting servers where the votes are recorded and stored as bits on a disk.  They are not election officials, and they are no more entitled to stick their hands into paper ballot boxes than anybody else outside a county elections office.  Yet they have the ability (though not the authorization) to access those bits.

  • Who are they?
  • Does anybody really oversee their actions?
  • Do they have remote access to the voting servers from anywhere on the planet?
  • Using passwords that could be guessed?
  • Who knows?

They’re probably competent, responsible people, but we don’t know.  Not knowing any of that, every vote on those voting servers is actually a question mark — and that’s simply being intellectually honest.

Lastly, to get a feeling for the possible significance of this lack of knowledge, consider a situation in which Alaska’s electoral college votes swing an election, or where Alaska’s Senate race swings control of Congress (not far-fetched, given Murkowski’s close call back in 2010).

When the margin of victory in Alaska, for an election result that affects the entire nation, is a low four-digit number of votes, and the number of digital votes cast is similar, what does that mean?

It’s quite possible that that many digital votes could be cast in the next Alaska Senate race.  If the contest is that close again, think about the scrutiny those IT folks will get.  Will they be evaluated any better than every banking data center investigated after a data breach?  Any better than Target?  Any better than Google’s or Adobe’s IT management after having trade secrets stolen?  Or any better than the operators of military unclassified systems that for years were penetrated by hackers located in China, who may well have been supported by the Chinese Army or intelligence groups?

Probably not.

Instead, they’ll be lucky (we hope) like the Estonian iVoting administrators, when the OSCE visited back in 2011 to have a look at the Estonian system.  Things didn’t go so well.  The OSCE found that one guy could have undermined the whole system.  Good news: it didn’t happen.  Cold comfort: that one guy didn’t seem to have the opportunity — most likely because he and his colleagues were busier than a one-armed paper hanger during the election, worrying about Russian hackers attacking again after they had previously shut down the whole country’s Internet-connected government systems.

But so far the threat is remote, and it is still early days even for small-scale usage of Alaska’s iVoting option.  While the threat is still remote, it might be good for the public to see more about what’s “under the hood” and who’s in charge of the engine — that would be our idea of more transparency.

<soapbox>

Wandering off the Main Point for a Few Paragraphs
So, in closing, I’m going to run the risk of being a little preachy here (signaled by that faux HTML tag above), probably due, again, to the surge in media inquiries recently about how the Millennial generation intends to cast their ballots one day.  Lock and load.

I, and all of us here, are all for advancing the hallmarks of the Millennial mandate of the digital age: ease and convenience.  I am also keenly aware that there are wing-nuts looking for their Andy Warhol moment.  And whether enticed by anarchist rhetoric, their own reality distortion field, or, most insidious, the evangelism of a terrorist agenda (domestic or foreign), said wing-nut(s) — perhaps just for grins and giggles — might see an opportunity to derail an election (see my point above about a close race that swings control of Congress, or worse).

Here’s the deep concern: I’m one of those who believes that the horrific attacks of 9.11 had little to do with body count or the implosions of western icons of financial might.  The real underlying agenda was to determine whether it might be possible to cause a temblor of sufficient magnitude to take world financial markets seriously off-line, and whether doing so might cause a rippling effect of chaos in world markets, and what disruption and destruction that might wreak.  If we believe that, then consider the opportunity for disruption of the operational continuity of our democracy.

It’s not that we are Internet haters: we’re not — several of us came from Netscape and other technology companies that helped pioneer the commercialization of that amazing government and academic experiment we call the Internet.  It’s just that THIS Internet and its current architecture simply was not designed to be inherently secure or to ensure anyone’s absolute privacy (and strengthening one necessarily means weakening the other).

So, while we’re all focused on ease and convenience, and we live in an increasingly distributed democracy, and the Internet cloud is darkening the doorstep of literally every aspect of society (and now government too), great care must be taken as legislatures rush to enact new laws and regulations to enable studies, or build so-called pilots, or simply advance the Millennial agenda to make voting a smartphone experience.  We must be very careful and considerably vigilant, because it’s not beyond the realm of reality that some wing-nut is watching, cracking their knuckles in front of their screen and keyboard, mumbling, “Oh please. Oh please.”

Alaska has the right to venture down its own path in the northern territory, but in doing so it exposes an attack surface.  They need not (indeed, cannot) see this enemy from their back porch (I really can’t say of others).  But just because it cannot be identified at the moment doesn’t mean it isn’t there.

</soapbox>

One other small point:  As a research and education non-profit, we’re asked why we shouldn’t be “working on making Internet voting possible.”  Answer: perhaps in due time.  We do believe that on the horizon, responsible research must be undertaken to determine how we can offer an additional, digital alternative for casting a ballot, alongside the absentee and polling place experiences.  And that “digital means” might be over the public packet-switched network.  Or maybe some other type of network.  We’ll get there.  But candidly, our charge for the next couple of years is to update the outdated architecture of existing voting machinery and elections systems and bring about substantial, but still incremental, innovation that jurisdictions can afford to adopt, adapt, and deploy.  We’re taking one thing at a time and first things first; or as our former CEO at Netscape used to say, we’re going to “keep the main thing, the main thing.”

Onward
GAM|out

“Definitive” Word on Internet Voting? Look to NIST

As frequent readers will note, Internet voting is a discussion topic we increasingly tire of — there is so much else to do that, unlike Internet voting, can actually be done today!  Let’s talk instead about what tech innovations can do to speed up the long lines at polling places, for example.

But nevertheless, today, esteemed colleagues were asking for a “definitive statement” on i-voting, having been asked for one by folks who work for legislators interested in the topic.  So I thought I’d share what I like to say when legislators, staffers, and others ask for “definitive.”  Basically, three steps.

Look to the National Institute of Standards and Technology (NIST), the U.S. government’s top authority on technology issues. They’ve studied the use of the Internet for everything in elections, including voting.

  1. Read their conclusion on top half of page 69 of NIST’s “Threat Analysis of Voting Systems” (here), supported by analysis in pages 42-46.
  2. Also briefly flip through NIST’s 36 pages of “Information System Security Best Practices” (here) for Internet-connected election support systems.
  3. Think about whether state or local election officials would have the funding to comply with NIST guidelines, even for the lower bar of distributing blank ballots, much less the higher bar of Internet Voting.

That’s pretty much it.

— EJS

PS: While we’re on the topic, my thanks to Jeremy Epstein for tackling another topic on i-voting  (botnets, one among several i-voting topics that I am happy to leave to Jeremy and other colleagues in the security world), and for letting me put my 2 cents in for his Freedom To Tinker blog today. Thanks Jeremy, for doing the $0.98, and keep up the good work !  — ejs

Exactly Who is Delivering Postal Ballots? and Do We Care?

An esteemed colleague noted the news of the USPS stopping weekend delivery, part of the slow demise of the USPS, and asked: will we get to the point where vote-by-mail is vote-by-FedEx?  And would that be bad, having a for-profit entity acting as the custodian for a large chunk of the ballots in an election?

The more I thought about it, the more flummoxed I was.  I had to take off the geek hat and dust off the philosopher hat, looking at the question from a viewpoint of values, rather than (as would be my wont) requirements analysis or risk analysis.  It goes like this …

I think that Phil’s question is based on an assumption of some shared values among voters — all voters, not just those who vote by mail — that make postal voting acceptable because ballots are a “government thing” and so is the postal service.  Voting is in part an act of faith in government to be making a good-faith effort to do the job right, and to keep the operations above a minimum acceptable level of sanity.  It “feels OK” to hand a marked ballot to my regular neighborhood post(wo)man, but not to some stranger dropping off a box from a delivery truck.  Translate from value to feeling to expectation: it’s implied that we expect USPS staff to know that they have a special government duty in delivering ballots, and to work to honor that duty, regarding the integrity of those special envelopes as a particular trust, as well as their timely delivery.

  • Having re-read all that, it sounds so very 20th century, almost as antique as lever machines for voting.

I don’t really think that USPS is “the government” anymore, not in the sense that the journey of a VBM ballot is end-to-end inside a government operation.  I’m not sure that FedEx or UPS are inherently more or less trustworthy.  In fact, they all work for each other now!  And certainly, in some circumstances, the for-profit operations may feel more trustworthy to some voters — whether because of bad experiences with USPS, or because of living overseas in a country that surveils US citizens and operates the postal service.

Lastly, I think that many people do share the values behind Phil’s question — I know I do. The idea makes me wobbly. I think it comes down to this:

  • If you’re wobbly on for-profit VBM, then get back into the voting booth, start volunteering to help your local election officials, and if they are effectively outsourcing any election operations to for-profit voting system vendors, help them stop doing so.
  • If you’re not wobbly, then you’re part of a trend toward trusting — and often doing — remote voting with significant involvement from for-profit entities – and we know where that is headed.

The issue with USPS shows that in the 21st century, any form of remote voting will involve for-profits, whether it is Fedex for VBM, or Amazon cloud services for i-voting. My personal conclusions:

  • Remote voting is lower integrity no matter what, but gets more people voting because in-person voting can be such a pain.
  • I need to redouble my efforts to fix the tech so that in-person voting is not only not a pain, but actually more desirable than remote voting.

— EJS

D.C. Reality Check – The Opportunities and Challenges of Transparency

Gentle Readers:
This is a long article/posting.  Under any other circumstance it would be just too long.

There has been much written regarding the public evaluation and testing of the District of Columbia’s Overseas “Digital Vote-by-Mail” Service (D.C.’s label).  And there has been an equal amount of comment and speculation about technology supplied to the District from the OSDV Foundation’s TrustTheVote Project, and our role in the development of the D.C. service.  Although we’ve been mostly silent over the past couple of weeks, enough has now been determined that we can speak to all readers (media sources included) about the project from our side of the effort.

The coverage has been extensive, with over 4-dozen stories reaching over 370 outlets not including syndication.  We believe it’s important to offer a single, contiguous commentary, to provide the OSDV Foundation’s point of view, as a complement to those of the many media outlets that have been covering the project.

0. The Working Relationship: D.C. BoEE & TrustTheVote Project
Only geeks start lists with item “0,” but in this case it’s meant to suggest something “condition-precedent” to understanding anything about our work to put into production certain components of our open source elections technology framework in D.C. elections.  Given the misunderstanding of the mechanics of this relationship, we want readers to understand 6 points about this collaboration with the District of Columbia’s Board of Elections & Ethics (BoEE), and the D.C. I.T. organization:

  1. Role: We acted in the capacity of a technology provider – somewhat similar to a software vendor, but with the critical difference of being a non-profit R&D organization.  Just as has been the case with other, more conventional, technology providers to D.C., there was generally a transom between the OSDV Foundation’s TTV Project and the I.T. arm of the District of Columbia.
  2. Influence: We had very little (if any) influence over anything construed as policy, process, or procedure.
  3. Access: We had no access to, or participation in, D.C.’s IT organization and specifically its data center operations (including any physical entry or server log-in for any reason), for policy and procedural reasons.
  4. Advice: We were free to make recommendations and suggestions, and provide instructions and guidelines for server configurations, application deployment, and the like.
  5. Collaboration: We collaborated with the BoEE on the service design, and provided our input on issues, opportunities, challenges, and concerns, including a design review meeting of security experts at Google in Mountain View, CA early on.
  6. Advocacy: We advocated for the public review, cautioning that the digital ballot return aspect should be restricted to qualified overseas “UOCAVA” voters, but at all times, the BoEE, and the D.C. I.T. organization “called the shots” on their program.

And to go on record with an obvious but important point: we did not have any access to the ballot server, marked ballots, handling of voter data, or any control over any services for the same.  And no live data was used for testing.

Finally, we provided D.C. with several software components of our TTV Elections Technology Framework, made available under our OSDV Public License, an open source license for royalty-free use of software by government organizations.  Typical to nearly any deployment we have done or will do, the preexisting software did not fit seamlessly with D.C. election I.T. systems practices, and we received a “development grant” to make code extensions and enhancements to these software components, in order for them to comprise a D.C.-specific system for blank ballot download and an experimental digital ballot return mechanism (see #7 below).

The technology we delivered had two critically different elements and values.  The first, the “main body of technology,” included the election data management, ballot design, and voter user interface for online distribution of blank ballots to overseas voters.  With this in hand, the BoEE has acquired a finished MOVE Act compliant blank ballot delivery system, plus significant components of an innovative new elections management system that they own outright, including the source code and the right to modify and extend the system.

For this system, BoEE obtained the pre-existing technology without cost; and for D.C-specific extensions, they paid a fraction of what any elections organization can pay for a standard commercial election management system with a multi-year right-to-use license including annual license fees.

D.C.’s acquired system is also a contrast to the more than 20 other States that are piloting digital ballot delivery systems with DoD funding, but only for a one-time trial use.  Unlike D.C., if those States want to continue using their systems, they will have to find funding to pay for on-going software licenses, hosting, data center support, and the like.  There is no doubt that a comparison shows the D.C. project has saved the District a significant amount of money over what they might have had to spend for ongoing support of overseas and military voters.

That noted, the other (second) element of the system – digital return of ballots – was an experimental extension to the base system that was tested prior to possible use in this year’s November election.  In testing, the experiment failed to achieve the level of integrity necessary to take it into the November election.  This experimental component has been eliminated from the system used this year.  The balance of this long article discusses why that is the case, what we saw from our point of view, and what we learned from this otherwise successful exercise.

1. Network Penetration and Vulnerabilities
There were two types of intrusions as a result of an assessment orchestrated by a team at the University of Michigan led by Dr. Alex Halderman, probing the D.C. network that had been made available for public inspection.  The first was at the network operations level.  During the time that the Michigan team was testing the network and probing for vulnerabilities, they witnessed what appeared to be intrusion attempts originating from machines abroad, in headline-generating countries such as China and Iran.  We anticipate soon learning from the D.C. IT Operations leaders what network security events actually transpired, because a detailed review is underway.  And more to that point, these possible network vulnerabilities, while important for the District IT operations to understand, were unrelated to the actual application software that was deployed for the public test, which involved a mock election, mock ballots, and fictitious voter identities provided to testers.

2. Server Penetration and Vulnerabilities
The second type of intrusion was directly on the District’s (let’s call it) “ballot server,” through a vulnerability in the software deployed on that server. That software included: the Red Hat Linux server operating system; the Apache Web server with standard add-ons; the add-on for the Rails application framework; the Ruby-on-Rails application software for the ballot delivery and return system; and some 3rd party library software, both to supplement the application software, and the Apache software.

The TrustTheVote Project provided 6 technology assets (see below, Section 7) to the BoEE project, plus a list of requirements for “deployment;” that is, the process of combining the application software with the other elements listed above, in order to create a working 3-tier application running on 3 servers: a web proxy server, an application server, and a database server.  One of those assets was a Web application for providing users with a correct attestation document and the correct blank ballot, based on their registration records.  That was the “download” portion of the BoEE service, similar to the FVAP solutions that other states are using this year on a try-it-once basis.

3. Application Vulnerability
Another one of those technology assets was an “upload” component, which performed fairly typical Web application functions for file upload, local file management, and file storage – mostly relying on a 3rd-party library for these functions.  The key D.C.-specific function was to encrypt each uploaded ballot file to preserve ballot secrecy.  This was done using the GPG file encryption program, with a command shell to execute GPG with a very particular set of inputs.  One of those inputs was the name of the uploaded file. 

And here was the sticking point.  Except for this file-encryption command, the library software largely performed the local file management functions.  This included the very important function of renaming the uploaded file to avoid giving users the ability to define file names on the server.  Problem: during deployment, a new version of this library software package was installed, in which the file name checks were not performed as expected by the application software.  Result: carefully crafted file names, inserted into the shell command, gave attackers the ability to execute pretty much any shell command, with the userID and privileges of the application itself.

Just as the application requires the ability to rename, move, encrypt, and save files, the injected commands could also use the same abilities.  And this is the painfully ironic point: the main application-specific data security function (file encryption), by incorrectly relying on a library, exposed those ballot files (and the rest of the application) to external tampering.
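To make this failure mode concrete, here is a minimal sketch of the vulnerability class in Ruby (the application’s language).  This is illustrative only, not the actual D.C. code; the GPG recipient name and file names are invented:

```ruby
require "shellwords"

# UNSAFE sketch: interpolating an uploaded file's name directly into a
# shell command string. A filename such as "ballot.pdf; touch /tmp/pwned"
# becomes two commands once the shell parses the string.
def encrypt_command_unsafe(filename)
  "gpg -e -r election-board -o #{filename}.gpg #{filename}"
end

# SAFER sketch: escape shell metacharacters before interpolation.
# (Better still, pass an argv array to Kernel#system so no shell is
# involved at all, and independently rename every upload server-side.)
def encrypt_command_safe(filename)
  f = Shellwords.escape(filename)
  "gpg -e -r election-board -o #{f}.gpg #{f}"
end

evil = "ballot.pdf; touch /tmp/pwned"
puts encrypt_command_unsafe(evil)  # the injected command survives intact
puts encrypt_command_safe(evil)    # metacharacters are neutralized
```

The deeper fix, as the post itself notes, is defense in depth: rename every upload to a server-generated name before any shell is involved, rather than trusting a library to keep doing that for you across version upgrades.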

4.  Consequences
The Michigan team was creative in their demonstration of the results of attacking a vulnerability in what Halderman calls a “brittle design,” a fair critique common to nearly every Web application deployed using application frameworks and application servers.  In such a design, the application and all of its code operates as a particular userID on the server.  No matter how much a deployment constrains the abilities of that user and the code running as that user, the code, by definition, has to be able to use the data that the application manages.

Therefore, if there is a “chink” in any of the pieces of the collective armor (e.g., the server, its operating system, web server, application platform, application software, or libraries), or the way they fit together, then that “chink” can turn use into abuse.  That abuse applies to any and all of the data managed by the application, as well as the file storage used by the application.  As the Michigan team demonstrated, this general rule also applies specifically when the application data includes ballot files.

5.  Mea Culpa
Let’s be clear about the goof we made: “our bad” in the application development was not anticipating a different version of the 3rd-party library, and not locking in the specific version that did perform the file-name checking we assumed would prevent exactly this type of vulnerability.  In fact, we learned 4 valuable lessons from this stumble:

  1. Factoring Time:  Overly compressed schedules will almost certainly ensure a failure point is triggered.  This project suffered from a series of cycle-time issues in getting stuff requisitioned, provisioned, and configured, and other intervening issues for the BoEE, including their Primary election which further negatively impacted the time frame.  This led to a very compressed amount of time to stage and conduct this entire exercise;
  2. Transparency vs. Scrutiny:  The desired public transparency put everyone involved in a highly concentrated light of public scrutiny, and the margins of otherwise tolerable error allowed during a typical test phase were nonexistent in this setting – even the slightest oversight, of the kind typically caught in a normal testing phase, was treated as an intolerable fault, as if the Pilot were already in production;
  3. (Web) Application Design:  Web applications for high-value, high-risk data require substantial work to avoid brittleness.  Thankfully, none of the TrustTheVote Elections Technology Framework will require an Internet-connected Web application or service – so the third lesson is a measure of relief for us going forward; and
  4. No Immunity from Mistake: Even the most experienced professionals are not immune from mistake or misstep, especially when working under very skeptical public scrutiny and a highly compressed time schedule; our development team, despite a combined total of 7 decades of experience, was no exception.

So, we learned some valuable lessons from this exercise. We still believe in the public transparency mandate, and fully accept responsibility for the goof in the application development and release engineering process.
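The fix for that version-locking goof is mundane but worth showing.  In a Ruby on Rails application, dependency versions are pinned in the Gemfile; this is a hypothetical sketch with invented gem names, not the actual D.C. dependency list:

```ruby
# Gemfile sketch -- hypothetical gem names, for illustration only.

# Risky: an unpinned gem floats to whatever version is current at deploy
# time, which is exactly how file-name-checking behavior can silently change:
# gem "upload_helper"

# Exact pin: deployment installs precisely the version that was tested.
gem "rails", "2.3.5"
gem "upload_helper", "1.2.4"

# Pessimistic constraint: allow patch releases only (>= 0.9.1, < 0.10).
gem "gpg_wrapper", "~> 0.9.1"
```

Bundler’s `Gemfile.lock`, checked into source control, takes this further by freezing the entire transitive dependency tree, so test and production environments resolve to identical versions.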

Now, there is more to say about some wholly disconnected issues regarding other discovered network vulnerabilities, completely beyond our control (see #0 above), but we’ll save comment on that until after the D.C. Office of the CTO completes their review of the Michigan intrusion exercise.   Next, we turn attention to some outcomes.

6. Outcomes
Let’s pull back up to the 30-thousand foot level, and consider what the discussion has been about (leaving aside foreign hackers).  This test revealed a security weakness of a Web application framework: how there can be flaws in application-specific extensions to routine Web functions like file upload, including flaws that can put those functions and files at risk.  Combine that with the use of Web applications for uploading files that are ballots.  Then, the discussion turns on whether it is possible (or prudent) to try to field any Web application software, or even any other form of software, that transfers marked ballots over the Internet.  We expect that discussion to vigorously continue, including efforts that we’d be happy to see towards a legislative ruling on the notion, such as Ohio’s decision to ban digital ballot transfer for overseas voting or North Carolina’s recent enthusiastic embrace of it.

However, public examination, testing, and the related discussions and media coverage, were key objectives of this project.  Rancorous as that dialogue may have become, we think it’s better than the dueling monologues that we witnessed at the NIST conference on overseas digital voting (reported here earlier).

But this is an important discussion, because it bears on an important question about the use of the Internet, which could range from (a) universal Internet voting as practiced in other countries (which nearly everyone in this discussion, including the OSDV Foundation, agrees is a terrible idea for the U.S.), to (b) the type of limited-scope usage of the Internet that may be needed only for overseas and military voters who really have time-to-vote challenges, or (c) limited only to ballot distribution.  For some, the distinction is irrelevant.  For others, it could be highly relevant.  For many, it is a perilous slippery slope.  It’s just barely possible that worked examples and discussion could actually lead to sorting out this issue.

The community certainly does have some worked examples this year, not just the D.C. effort, and not just DoD’s FVAP pilots, but also other i-Voting efforts in West Virginia and elsewhere.  And thankfully, we hear rumors that NIST will be fostering more discussion with a follow-up conference in early 2011 to discuss what may have been learned from these efforts in 2010.  (We look forward to that, although our focus returns to open source elections technology that has nothing to do with the Internet!)

7. Our Technology Contributions
Finally, for the record, below we catalog the technology we contributed to the District of Columbia’s Overseas “Digital Vote-by-Mail” service (again, their label).  If warranted, we can expand on this, another day.  The assets included:

  1. Three components of the open source TrustTheVote (TTV) Project Elections Technology Framework: [A] the Election Manager, [B] the Ballot Design Studio, and [C] the Ballot Generator.
  2. We augmented the TTV Election Manager and TTV Ballot Design Studio to implement D.C.-specific features for election definition, ballot design, and ballot marking.
  3. We extended some earlier work we’ve done in voter record management to accommodate the subset of D.C. voter records to be used in the D.C. service, including the import of D.C.-specific limited-scope voter records into an application-specific database.
  4. We added a Web application user experience layer on top of that, so that voters can identify themselves as matching a voter database record, and obtain their correct ballot (the application and logic leading up to the blank ballot “download” function referred to above) and to provide users with content about how to complete the ballot and return via postal or express mail services.
  5. We added a database extension to import ballot files (created by the TTV Ballot Generator), using a D.C.-specific method to connect them to the voter records in order to provide the right D.C.-specific ballot to each user.
  6. We added the upload capability to the web application, so that users could choose the option of uploading a completed ballot PDF; this capability also included the server-side logic to encrypt the files on arrival.

All of these items, including the existing open-source TTV technology components listed in 7.1 above, together with the several other off-the-shelf open-source operating system and application software packages listed in Section 2 above, were integrated by D.C.’s IT group to comprise the “test system” that we’ve discussed in this article.

In closing, needless to say, (but we do so anyway for the record) while items 7.1—7.5 can certainly be used to provide a complete solution for MOVE Act compliant digital blank ballot distribution, item 7.6 is not being used for any purpose, in any real election, any time soon.

One final point worth re-emphasizing: real election jurisdiction value from an open source solution.

The components listed in 7.1—7.5 above provide a sound on-going production-ready operating component to the District’s elections administration and management for a fraction of the cost of limited alternative commercial solutions.  They ensure MOVE Act compliance, and do not require any digital ballot return.  And the District owns 100% of the source code, which is fully transparent and open source.  For the Foundation in general, and the TrustTheVote Project in particular, this portion of the project is an incontrovertible success of our non-profit charter and we believe a first of its kind.

And that is our view of D.C.‘s project to develop their “Digital Vote-by-Mail” service, and test it along with the digital ballot return function.  Thanks for plowing through it with us.

UOCAVA Remote Voting Workshop Makes a Strong Finish

24 hours ago, I, along with some others, was actually considering asking for a refund.  We had come to the EAC, NIST, and FVAP co-hosted UOCAVA Remote Voting Systems 2-day Workshop expecting to feast on some fine discussions about the technical details and nuances of building remote voting systems for overseas voters that could meet the demands of security and privacy.  Instead, we had witnessed an intellectual food fight of ideology.

That all changed in a big way today.

The producers and moderators of the event, I suspect sensing the potential side effects of yesterday’s outcome, came together, somehow collectively made some adjustments (in moderation techniques, approach, and topic tweaking), and pulled off an excellent, informative day full of the kind of discourse I willingly laid down money (the Foundation’s money no less) to attend in the first place.

My hat is off; NIST and the EAC on the whole did a great job with a comeback performance today that nearly excused all of what we witnessed yesterday.  Today, they exhibited self-deprecating humor, and even had elections officials playing up their drunk-driver characterization from the day before.

Let me share below what we covered; it was substantive.  It was detailed.  And it was tiring, but in a good way.  Here it is:

Breakout Session – Voter Authentication and Privacy

–Identified voter authentication and privacy characteristics and risks of the current UOCAVA voting process.

–Identified potential risks related to voter authentication and privacy of remote electronic absentee voting systems. For example, the group considered:

  • Ballot secrecy
  • Coercion and/or vote selling
  • Voter registration databases and voter lists
  • Strength of authentication mechanisms
  • Susceptibility to phishing/social engineering
  • Usability and accessibility of authentication mechanisms
  • Voter autonomy
  • Other potential risks

–Considered measures and/or criteria for assessing and quantifying identified risks and their potential impacts.

  • How do these compare to those of the current UOCAVA voting processes?

–Identified properties or characteristics of remote digital absentee voting systems that could provide authentication mechanisms and privacy protections comparable to those of the current UOCAVA voting process.

–Considered currently available technologies that can mitigate the identified risks. How do the properties or characteristics of these technologies compare to those of the current UOCAVA voting process?

–Started to identify and discuss emerging or future research areas that hold promise for improving voter authentication and/or privacy.  For example:

  • Biometrics (e.g., speaker voice identification)
  • Novel authentication methods

–Chatted about cryptographic voting protocols and other cryptographic technologies

Breakout Session – Network and Host Security

–Identified problems and risks associated with the transmission of blank and voted ballots through the mail in the current UOCAVA voting process.

–Identified risks associated with electronic transmission or processing of blank and voted ballots.  For example, the breakout group considered:

  • Reliability and timeliness of transmission
  • Availability of voting system data and functions
  • Client-side risks to election integrity
  • Server-side risks to election integrity
  • Threats from nation-states
  • Other potential risks

–Considered and discussed measures and/or criteria for assessing and quantifying identified risks and their potential impacts.

  • How do these compare to those of the current UOCAVA voting process?

–Identified properties or characteristics of remote digital absentee voting systems that could provide for the transmission of blank and voted ballots at least as reliably and securely as the current UOCAVA voting process.

–Discussed currently available technologies that can mitigate the identified risks and potential impact.

  • How do the properties and characteristics of these technologies compare to those of the current UOCAVA voting process?

–Identified and discussed emerging or future research areas that hold promise for improving network and host security.  For example:

  • Trusted computer and trusted platform models
  • End point security posture checking
  • Cloud computing
  • Virtualization
  • Semi-controlled platforms (e.g., tablets, smart phones, etc.)
  • Use of a trusted device (e.g., smart card, smart phone, etc.)

As you can see, there was a considerable amount of information covered in each 4 hour session, and then the general assembly reconvened to report on outcomes of each breakout group.

Did we solve any problems today?  Not so much.  Did we make great strides in challenge identification, guiding-principles development, and framing the issues that require more research and solution formulation?  Absolutely.

Most importantly, John Sebes, our CTO, and I gained a great deal of knowledge we can incorporate into the work of the TrustTheVote Project, had some badly needed clarifying discussions with several attendees, and feel we are moving in the right direction.

We clarified where we stand on the use of the Internet in elections: it's not yet time for anything beyond tightly controlled experimentation, and there is a lack of understanding of the magnitude of resources required to stand up sufficiently hardened data centers to make it work, let alone of the problems at the edge.

And we feel we made some small contributions to helping the EAC and NIST figure out the kind of test pilot they wish to stand up as a guiding-principles reference model sometime over the next two years.

Easily a day’s work for the 50-60 people in attendance over the two days.

Back to the west coast (around 3am for my Pacific colleagues) 😉

It's a wrap
GAM|out

Remote Voting Technology Workshop Wanders the Edge of an Intellectual Food Fight

[Note: This is a personal opinion piece and does not necessarily reflect the position of the Foundation or TrustTheVote Project.]

I should have seen this coming.  What was I thinking or expecting?

I am reporting this evening from the NIST Workshop on UOCAVA Remote Voting Systems here in Washington, D.C.  After a great set of meetings earlier today on other activities of the Foundation (which we'll have more to say about soon, but which had nothing to do with our contributions to the District's UOCAVA voting pilot), I arrived at the Wardman Park Marriott near the Naval Observatory (home of the Vice President) for the Workshop, having unfortunately missed the morning sessions.  I had barely made it into the lobby when I got my first taste of what was being served.

My first exposure to the workshop (by then on lunch break) was witnessing a somewhat heated discussion between members of the Verified Voting Foundation and Rokey Suleman, Director of Elections for the District of Columbia.  Apparently, a speaker (identity is irrelevant) of noted authority had delivered a talk before lunch in which he spoke rather condescendingly toward elections officials (likening them to “drunk drivers”).

Mr. Suleman was explaining that so far the meeting appeared to be a waste of his time (principally because of such ad hominem remarks).  Those from the Verified Voting Foundation seemed unwilling to acknowledge that this speaker had (however unintentionally) denigrated the hard work of elections officials (as several others later relayed to me they too perceived), emphasizing instead that this individual was, “The nicest person who would never intend such a thing.”

Diplomacy 101 teaches: Perception equals reality.

Rather, they seemed to cling to the fact that this speaker was such an authority (and strictly speaking, the person who made the drunk-driving reference is in fact a technical authority) that the comment should be overlooked.

The argument devolved from there; the substance of which is irrelevant.  What is relevant, however, is that in the very next session after lunch, another argument broke out over legal details of the letter of the UOCAVA law(s) and the related promulgated regulations enacting new aspects of overseas voting that enable (among other things) the digital delivery of blank ballots, and – arguably – the opportunity to pilot a means of digital return.

By the way: have I mentioned that this workshop is supposed to be about UOCAVA remote voting, which is limited to a qualified subset of the overseas population, and not the unrestricted, widespread, so-called “Internet voting?”  And yet, an uninformed onlooker could reasonably believe that battle lines were being drawn over the general notion of Internet voting on the basis of the so-called “slippery slope” argument.  (Note: I'll leave it to trained philosophers to explain why that argument is illogical in its own right in most applications.)  So, take a look at the Workshop description and draw your own conclusions.

The discussion seems overly trained on the possibility and potential of compromise, and nowhere near a discussion of probability.  What's more, so far I'm hearing nothing of the technical challenges we need to address, or of how to address them (only an official from the Okaloosa Distance Balloting Pilot attempted to offer any such presentation or agenda).

Instead, I keep hearing the rhetoric of avoidance, both in and outside of sessions.  But the Internet has darkened the doorstep of nearly every aspect of society today.  Why does it feel like we're fooling ourselves into believing that somehow this cloud won't also darken the doorstep of elections in a digital age?  Unfortunately, it already has; and future generations may well demand it.  However, that's a discussion for another venue; we're supposed to be exploring remote voting solutions for qualified overseas voters.

Let me say once again:

The Foundation and TrustTheVote Project do NOT support the widespread use of the Internet for the transaction of voting data.

That restated, as far as the Internet playing any role in elections is concerned, it seems to me that we need to look carefully at how to address this challenge, scourge, or whatever we want to call it, rather than try to abolish or avoid it.  Had that mentality been applied to sending a man to the moon, this nation never would have achieved six successful lunar landings out of seven attempts.

But again, arguing over what role the Internet should or should not play in elections is not why I am here.  Intellectually honest discourse on the challenges and opportunities of UOCAVA remote voting solutions is why I am attending.  And I hoped I would witness (and participate in) a healthy discussion of the technical challenges beyond encryption debates and ideas on how to address them.

So far, I have not.

Instead, what I have is a seat at an intellectual food fight.  Notwithstanding a few interesting comments, speakers, and hallway chats, this so far is sadly a near waste of time (and money).  As one election official put it to me at this evening's no-host reception:

Today reminds me of an observation by Nick Bostrom, an Oxford philosopher: there is absolute certainty that the universe we live in is artificial, because that's the only logical conclusion you can reach when you exclusively calculate possibilities without any consideration of probabilities.

Thankfully, we (at the Foundation) have much to work on regarding the use of computers in real world elections that has nothing to do with the transport layer.  Outside of these workshops, we don’t intend to address Internet solutions in our work in any significant manner.

And thankfully more, we had some very positive meetings this morning that validated the potential of our work to actually deliver publicly owned critical democracy infrastructure for accurate, transparent, trustworthy, and secure elections.

Tomorrow is another day; we’ll see what happens, and I’ll report back.
GAM|out