Rocket Science.
It doesn’t take rocket science to understand the essentials of systems integrity. But wait a minute! It does take rocket science to properly make high-integrity systems.

Ohio, a state rapidly gaining more notoriety than Florida over the issue of elections, is once again in the news. The state’s history is problematic on two fronts: the machinery itself, with concerns of possible tampering, and the processes around it, with disparities in access, long lines, and the like. I note that these links point to a Wikipedia page that is not without controversy; I have no interest in diving into that, but leave it to readers to draw their own conclusions.

I want to focus on the latest topic of that debate: a 334-page voting systems report released on Friday, 07.Dec, by the Ohio Secretary of State. The report was prepared under the “Evaluation and Validation of Election-Related Equipment, Standards and Testing” (EVEREST) project, with support and resources from Penn State, the University of Pennsylvania, and WebWise Security.

In short, the sum and substance of the tome describes a system so compromised that it’s difficult to imagine how any vote tally could be considered anything other than a “good guess.”

Steven Teppler, a professional colleague of mine on the American Bar Association’s Science & Technology Law Division – Information Security Committee, and a prolific commentator within the ABA on matters of voting technology law, was on this story early. In an ABA forum post he noted that a UK corporate IT publication, The Register, ran a news story about the report, quoting an executive from Premier Election Solutions as cautioning people not to read too much into it. Excerpting from the Register story:

“It is important to note that there has not been a single documented case of a successful attack against an electronic voting system, in Ohio or anywhere in the United States,” an executive for Premier said in response to the report. “Even as we continue to strengthen the security features of our voting systems, that reality should not be lost in the discussion.”

Strictly speaking, Premier may be right, although I find the assertion unconvincing (the old adage applies: we don’t know what we don’t know), especially when you consider that the quote expects us to ignore the specific results of the source code analysis, the penetration testing, and the processes on the part of the administrator/lessees of the equipment. Consider:

ES&S

1. Failure to protect election data and software

2. Failure to effectively control access to election operations

3. Failure to correctly implement security mechanisms

4. Failure to follow standard software and security engineering practices

Premier

1. Failure to effectively protect vote integrity and privacy

2. Failure to protect elections from malicious insiders

3. Failure to validate and protect software

4. Failure to follow standard software and security engineering practices

5. Failure to provide trustworthy auditing

Hart Intercivic

1. Failure to effectively protect election data integrity

2. Failure to eliminate or document unsafe functionality

3. Failure to protect elections from "malicious insiders"

4. Failure to provide trustworthy auditing

The Administrator/Lessee

1. Failure to protect against insiders

2. Failure to follow "standard and well known practices" for crypto, key management, and security hardware

3. Failure to provide a trustworthy auditing capability, making it "difficult" to discover when an attack occurs

4. Deeply flawed software maintenance practices
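The "trustworthy auditing" failure that recurs across these findings has a well-known remedy: tamper-evident logging. A minimal sketch (purely illustrative, not drawn from any vendor's code or from the report itself) shows the idea of hash-chaining log entries so that any after-the-fact edit by an insider breaks the chain and is detectable on verification:

```python
import hashlib

GENESIS = "0" * 64  # placeholder hash anchoring the start of the chain

def append_entry(log, event):
    """Append an event, chaining it to the previous entry's hash."""
    prev_hash = log[-1][1] if log else GENESIS
    entry_hash = hashlib.sha256((prev_hash + event).encode()).hexdigest()
    log.append((event, entry_hash))

def verify_log(log):
    """Recompute the chain; a tampered entry invalidates every later hash."""
    prev_hash = GENESIS
    for event, entry_hash in log:
        expected = hashlib.sha256((prev_hash + event).encode()).hexdigest()
        if expected != entry_hash:
            return False
        prev_hash = entry_hash
    return True

log = []
append_entry(log, "ballot-cast:precinct-12")
append_entry(log, "ballot-cast:precinct-12")
print(verify_log(log))  # True: chain intact

# A malicious insider rewrites an entry without recomputing the chain:
log[0] = ("ballot-cast:precinct-99", log[0][1])
print(verify_log(log))  # False: tampering detected
```

A real system would additionally need write-once storage or externally anchored hashes (otherwise an insider could simply rebuild the whole chain), but even this basic discipline addresses the report's complaint that attacks are "difficult" to discover.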

Good grief, friends, this isn’t rocket science. Oh, wait a minute, it is rocket science! That’s just the point: if real high-assurance engineering methodologies (you know: the methods used by the aerospace industry and the military, to name a couple) had been employed, systems like this would never have been produced in the first place.

As Teppler aptly summed it up, the executive at Premier is essentially saying, “We can build a house with twelve doors and eleven locks, and we can have it certified as safe because there has been no documented case of a successful intrusion.”

I read the Ohio report as a specification of what not to do as we begin to think about platform architecture at the OSDV Foundation.

GAM|out