
Designing Interfaces for Voting Machines

Benjamin B. Bederson
Computer Science Department
Human-Computer Interaction Lab
University of Maryland

www.cs.umd.edu/~bederson
[email protected]

February 4, 2005

Frustrated voters

Voting technology and ballot design can influence election outcomes

Minorities and the poor are more likely to cast their ballots on outdated systems

Technology is in need of updating

When Interfaces Get in the Way

Ballot design: butterfly ballot

Interaction: hanging chad; changing a vote (i.e., how to unselect a candidate)

Write-in problems:

2004 – NY Times editorial reported on the San Diego mayoral election, where voters for write-in candidate Frye didn't darken the bubble.

2002 – Mt. Airy, MD mayoral result went from Holt to Johnson to Holt, depending on which spellings were accepted.

Usability Part of Larger Issues

Florida 2000 – traditional technologies flawed:

Mechanical levers – break down; difficult to maintain; difficult to store and transport

Paper ballots – errors; difficult to process and interpret

Punch cards – hanging chad, etc.

Economics de-emphasizes usability
Focus on security de-emphasizes usability
Lack of research because of proprietary systems and the number of designs

Our Study

Funded by:

NSF (National Science Foundation), Grant #0306698 – “Project to Assess Voting Technology and Ballot Design”

Carnegie Corporation, Grant #D05008

Consists of:

Expert review <= Focus today
Lab study <= Focus today
New technology <= Focus today
Field test
Natural experiments

Co-researchers:

Paul Herrnson – Univ. of Maryland (project leader)
Michael Traugott & Fred Conrad – Univ. of Michigan
Richard Niemi – Univ. of Rochester

Small-scale studies to demonstrate potential challenges and inform future research

Does not address accuracy, affordability, accessibility, durability, or ballot design

This represents partial results midway through a 3-year study. Future work will address accuracy, ballot design, and more.

Partners

Federal Election Commission (FEC)
Maryland State Board of Elections
National Institute of Standards and Technology (NIST)

Vendors:
Diebold, Hart InterCivic, ES&S, NEDAP, Avante

Advisory Board

Machines Looked At

Avante Vote Trakker
Diebold AccuVote TS
ES&S Optical Scan
Hart eSlate
NEDAP LibertyVote
UMD Zoomable system

As available for testing.

Some machines have been deployed with different options.

Some machines have since been updated.

Vendors (except NEDAP) implemented ballots for best presentation.

Machines selected to represent specific features

Avante Vote Trakker

All photos taken by our research group – not provided by vendors.

Diebold AccuVote TS

ES&S Optical Scan

Hart eSlate

NEDAP LibertyVote

UMD Zoomable System
www.cs.umd.edu/~bederson/voting

Demo

Expert Review

12 HCI experts, one evening:

1 voting interaction specialist
1 government usability practitioner
5 academic HCI researchers
6 private usability practitioners

Each used 6 machines
2 ballot types where available (office block, party column)
~15 minutes each

Asked to list concerns
Followed worst-case perspectives of:

novice voters
poor language skills
older voters
stressed voters
system errors

Most experts did not have background in voting systems

Subjective responses require interpretation

Expert Review Rating System

Each issue given a severity rating (1 = low, 5 = high)

Concerns listed with average severity, # of instances
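The tabulation shown on the following slides is straightforward to reproduce; here is a minimal sketch in Python (the `summarize_concerns` helper and the sample reports are illustrative, not the study's actual tooling or data):

```python
from collections import defaultdict

def summarize_concerns(reports):
    """reports: iterable of (concern, severity) pairs from individual
    experts, with severity on a 1 (low) to 5 (high) scale.
    Returns rows of (average severity, number of instances, concern),
    sorted with the most severe concerns first."""
    by_concern = defaultdict(list)
    for concern, severity in reports:
        by_concern[concern].append(severity)
    rows = [(sum(s) / len(s), len(s), c) for c, s in by_concern.items()]
    return sorted(rows, reverse=True)

# Hypothetical reports for one machine:
reports = [
    ("No previous button", 4), ("No previous button", 4),
    ("Write-in requires last name", 5),
]
for avg, n, concern in summarize_concerns(reports):
    print(f"{avg:.1f} {n} {concern}")
```

Each output row matches the "average severity, number of instances" format used in the slides that follow.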

Avante VoteTrakker

Concerns (average severity, number of instances)

5.0 1 Write-in requires last name
4.0 2 Record shown too fast and without instructions
4.0 2 No previous button [1]
3.0 2 Auto-forward confusing [1]
3.0 1 Smiley face inappropriate
3.0 1 Title too small
3.0 1 Instruction text poorly written
3.0 1 Didn't like this one at all
3.0 1 "Cast ballot to continue" not clear – it actually finishes

[1] Navigation focuses on progress with later review, by design

Avante VoteTrakker (more I)

Concerns (average severity, number of instances)

3.0 1 Timed out, but didn't see warning
3.0 1 Angle of machine is awkward
3.0 1 Lot of reflection on screen
3.0 1 Flashing instruction is distracting
3.0 1 Colors of text poor (green/white, black/blue)
3.0 1 No progress feedback
3.0 1 No way to cancel and leave [2]
3.0 1 No way to start over
3.0 1 "Please make selection" message is distracting
3.0 1 No error-checking on write-in

[2] Can time out to cancel

Avante VoteTrakker (more II)

Concerns (average severity, number of instances)

3.0 1 Write-in association very small
3.0 1 No way to go to end and cast ballot [3]
3.0 1 Lack of color on amendment screen may appear to be an error
3.0 1 Disabled button is "white," which is very difficult to understand
3.0 1 Cast ballot button requires 2 presses
3.0 1 Can't say "no" to paper record – so why bother?
3.0 1 Have to pick contrast/text size before starting
3.0 1 No instructions after starting
2.0 1 Not clear what to do at beginning

[3] By design, to minimize under-votes

Diebold AccuVote TS

Concerns (average severity, number of instances)

5.0 1 Ballot review confusing. Review colors don't match voting colors

5.0 1 No help on some screens

5.0 1 Write-in has no instructions

4.0 1 Contrast and text size controls not clear

4.0 1 Some font colors unclear (black on blue, red/blue)

4.0 1 Party not clearly indicated

4.0 1 Difficult to use while seated

4.0 1 Large font is good, but "issues" text runs over screen display area requiring arrow navigation

3.0 2 Wait icon is too computerish and not clear

3.0 1 Card hard to enter

Diebold AccuVote TS (more)

Concerns (average severity, number of instances)

3.0 1 Poor depiction of voting vs. reviewing state

3.0 1 "Card not inserted" error needs a diagram

3.0 1 Buttons have poor visual affordance

3.0 1 Instructions refer to "backspace" key, but is actually labeled "back"

3.0 1 Instructions unclear (i.e., "Vote for one")

3.0 1 Some text unclear (i.e., "2 of 4")

3.0 1 Multiple write-in unclear

3.0 1 Write-in not well associated with race being voted

1.0 1 Extra dots on help/instruction screens

ES&S Optical Scan

Concerns (average severity, number of instances)

5.0 1 Instructions not mandatory, errors likely
5.0 1 Write-in has high error mode (enter name, but not fill in circle)
4.0 2 Changing-vote process is punitive – must start over, which could cause some to give up
4.0 1 Poor visual grouping (title could be associated with items below)
4.0 1 Could fold, bend, or tear ballot
4.0 1 No instructions to review ballot before submitting
4.0 1 Instructions to turn over page not conspicuous enough
3.5 2 Font size is fixed, and will be too small for some older and other voters
3.5 2 No error checking on under-vote
3.0 2 No error checking on over-vote

ES&S Optical Scan (more)

Concerns (average severity, number of instances)

3.0 1 Should use different highlight/feedback that vote was correct
3.0 1 Why two sets of matching instructions?
3.0 1 Instructions somewhat difficult for voters with limited English proficiency
3.0 1 Instructions should say something about no extra marks on ballot
2.7 3 Needs a better table – low and shaky [1]
2.0 1 Seated operation awkward
1.0 1 "Vote in next column" unclear
1.0 1 Appears to be an entry field at top of column

[1] Different cost/quality tables available

Hart eSlate

Concerns (average severity, number of instances)

5.0 1 Combining summary and cast ballot confuses actual casting
4.0 1 No way to jump to end
4.0 4 Dial slow to learn, hard to use [1]
4.0 1 Red-on-blue text and light fonts hard to read
3.5 2 After reviewing, it's hard to get back to a choice to change it
3.5 2 Blue movement on screen is disconcerting
3.0 1 Cast ballot button didn't accept push – required 3 presses

[1] Compare to subjective/objective data later

Hart eSlate (more)

Concerns (average severity, number of instances)

3.0 1 Poor progress indicator

3.0 1 May confuse with a touch screen

3.0 1 Can't clear entire vote and start over in one step

3.0 1 Write-in screen does not indicate office being voted for

3.0 1 Next/Prev and Dial ambiguous

3.0 1 Auto-forward on select, but not unselect (inconsistent interface)

NEDAP LibertyVote

Concerns (average severity, number of instances)

5.0 2 Write-in message after OK is confusing
5.0 2 No way to confirm/review write-in name
5.0 1 "No vote" light should be a different color (difficult to see what wasn't finished)
5.0 1 No clear way to handle multiple write-ins
5.0 1 Poor feeling of privacy due to size
4.5 2 "Enter write-in" button doesn't seem to work
4.3 3 Under-vote message easy to miss
4.0 3 OK button for write-in too far away
4.0 2 Too much reflection
4.0 1 OK button with 4 arrows is weird
4.0 1 Propositions too far away
4.0 1 Hard to read/access from seated position

NEDAP LibertyVote (more)

Concerns (average severity, number of instances)

4.0 1 Number pad unclear – what is it for?
4.0 1 Blue light coding (voted/unvoted) unclear
4.0 1 "Enlarge" scrollbar un-obvious (to left of little message screen)
4.0 1 Buttons hard to press with poor tactile feedback
4.0 1 Scroll bar to right of message box unclear
3.7 3 Difficult to correct a vote
3.5 2 Write-in area too far away
3.0 1 "Partisan offices" unclear terminology
3.0 1 Can change language accidentally
3.0 1 Same color for race and candidate is unclear
3.0 1 Prefer sequence to "jump around" model of full-face ballot
2.0 1 No second chance to cast vote – review is implicit

NEDAP Actual Ballot

UMD Zoomable

Concerns (average severity, number of instances)

4.0 3 Color of review & cast ballot buttons should be different than progress indicator and selected items

3.0 1 Not clear how to get started

3.0 1 Feels like a game - possibly inappropriate

3.0 1 "Not voted" confusing when multiple choices available

3.0 1 Peripheral races too visually confusing

2.5 2 Progress/navigation buttons partly serve as a progress indicator, but not clearly enough

2.0 1 Overview buttons shouldn't split 4 sub-types

Lab Study

42 participants from Ann Arbor, MI voted on 6 machines
Paid $50 for 1–2 hours
Different random orders for different people (Latin Square design)

Over-selected for potential difficulty:
Most (69%) >= 50 years old
Most (62%) use computers once every 2 weeks or less

Most (30) voted on an office-block ballot
Indicated intention for (fictional) candidates by circling names on a paper form
Study not controlled for prior experience, but Ann Arbor uses optical scan

Data:
Satisfaction ratings reported after voting on each machine
Time measurements
Videotaped interactions
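A cyclic construction is one simple way to generate counterbalanced machine orders of the kind a Latin Square design calls for. This sketch illustrates the general technique only – the `latin_square` helper and the participant assignment are hypothetical, not the study's actual design:

```python
def latin_square(items):
    """Cyclic Latin square: row i is the item list rotated by i, so every
    item appears exactly once in each position across the n orderings.
    (A richer design, e.g. a Williams square, would also balance
    carryover effects between adjacent machines.)"""
    n = len(items)
    return [[items[(i + j) % n] for j in range(n)] for i in range(n)]

machines = ["Diebold", "Liberty", "Avante", "Hart", "Zoomable", "ES&S"]
orders = latin_square(machines)
# Participant k could follow orders[k % len(machines)], so order effects
# (fatigue, learning) are spread evenly across machines.
```

With 42 participants and 6 machines, each of the 6 orderings would be used by exactly 7 people under this assignment.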

Lab Study (more)

Looked at:

Time voters spend reading instructions
Response to paper or on-screen ballot
Response to the reporting of under- or over-voting
Ability to change a vote
Complications and malfunctions of DRE or optical scan readers

Lab Study – Satisfaction Data

Usability studies typically measure: speed, accuracy, satisfaction
We are currently reporting on two (speed, satisfaction)

“The voting system was easy to use”

[Bar chart: average agreement by machine on a 1–7 scale (1 = Strongly Disagree, 7 = Strongly Agree), for Diebold, Liberty, Avante, Hart, Zoomable, and ES&S]

“I felt comfortable using the system”

[Bar chart: average agreement by machine on a 1–7 scale (1 = Strongly Disagree, 7 = Strongly Agree), for Diebold, Liberty, Avante, Hart, Zoomable, and ES&S]

“Correcting my mistakes was easy”

[Bar chart: average agreement by machine on a 1–7 scale (1 = Strongly Disagree, 7 = Strongly Agree), for Diebold, Liberty, Avante, Hart, Zoomable, and ES&S]

“Casting a write-in vote was easy to do”

[Bar chart: average agreement by machine on a 1–7 scale (1 = Strongly Disagree, 7 = Strongly Agree), for Diebold, Liberty, Avante, Hart, Zoomable, and ES&S]

“Changing a vote was easy to do”

[Bar chart: average agreement by machine on a 1–7 scale (1 = Strongly Disagree, 7 = Strongly Agree), for Diebold, Liberty, Avante, Hart, Zoomable, and ES&S]

Lab Study - Time to Cast Ballot

[Bar chart: average time to cast a ballot, in minutes (0–12), by machine – Diebold, Liberty, Avante, Hart, Zoomable, ES&S – showing subjective and objective measures]

Lab Study – Analysis Remains

Why are some machines consistently most preferred and others least preferred?

Detailed coding of video interactions underway
Planned analyses of video interactions:

Tally of problems by machine that do and do not lead to unintended votes cast

Explanation of satisfaction data in terms of voters’ actions

Remember that usability is only one characteristic of overall performance

Accuracy, Accessibility, Affordability, Durability, Security, Transportability, etc.

Future Parts of the Project

Field Test:
Assess usability among a larger, more representative sample
Assess impact of ballot designs on usability issues
Assess accuracy on different systems

Natural Experiments:
Assess impact of voting systems and ballot designs on over-voting, under-voting, straight-party voting, and other measures across jurisdictions and over time
Assess impact of changing from one type of voting system (or ballot) to another

Implications and Reflections

Voter intention is the key goal
Usability is as important as security (and so are accuracy and accessibility, as well as affordability and durability)

Being able to update interface is important (i.e., certification may be interfering with usability)

Ballot/machine combination important (i.e., one size doesn’t fit all)

This talk is available, with vendors' responses, at:

www.capc.umd.edu