
Page 1: Heuristic Evaluation

Prof. James A. Landay
Computer Science Department
Stanford University

CS 147, Autumn 2017

dt+UX: Design Thinking for User Experience Design + Prototyping + Evaluation

刘哲明

Heuristic Evaluation

November 1, 2017

Page 2: Heuristic Evaluation

Hall of Fame or Shame?

LG F7100 (courtesy of Genevieve Bell, Intel)

Launched in 2004 in UAE, Saudi Arabia, North Africa, India, Malaysia

Page 3: Heuristic Evaluation

Hall of Fame!

Good
– targeted at a Muslim audience
– need to pray 5x/day pointing towards Mecca

LG F7100 (courtesy of Genevieve Bell, Intel)

Launched in 2004 in UAE, Saudi Arabia, North Africa, India, Malaysia

Page 4: Heuristic Evaluation

Hall of Fame or Shame?

WhatsApp mobile messaging

Page 5: Heuristic Evaluation

Hall of Fame!

WhatsApp mobile messaging

Good
– do not have to pay for expensive SMS
– can contact people on other phones
– works on “feature” phones → led to rapid take-off in the developing world

Page 6: Heuristic Evaluation

Prof. James A. Landay
Computer Science Department
Stanford University

CS 147, Autumn 2017

dt+UX: Design Thinking for User Experience Design + Prototyping + Evaluation

刘哲明

Heuristic Evaluation

November 1, 2017

Page 7: Heuristic Evaluation

Outline

• Wizard of Oz
• Heuristic Evaluation Overview
• The Heuristics
• Exercise
• Team Break

Page 8: Heuristic Evaluation

Wizard of Oz Technique

• Faking the interaction. Where does the name come from?
– the film “The Wizard of Oz”
• “the man behind the curtain”

• Long tradition in the computer industry
– e.g., prototype of a PC with a DEC VAX behind the curtain

Page 9: Heuristic Evaluation

Wizard of Oz Technique

• Faking the interaction. Where does the name come from?
– the film “The Wizard of Oz”
• “the man behind the curtain”

• Long tradition in the computer industry
– e.g., prototype of a PC with a DEC VAX behind the curtain

• Much more important for hard-to-implement features
– speech & handwriting recognition
– how would we do it for VR/AR?

Page 10: Heuristic Evaluation

Wizard of Oz Technique

• Faking the interaction. Where does the name come from?
– the film “The Wizard of Oz”
• “the man behind the curtain”

• Long tradition in the computer industry
– e.g., prototype of a PC with a DEC VAX behind the curtain

• Much more important for hard-to-implement features
– speech & handwriting recognition
– how would we do it for VR/AR? (see the sketch below)

CarbonShopper
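
A minimal sketch of how a Wizard-of-Oz prototype can fake a hard-to-implement feature such as speech recognition: the participant talks to what looks like a working system, while a hidden human operator (the “wizard”) types what the system should pretend to have recognized. The names here (WizardOfOzRecognizer, etc.) are illustrative assumptions, not code from the lecture or from CarbonShopper.

    # Minimal Wizard-of-Oz sketch (illustrative only): the "speech recognizer"
    # is really a hidden human operator typing the result.

    class WizardOfOzRecognizer:
        """Looks like a speech recognizer; a human wizard supplies the result."""

        def recognize(self, participant_utterance: str) -> str:
            # In a real session the wizard hears the participant; here we
            # simulate that by echoing the utterance to the wizard's console.
            print(f"[wizard console] participant said: {participant_utterance!r}")
            return input("[wizard console] type the 'recognized' text: ")

    if __name__ == "__main__":
        recognizer = WizardOfOzRecognizer()
        text = recognizer.recognize("navigate to the nearest gas station")
        print(f"System response: searching for {text!r}")  # what the participant sees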

Page 11: Heuristic Evaluation

Evaluation

• About figuring out how to improve design
• Issues with lo-fi tests?

Page 12: Heuristic Evaluation

Evaluation

• About figuring out how to improve design
• Issues with lo-fi tests?

Not realistic
– visuals & performance

Not on actual interface
– can’t test alone

Need participants
– can be hard to find repeatedly

Page 13: Heuristic Evaluation

Heuristic Evaluation

• Developed by Jakob Nielsen

• Helps find usability problems in a UI design

• Small set (3-5) of evaluators examine the UI
– independently check for compliance with usability principles (“heuristics”)
– evaluators only communicate afterwards
• findings are then aggregated
– use violations to redesign/fix problems

• Can perform on a working UI or on sketches (see the sketch below)
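
As a rough illustration of the process above (independent passes, findings aggregated only afterwards), one might log each evaluator’s findings in a simple record and group duplicate reports once everyone is done. The field names and example findings below are assumptions for illustration, not a format prescribed in the lecture.

    # Sketch (assumed record format): each evaluator logs findings independently;
    # results are aggregated only after all passes are finished.

    from collections import defaultdict
    from dataclasses import dataclass

    @dataclass
    class Finding:
        evaluator: str     # who found it
        heuristic: str     # e.g. "H4 Consistency & standards"
        description: str   # what is violated, and where
        severity: int      # 0-4, rated independently (see the severity scale below)

    findings = [
        Finding("A", "H4 Consistency & standards", "'Save' vs 'Store' on two screens", 3),
        Finding("B", "H4 Consistency & standards", "'Save' vs 'Store' on two screens", 2),
        Finding("C", "H5 Error prevention", "Quantity field accepts text", 3),
    ]

    # Aggregate afterwards: group duplicate reports of the same problem.
    grouped = defaultdict(list)
    for f in findings:
        grouped[(f.heuristic, f.description)].append(f)

    for (heuristic, desc), reports in grouped.items():
        print(f"{heuristic} | {desc} | reported by {len(reports)} of 3 evaluators")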

Page 14: Heuristic Evaluation

Why Multiple Evaluators?

• No single evaluator finds every problem

• Good evaluators find both easy & hard ones

Page 15: Heuristic Evaluation

Heuristics

H1: Visibility of system status
H2: Match between system & real world
H3: User control & freedom

Page 16: Heuristic Evaluation

Heuristics (cont.)

H4: Consistency & standards
H5: Error prevention
H6: Recognition rather than recall
H7: Flexibility and efficiency of use
H8: Aesthetic & minimalist design

Page 17: Heuristic Evaluation

Heuristics (cont.)

H9: Help users recognize, diagnose, & recover from errors

[Screenshot: example of a bad error message]

Page 18: Heuristic Evaluation

Heuristics (cont.)

[Screenshot: example of a good error message]

Page 19: Heuristic Evaluation

Good Error Messages

• Clearly indicate what has gone wrong
• Human readable
• Polite
• Describe the problem
• Explain how to fix it
• Highly noticeable (see the example below)
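
To make these qualities concrete, here is a small hypothetical contrast between a terse error and one that names the problem and how to fix it. The wording and the validate_quantity helper are ours, not from the slides.

    # Hypothetical contrast: the same validation failure reported two ways.

    def validate_quantity(raw: str) -> int:
        if raw.isdigit() and int(raw) > 0:
            return int(raw)
        # Bad: terse, cryptic, offers no fix:
        #     raise ValueError("Error 4005")
        # Better: human readable, polite, describes the problem, explains the fix.
        raise ValueError(
            f"Sorry, '{raw}' isn't a valid quantity. "
            "Please enter a whole number greater than zero, e.g. 2."
        )

    try:
        validate_quantity("two")
    except ValueError as err:
        print(err)  # in a real UI this would also need to be highly noticeable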

Page 20: Heuristic Evaluation

H10: Help & Documentation

• Better if the system can be used without documentation, but it may be necessary

• How
– easy to search
– focused on the task
– list concrete steps

http://blog.screensteps.com/10-examples-of-great-end-user-documentation

Page 21: Heuristic Evaluation

Heuristic Violation Examples

1. [H6 Recognition Rather Than Recall]
Can’t copy info from one window to another
– user needs to memorize the data & retype
– fix: allow copying

2. [H4 Consistency and Standards]
Typography uses different fonts in 3 dialog boxes
– slows users down
– probably wouldn’t be found by user testing
– fix: pick a single format for the entire interface

Page 22: Heuristic Evaluation

Severity Ratings

0 - don’t agree that this is a usability problem
1 - cosmetic problem
2 - minor usability problem
3 - major usability problem; important to fix
4 - usability catastrophe; imperative to fix

Page 23: Heuristic Evaluation

Severity Ratings Example

1. [H4 Consistency & Standards] [Severity 3]

The interface used the string “Save” on the first screen for saving the user’s settings, but used the string “Store” on the second screen. Users may be confused by this different terminology for the same function.

Fix: Use “Save” everywhere in the application.
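
Since severity is rated independently and the team only later comes to agreement (see the summary on Page 25), a simple first step is to look at each problem’s rating spread before discussing it. The median below is just one reasonable default, not a method prescribed in the lecture; the ratings are made up.

    # Sketch: summarize independent severity ratings (0-4 scale from the slide above)
    # before the group discusses and agrees on a final value.

    from statistics import median

    ratings = {
        "H4: 'Save' vs 'Store' wording": [3, 3, 2],
        "H5: quantity field accepts text": [4, 3, 3],
    }

    for problem, scores in ratings.items():
        print(f"{problem}: median {median(scores)}, range {min(scores)}-{max(scores)}")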

Page 24: Heuristic Evaluation

Decreasing Returns

[Graphs: problems found, and benefits/cost ratio, vs. number of evaluators]

* Caveat: graphs are for a specific example
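
The shape of these curves is commonly described with Nielsen and Landauer’s model, in which the proportion of problems found by i evaluators is 1 − (1 − λ)^i for a per-evaluator detection rate λ (roughly 0.3 in their published data). The sketch below just prints that diminishing-returns curve under an assumed λ; it is not the specific example graphed on the slide.

    # Illustrative diminishing-returns curve, assuming lambda = 0.31 (roughly the
    # average per-evaluator detection rate Nielsen & Landauer report).

    DETECTION_RATE = 0.31  # probability that one evaluator finds a given problem

    for i in range(1, 11):
        found = 1 - (1 - DETECTION_RATE) ** i
        print(f"{i:2d} evaluator(s): ~{found:.0%} of problems found")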

Page 25: Heuristic Evaluation

Heuristic Evaluation Summary

• Have evaluators go through the UI twice
• Ask them to see if it complies with the heuristics
– note where it doesn’t & say why
• Have evaluators independently rate severity
• Combine the findings from 3 to 5 evaluators
– come to agreement on problems, fixes & severity

• Alternate with user testing

Page 26: Heuristic Evaluation

EXERCISE

Page 27: Heuristic Evaluation

EXERCISE

Page 28: Heuristic Evaluation

http://hci.stanford.edu/courses/cs147/2017/au/assignments/simple-heuristic-evaluation.pdf
https://goo.gl/vkhHVy

Find 12-15 Heuristic Violations

Page 29: Heuristic Evaluation

Problems Found

1. H3 – no purchase button [100]
2. H4 – remove column has check boxes and then one entry w/ yes/no [100]
3. H5 – illegal input (text) allowed in quantity field [90]
4. H1 – not clear who is logged in [10]
5. H2 – “what fits my car” is not a good term that people would know [20]
6. H5 – user can add “out of stock” items [80]

Page 30: Heuristic Evaluation

Problems Found Last Year

1. H5 Error Prevention
Allows non-numeric data in the quantity field. Fix: don’t allow it. [90]

2. H5 Error Prevention
Quantity field doesn’t multiply by the price to give a correct total. Fix: make it work. [55]

3. H10 Help & Documentation
Link for international visitors is hidden at the bottom & may not be readable by non-English speakers. Fix: move it up to a prominent location & include flags? [30]

4. H5 Error Prevention
“Remove item bolded in red”, but red is used for multiple purposes. Fix: get rid of ads in the checkout! More direct way to remove an out-of-stock item, or don’t even let me add an item that is out of stock. [100]

5. H5 Error Prevention
No way to check out. [110]

Page 31: Heuristic Evaluation

Problems Found Two Years Ago

1. H4 Consistency
Remove column: 4th item is different, with checkboxes. [150]

2. H5 Error Prevention
Non-numeric data allowed in the quantity field. Do not allow it. [125]

3. H2 Match Between System & Real World
Vehicle selection link is not language I’d expect. [100]

4. H1 Visibility of System Status
Unclear which item to remove based on the error message (“red/bold”). [150]

Page 32: Heuristic Evaluation

Further Reading: Heuristic Evaluation

• Longer lecture
– https://drive.google.com/file/d/0BweiB6wu4sBaN2tfZGxKb2tuOTg/view

• Books
– Usability Engineering, by Nielsen, 1994

• Web site
– http://www.nngroup.com/articles/