Evaluation: usability studies, A/B testing, quantitative and qualitative evaluation, cybersecurity case study

Quantitative and Qualitative Evaluation

Cognitive Walkthrough

Requirements:

  • Description or prototype of the interface
  • Task description
  • List of actions to complete the task
  • User background

What you look for (e.g., in a mobile gesture prototype):

  • Will users know to perform the action?
  • Will users see the control?
  • Will users know the control does what they want?
  • Will users understand the feedback?
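
A cognitive walkthrough can be run with little more than a checklist. Below is a minimal sketch of stepping through a task's action list and recording the evaluator's answers to the four questions; the task, actions, and function names are invented for illustration.

```python
# Minimal sketch: step through a task's actions, asking the evaluator each
# walkthrough question and collecting potential problems. Names are illustrative.

WALKTHROUGH_QUESTIONS = [
    "Will users know to perform the action?",
    "Will users see the control?",
    "Will users know the control does what they want?",
    "Will users understand the feedback?",
]

def walk_through(task, actions):
    """Ask the evaluator each question at each step; collect potential problems."""
    findings = []
    for step, action in enumerate(actions, start=1):
        for question in WALKTHROUGH_QUESTIONS:
            answer = input(f"[{task}] step {step} ({action}): {question} [y/n] ")
            if answer.strip().lower() != "y":
                findings.append((step, action, question))
    return findings

# Example run on a hypothetical mobile gesture prototype.
for step, action, question in walk_through(
    task="Share a photo",
    actions=["open gallery", "long-press photo", "tap share icon", "pick contact"],
):
    print(f"Potential problem at step {step} ('{action}'): {question}")
```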

Heuristic Analysis

  • Follow ‘rules of thumb’ or suggestions about good design.
  • Can be done by experts/designers, fast and easy.
  • May miss problems users would catch.

Nielsen’s Heuristics

  • Simple and natural dialog
  • Speak the users’ language
  • Minimize user memory load
  • Consistency
  • Feedback
  • Clearly marked exits
  • Shortcuts
  • Prevent errors
  • Good error messages
  • Help and documentation
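
Heuristic findings are usually logged with a severity rating so the worst problems get fixed first. A minimal sketch follows, assuming Nielsen's conventional 0-4 severity scale; the record structure and the example findings are invented.

```python
# Minimal sketch: logging heuristic-evaluation findings with severity ratings.
# The 0-4 severity scale follows Nielsen's convention; everything else is illustrative.

from dataclasses import dataclass

NIELSEN_HEURISTICS = [
    "Simple and natural dialog", "Speak the users' language",
    "Minimize user memory load", "Consistency", "Feedback",
    "Clearly marked exits", "Shortcuts", "Prevent errors",
    "Good error messages", "Help and documentation",
]

@dataclass
class Finding:
    heuristic: str   # which rule of thumb is violated
    location: str    # where in the interface
    severity: int    # 0 (not a problem) .. 4 (usability catastrophe)
    note: str

findings = [
    Finding("Feedback", "file-upload dialog", 3,
            "No progress indicator; users cannot tell if the upload started."),
    Finding("Clearly marked exits", "settings wizard", 2,
            "No obvious way to cancel without saving."),
]

# Triage: most severe problems first.
for f in sorted(findings, key=lambda f: -f.severity):
    print(f"[sev {f.severity}] {f.heuristic} @ {f.location}: {f.note}")
```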

Personas

  • A fictitious user representing a class of users
  • Reference point for design and analysis
  • Has a goal or goals they want to accomplish (in general or in the system)
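
A persona can be as simple as a named record that design discussions refer back to. The sketch below assumes nothing beyond the bullets above; the fields and the example persona are made up.

```python
# Minimal sketch: a persona as a plain record used as a reference point for
# design and analysis. The fields and the example persona are invented.

from dataclasses import dataclass, field

@dataclass
class Persona:
    name: str
    background: str
    goals: list = field(default_factory=list)         # what they want to accomplish
    frustrations: list = field(default_factory=list)

maya = Persona(
    name="Maya",
    background="Freelance photographer, moderate computer experience",
    goals=["Deliver photo albums to clients quickly",
           "Keep client files private"],
    frustrations=["Multi-step sharing flows", "Jargon-heavy security prompts"],
)

# During analysis, ask of each design decision: does it help Maya meet her goals?
print(f"Designing for {maya.name}: {', '.join(maya.goals)}")
```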

Running Controlled Experiments

  • State a lucid, testable hypothesis.
  • Identify independent and dependent variables
  • Design the experimental protocol
  • Choose the user population
  • Run some pilot participants
  • Fix the experimental protocol
  • Run the experiment
  • Perform statistical analysis
  • Draw conclusions
  • Communicate results
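
One recurring protocol detail is assigning participants to conditions. Below is a minimal sketch of a balanced, reproducible between-subjects assignment; the seed, pool size, and condition names are arbitrary choices for illustration.

```python
# Minimal sketch: randomly assigning recruited participants to conditions,
# with a fixed seed so the assignment is reproducible. Illustrative only.

import random

participants = [f"P{i:02d}" for i in range(1, 13)]   # 12 recruited users
conditions = ["control", "treatment"]                # levels of the independent variable

rng = random.Random(42)          # fixed seed: reruns give the same assignment
rng.shuffle(participants)

# Balanced between-subjects split: every other participant per condition.
assignment = {
    cond: participants[i::len(conditions)]
    for i, cond in enumerate(conditions)
}
for cond, group in assignment.items():
    print(cond, group)
```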

Analysis

  • Statistical comparison (e.g., t-test)
  • Report results
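
For example, comparing task-completion times across two conditions with an independent-samples t-test (here via scipy.stats; the data are made up):

```python
# Minimal sketch of the statistical comparison step: an independent-samples
# t-test on task-completion times. The numbers are invented for illustration.

from scipy import stats

# Seconds to complete the task under each condition (dependent variable).
control = [48.2, 51.9, 45.0, 60.3, 55.1, 49.8]
treatment = [39.5, 42.1, 37.8, 44.9, 41.2, 38.6]

t_stat, p_value = stats.ttest_ind(control, treatment)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")

# Report with means, e.g.: "Participants were faster with the new design
# (M = 40.7 s) than the old design (M = 51.7 s), t(10) = ..., p < .05."
```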

Usability Studies

Testing Usability of Security

  • Security is rarely the task users set out to accomplish.
  • Good security is a seamless part of the task.

Usability Study Process

  • Define tasks (and their importance)
  • Develop pre- and post-test questionnaires

Selecting Tasks

  • Choose representative tasks that reflect the most important things a user would do with the interface.
  • Present each one as a task, not a question.
  • Be specific.
  • Don't give instructions.
  • Don't be vague or pad the list with tiny, insignificant tasks.
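
Written down, a task list might look like the sketch below: concrete scenarios paired with an importance weight, so the most important tasks run first if a session runs short. The tasks and weights are invented examples.

```python
# Minimal sketch: representative tasks as concrete scenarios with importance
# weights, per "define tasks (and their importance)". Examples are invented.

tasks = [
    # (importance 1-5, task exactly as given to the participant)
    (5, "You took a photo of your dog yesterday. Send it to your friend Alex."),
    (4, "Find out how much storage space you have left."),
    (2, "Change the app's notification sound."),
]

# Run the most important tasks first, in case a session runs short.
for importance, prompt in sorted(tasks, reverse=True):
    print(f"[{importance}] {prompt}")
```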

Security Tasks

  • Security is almost never a task in itself, so embed security-relevant steps inside realistic primary tasks.

Pre-Test Questionnaires

  • Learn any relevant background about the subjects:
  • Age, gender, education level, experience with the web, experience with this type of website, experience with this site in particular.
  • Perhaps more specific questions based on the site, e.g., color blindness, or whether the user has children.

Post-Test Questionnaires

  • Have users provide feedback on the interface.
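
These notes don't prescribe an instrument, but one common post-test questionnaire is the System Usability Scale (SUS): ten 1-5 Likert items scored to a 0-100 value. A sketch of the standard scoring rule, with an invented set of responses:

```python
# Minimal sketch: scoring a SUS post-test questionnaire. Standard rule: odd
# items score (response - 1), even items (5 - response); sum scaled by 2.5.

def sus_score(responses):
    """Ten 1-5 Likert responses -> a 0-100 usability score."""
    assert len(responses) == 10 and all(1 <= r <= 5 for r in responses)
    total = sum(
        (r - 1) if i % 2 == 0 else (5 - r)   # items 1,3,5,... are positively worded
        for i, r in enumerate(responses)
    )
    return total * 2.5

print(sus_score([4, 2, 5, 1, 4, 2, 5, 2, 4, 1]))  # one participant's answers -> 85.0
```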

Evaluation

  • Users are given a list of tasks and asked to perform each task.
  • Interaction with the user is governed by different protocols.

Observation Methods

  • Silent Observer: watch the user work without intervening.
  • Think Aloud: the user narrates what they are thinking and trying to do as they work.
  • Constructive Interaction: two users work through the tasks together, talking to each other naturally.

Interview

  • Ask users to give you feedback
  • Easier for the user than writing it down
  • They will tell you things you never thought to ask

Reporting

  • After the evaluation, report your results
  • Summarize the experiences of users
  • Emphasize your insights with specific examples or quotes
  • Offer suggestions for improvement for tasks that were difficult to perform

A/B Testing

  • Requires no cognitive or psychological understanding or model of user behavior.
  • You give two options, A or B, and measure how they perform.

How to Run A/B Test

  • Start with a small percentage of visitors trying the experimental conditions.
  • Automatically stop testing if any condition has very bad performance.
  • Let people consistently see the same variation so they don’t get confused.
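
Consistent assignment is usually done by hashing a stable user ID into a bucket, so the same visitor always lands in the same variation and the experimental share can be ramped up gradually. A minimal sketch; the rollout percentage and user IDs are illustrative.

```python
# Minimal sketch: deterministic A/B bucketing. Hashing a stable user ID means
# the same visitor always sees the same variation across visits.

import hashlib

ROLLOUT_PERCENT = 10   # start small: only 10% of visitors see variant B

def assign_variant(user_id: str) -> str:
    # Stable hash of the user ID -> a bucket in [0, 100).
    digest = hashlib.sha256(user_id.encode()).hexdigest()
    bucket = int(digest, 16) % 100
    return "B" if bucket < ROLLOUT_PERCENT else "A"

# Reruns print the same assignments; raising ROLLOUT_PERCENT only moves
# additional users into B without reshuffling existing ones.
for uid in ["user-1001", "user-1002", "user-1003"]:
    print(uid, assign_variant(uid))
```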