Overview of UX Evaluation

Organization in Hartson and Pyla

  • UX Objectives (ch. 10; ch. 22 in 2nd ed.)
  • Introduction to Evaluation (ch. 12; ch. 21 in 2nd ed.)
  • Low-Cost Evaluation Methods (ch. 13; ch. 25 in 2nd ed.)
  • Rigorous Empirical Methodology (ch. 14-16; ch. 22-24 in 2nd ed.)
  • Evaluation Reporting (ch. 17; ch. 26-27 in 2nd ed.)

Major Dimensions for Evaluation Methods

  • Analytical vs. empirical
  • Informal vs. formal
  • Formative vs. summative
  • Qualitative vs. quantitative

Other schemes

  Rosson and Carroll   Lewis and Rieman           Nielsen and Mack    Preece, Rogers and Sharp
  Analytical methods   Evaluating without Users   Formal methods      Predictive / Modeling user's task performance
                                                  Informal methods    Predictive / Asking experts
  Empirical methods    Evaluating with Users      Empirical methods   Usability testing
                                                                      Field studies
                                                  Automatic methods

Preece, Rogers and Sharp (PRS) also describe a "Quick and Dirty" evaluation paradigm, which roughly corresponds to informal empirical methods and to what H&P call Rapid Evaluation Methods.

Discussion Questions

  • What other ways are there for characterizing evaluation methods?