Remote Usability Evaluation Methods
These are usability evaluation methods in which "the evaluators are separated in space and/or time from users" (Castillo, Hartson, and Hix, 1998).
Relevant dimensions
- Asynchronous vs. synchronous
- Task completion (e.g. usability test) vs. inspection-oriented
- Data collection method (questionnaire, incident report, data logging, video capture)
- Moderation style: moderated vs. unmoderated (asynchronous methods are necessarily unmoderated, i.e. data collection must be automatic)
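The data-logging and incident-report dimensions can be made concrete with a minimal sketch. The class and event names below are hypothetical, not taken from any specific tool: an asynchronous session logger timestamps user events (clicks, task boundaries, self-reported incidents) so an evaluator separated in time can analyze them later.

```python
import json
import time

class SessionLogger:
    """Minimal asynchronous usability logger: records timestamped events
    so the evaluator can analyze the session after the fact."""

    def __init__(self, session_id):
        self.session_id = session_id
        self.events = []

    def log(self, event_type, **details):
        # Each event carries the session id, a wall-clock timestamp,
        # an event type, and free-form details.
        self.events.append({
            "session": self.session_id,
            "time": time.time(),
            "type": event_type,
            "details": details,
        })

    def export(self):
        # Serialize for upload to the evaluator's collection server.
        return json.dumps(self.events)

# Hypothetical session: one task, one click, one self-reported incident.
logger = SessionLogger("user-042")
logger.log("task_start", task="checkout")
logger.log("click", target="pay_button")
logger.log("incident", severity="high", note="button label unclear")
logger.log("task_end", task="checkout", completed=True)
print(len(logger.events))  # 4 events recorded
```

The "incident" event corresponds to the user-reported critical incidents studied by Castillo, Hartson, and Hix; whether users can reliably self-diagnose such incidents is one of the open issues listed below.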
Common variants and tools
- Moderated remote usability test with live video capture
- Directed surveys linked to prototypes
- UserZoom --- unmoderated testing that collects task completion rates, clickstreams, and comments
- UserTesting.com --- online, unmoderated testing; the service recruits the users
Related methods and tools
- Web analytics (e.g. Google Analytics)
- A/B design comparisons in the wild (e.g. Optimizely)
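A/B comparisons in the wild depend on stable, deterministic assignment of each visitor to a design variant, so that a returning user always sees the same version and the two groups can be compared fairly. A common approach is to hash the user and experiment identifiers; the sketch below uses hypothetical names and is not Optimizely's actual API.

```python
import hashlib

def assign_variant(user_id, experiment, variants=("A", "B")):
    """Deterministically bucket a user into one design variant.

    Hashing a combined key means the same user always gets the same
    variant within an experiment, while different experiments split
    users independently.
    """
    key = f"{experiment}:{user_id}".encode()
    digest = hashlib.sha256(key).hexdigest()
    bucket = int(digest, 16) % len(variants)
    return variants[bucket]

# The same user is always assigned the same variant:
print(assign_variant("user-042", "checkout-redesign"))
print(assign_variant("user-042", "checkout-redesign"))
```

Note that such tools measure behavioral outcomes (conversions, clicks) on live sites; as the issues below ask, that is not the same as evaluating usability directly.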
Issues
- Quality of remote testing
- Can users self-diagnose problems and report them?
- Is remote evaluation of desktop apps feasible?
- Do tools that work on live sites evaluate usability?
Andreasen et al. (2007) address some of these questions with an empirical study of remote usability testing methods.