UPA 2011: A Renovation in Buildings and Ideas

Jeff Sauro, Principal Usability Engineer


The Usability Professionals Association (UPA) annual conference was held this year at the Hyatt Regency in Atlanta, a landmark in that city since the 1960s. The famed hotel on Peachtree Street is in the midst of a renovation. Between and sometimes during presentations, the sounds of jackhammering, banging, and sawing could be heard over the changing of Microsoft PowerPoint slides.

While at times inconvenient (bathrooms and escalators were largely inaccessible), these hiccups were good reminders of a profession also going through a renovation, or perhaps a renaissance. Many of the presentations harked back to an earlier, classical period in the usability profession.

While the early days of usability were largely similar to its parent discipline of human factors and ergonomics, over time, questions about sample sizes, disagreements between evaluators, and the rise of discount usability methods moved the field away from a quantitative focus.

This year, I saw in the presentations a return to many of the fundamental methods and ideas that make usability both an interesting and an effective discipline.

  1. Quantifying the User Experience: This year, the emphasis was clearly on quantitative methods (for example, “Quantifying the Customer Experience,” “Practical Statistics for User Research,” and “Quantifying the User Experience: Five essential concepts and controversies”). A theme throughout many presentations was quantifying the impact of design using statistics and the general need to put numbers around what we do. Perhaps this emphasis was because of the economy (and the need to continually justify costs), but I suspect it also speaks to the maturing of the field.
  2. Cognitive Modeling: Techniques that were developed in the late ‘70s were also the focus of two presentations (“Effective cognitive modeling for business software” and “Usability's Next Top Model: Keystroke Level Modeling”). Cognitive modeling involves estimating the time that it takes skilled users to complete error-free tasks on applications. This technique uses standardized task times, refined over the decades from actual users. Cognitive modeling is not a replacement for usability testing; instead, cognitive modeling provides a cost-effective way to compare the relative productivity of experienced users between interfaces.
  3. Comparative Usability Evaluation: Ask independent teams of usability professionals to evaluate the same interface, and they will point out many unique problems. This diversity speaks to the sheer volume of usability problems in an interface, the different perspectives of evaluators, and the fuzzy nature of what defines a usability problem. This year saw the ninth incarnation of the Comparative Usability Evaluation (CUE-9), hosted by Rolf Molich (a co-creator of heuristic evaluation). Nineteen independent usability professionals watched videos of the same users attempting tasks on a website, and again, the teams generated lists of largely unique problems. The data are still being analyzed, but instead of viewing this type of study as harmful to the profession, I believe many practitioners now see it as an important part of improving our methods and effectiveness.
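The Keystroke-Level Modeling approach described in item 2 can be sketched in a few lines: break an error-free expert task into primitive operators and sum their standardized times. The sketch below is illustrative only; the operator values are the classic published estimates from Card, Moran, and Newell, not figures from the conference presentations, and the example task is hypothetical.

```python
# Illustrative Keystroke-Level Model (KLM) sketch.
# Operator times (seconds) are the classic published estimates,
# not values taken from the UPA presentations.
KLM_OPERATORS = {
    "K": 0.20,  # keystroke (skilled typist)
    "P": 1.10,  # point at a target with the mouse
    "B": 0.10,  # mouse button press or release
    "H": 0.40,  # home hands between keyboard and mouse
    "M": 1.35,  # mental preparation
}

def klm_estimate(sequence):
    """Sum standardized operator times for an error-free expert task."""
    return sum(KLM_OPERATORS[op] for op in sequence)

# Hypothetical task: mentally prepare, point to a field, click,
# move hands to the keyboard, then type a 5-character code.
task = ["M", "P", "B", "B", "H"] + ["K"] * 5
print(round(klm_estimate(task), 2))  # total predicted seconds
```

Because the operator times were calibrated on skilled users, the totals are best used to compare the relative productivity of alternative designs rather than to predict any one user's actual time.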
While the Atlanta hotel will have completed its remodel by the end of this year, this year’s UPA signals a renovation in methods and rigor that I suspect will continue for many years.

The Oracle Applications User Experience (OAUX) Usable Apps in the Cloud blog.
