It’s Final: Usability Testing is Not POR

Though Mother Nature says otherwise today, the sun is shining across the National Capital Region.

Word came this week that the Treasury Board Secretariat has released guidance on the definitions of consultation and citizen engagement, public opinion research (POR), stakeholder interviews and usability testing. The part of this guidance that should have web teams and usability practitioners tap dancing in the rain states:

Usability testing (test de facilité d’emploi) – technique for evaluating the speed, accuracy and confidence with which users can complete tasks on an existing or proposed website, application, system, form, or product. Usability tests can reveal problems with design and functionality. Specifically, usability tests can uncover navigation issues, and assess the impact of the content, layout and structure on users’ ability to complete critical tasks. In addition, usability testing can provide feedback about the overall user experience. While usability tests can have different formats, such as moderated or un-moderated, they generally include a set of predetermined tasks that participants complete, often while narrating their thought process. Data such as completion and error rates, time spent on a task and confidence ratings are collected by an observer or remotely through a web-based tool.

Past usability testing efforts in government were sometimes cast as POR, often to the detriment of strategic and operational business outcomes and of the satisfaction of the people trying to use government services. With this new guidance, the Government of Canada firmly asserts the value of testing content, applications and systems to ensure they actually help people complete the tasks they set out to do.

There is a wrinkle in the definition, however, that could create a bottleneck for testing efforts. The guidance cautions that usability testing is not POR “as long as the moderator does not engage the participant and does not ask questions or solicit comments.” The participant may narrate their experience, but if their statements are indecipherable or vague, a strict reading of the guidance would prohibit moderators from seeking clarity on the participant’s remarks. I don’t think this was the intention of the guidance, but it does create a gray area. Part of a moderator’s role during a usability test is to understand the participant’s behaviour, and sometimes this means asking what a person meant when they said “I’m confused” while trying to choose between two menu links. The confusion isn’t with the quality or nature of the service being offered, but with the labels used to describe it. In this context, it would be perfectly appropriate to ask the participant to describe their confusion. We may learn that the menu labels are too similar, too vague or too narrow. That kind of data is pure gold for web teams looking to improve the user experience.

So how should web teams handle this wrinkle? Set up a meeting with the POR team and walk them through the kinds of questions typically asked during a session. The POR team can steer the testers away from any questions that stray into POR territory, providing assurance that the testing complies with policy. Invite the POR team to observe testing sessions as silent observers, then gather their feedback in a follow-up discussion to refine the methodology.

Another logical step would be for web teams and Communications Advisors to discuss the implications for their projects and user research planning, and then to reach out to web application development teams to determine a course of action for online applications.

This is a long-awaited and positive direction for the delivery of online government services. We look forward to seeing the impact this will have on the GC’s management of its digital presence and service to Canadians.


Denise Eisner is a senior-level web strategist and communications specialist with a passion for creating enhanced user experiences. As a member of the Government Service Excellence practice, Denise’s experience and specializations include web strategy development, information architecture, web analytics (WebTrends and Google Analytics) and web project management. She has led large-scale content audits, developed performance measurement frameworks, and coordinated site updates to meet Treasury Board policies, standards and guidelines. Engaged in the evolving spheres of information technology, corporate communications and media for almost two decades, Denise has transformed business objectives into web strategies and information architectures for corporate and government clients in the U.S. and Canada.

