Analysis strategies should be informed by your user testing plan, which includes the scope and purpose of the project, testing goals and objectives, and testing methodology. Additionally, consider the intended audience for this analysis: Will you report your findings to your library’s administrative team during a high-stakes meeting? Or are you simply sending an e-mail to your colleagues with quick recommendations? These two audiences are very different and thus require different analysis and communication strategies.
Strategies
- Organize quantitative and qualitative data
- Quantitative data refers to data points such as success and/or error rates, task time to completion, and demographic information. These measures can help determine whether users can function within a given online environment; a minimal sketch of how such data might be summarized appears after this list.
- Qualitative data refers to user testing participants’ commentary about the task or an online element, the facilitator’s observations of the testing session, and the data gathered from open-ended questions.
- Consider completing two levels of reports:
- Quick, initial analysis: conducted immediately after testing concludes to identify obvious patterns or problems.
- In-depth analysis: occurs over multiple weeks to gather and organize data into a thorough report. For this analysis, explain results and recommendations that relate directly to the test’s goals and objectives. Assign a problem severity level to each task (and the accompanying results) and prioritize recommendations. Also, consider putting together a report template that you and/or your team can use whenever you need a more formal approach to sharing your data. Template components can include but are not limited to:
- Summary section (e.g., what you did and why);
- Methodology (e.g., how you did what you did);
- Test results (e.g., what happened when you did what you did);
- Note: Be sure to include both quantitative and qualitative data.
- Recommendations (e.g., given what you did, and what happened during the sessions, what action items — if any — do you recommend?).
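To illustrate one way of organizing the quantitative side of this data, here is a minimal Python sketch. It assumes a hypothetical CSV of session results with columns named participant, task, success, and seconds_to_complete; adjust the names and layout to match your own data collection sheet.

```python
# Minimal sketch (hypothetical column names) for summarizing quantitative
# usability-testing data: per-task success rate and median time to completion.
import csv
from collections import defaultdict
from statistics import median

def summarize(path):
    """Read session results and print a per-task summary.

    Assumes a CSV with columns: participant, task, success (1 or 0),
    seconds_to_complete -- all of which are illustrative assumptions.
    """
    successes = defaultdict(list)
    times = defaultdict(list)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            task = row["task"]
            successes[task].append(int(row["success"]))
            times[task].append(float(row["seconds_to_complete"]))

    for task in sorted(successes):
        rate = 100 * sum(successes[task]) / len(successes[task])
        print(f"{task}: {rate:.0f}% success, "
              f"median {median(times[task]):.0f}s to complete")

if __name__ == "__main__":
    summarize("session_results.csv")  # hypothetical file name
```

A quick summary like this is usually enough for the initial analysis; the in-depth report would pair these numbers with the qualitative observations gathered during each session.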
Tips for writing effective user testing reports
- Summarize and interpret the data. Make observations and connections that tie to specific, achievable actions.
- Discuss how the findings relate to the goals and objectives of the study.
- Develop specific recommendations with clear links to the data and study objectives.
- Possible report structures include organizing around the original goals and objectives, grouping findings by category (navigation, content, etc.), or grouping findings by problem severity.
- Define a time frame for accomplishing each recommendation so that recommendations can be prioritized.
- Discuss areas for further or follow-up study.
Resources
- Communicating User Research Findings (UX Matters): Advice on how to choose reporting formats, with excellent tips on getting the most important points across in a strong, effective way.
- Making Usability Findings Actionable: 5 Tips for Writing Better Reports (Nielsen Norman): Five things to focus on in reporting results to convey the most important changes to be made quickly, and to inform future efforts.
- Usability Test Report (Usability.gov): A downloadable report template in Word format with explanations and examples for each feature of the report.