User testing: Reporting your findings

Learn how to best organize your report, see what you should include in the report, and find out what you should do with the report once it's compiled.

By Seth Gordon
2/15/00

If your team took good notes, a lot of data now needs to be sorted and evaluated. Start by reviewing your notes with a highlighter to identify key quotes, recurring comments, and interesting findings. Then spend time determining which findings are most critical. Once key findings are identified and prioritized, it's time to start hammering out your report.

Report Vocabulary
Select the language used to describe the evaluation and the respondents carefully. You'll notice that throughout this article, I've never called the test participants users; they are referred to as respondents. By definition, a user has actually used the site. A representative user is somebody who fits the profile of a person who would actually use the site.

Also, steer clear of drawing concrete conclusions based on observations. When a conclusion is based on an observation and not a spoken response, make that distinction. For example, if a respondent appears frustrated or confused, the report should read "The respondent appeared frustrated," instead of "The respondent was frustrated."

Prepare and Organize the Report
Your goal for the written report is to provide a useful and readable document to the client, not just a thick packet of papers that gets filed away, unread. When writing the report, provide an executive summary, as well as specifics from the evaluation. With discount usability techniques, the goal is to identify usability concerns and potential solutions. Therefore, a majority of the report should focus on usability shortcomings rather than areas that tested well. Make sure the client understands this.

Use the following sections as a guideline when preparing your report:

  1. Overview: This is a high-level overview of the evaluation goals and methodologies used to collect data. It describes the test respondents, the data collection process, the testing facilities, and any considerations that potentially impact the results. Basically, this is the why and how of the test.
  2. Evaluation Highlights: If someone is only going to read one section of the report, this should be it. It includes self-explanatory headings, such as Significant Positive Feedback and Significant Negative Feedback. Also include an Interesting Findings section that discusses unexpected feedback that was especially notable. The highlights section should conclude with a selection of memorable quotes. A good evaluation captures respondent feedback, and the best way to communicate that is through respondent quotes.
  3. Results and Potential Solutions: Often the longest section of the report, this part details the findings of the usability evaluation. Highlight problematic areas through a combination of narrative and screen captures. Screen shots are especially effective for communicating the context of a problem: it's often easy to see the problem on screen but extremely difficult to describe it in words.
  4. Next Steps: Remember, this was only an evaluation. Now it's time to figure out what to do with the results. A usability evaluation does not exist in a vacuum, and the end solutions often reflect a compromise between all site concerns, such as visual design, branding, technology, and resource constraints. Also, use this section to recommend future evaluations.

Every good report also includes appendixes:

  • The moderator's guide references the questions asked to collect the data. This makes it easy to reevaluate the site later using the same test script.
  • Respondent profiles provide background information (demographics, experience level, Internet usage) on the test respondents.
  • Screen shots provide the context for the evaluation. All too often there is no documentation about which version of the site was tested and what it looked like. Screen shots capture the version of the site that was evaluated.

Make yourself available in person or on the phone to answer questions. If your clients don't understand what you gave them, the evaluation is useless. As sad as it sounds, it is not uncommon for the written findings from usability evaluations to remain unread. To make sure the results are disseminated, attempt to present the findings in person. Just like the evaluation, this step doesn't need to be high tech. It might be as simple as walking people through the highlights and findings of the report. This also ensures the client has no excuse for not being informed about the issues, and it provides an opportunity to address the client's questions on the spot.

Learn and Share
Now that all of these insightful findings have been compiled, make sure you share the results with people on your team and in your company. Talk about what works, so those successes can be captured in future projects. Talk about what didn't work, and try to come up with alternative approaches. It's much easier than starting at square one every time you build a site.

Seth Gordon is a frequent contributor to CNET Builder.com. In his spare time, Seth enjoys touring national parks and restoring mechanical antiques.
