Simple, cost-effective intranet usability testing

You can do usability tests on your clients' intranets without a big budget. Get advice from the director of research at Nielsen Norman Group on conducting and evaluating intranet usability testing for your clients.


Your client’s internal communications are likely carried out via an intranet, and this often overlooked hub of day-to-day business can quietly hemorrhage money if it is poorly designed. Yet a well-designed, easy-to-use intranet can dramatically increase the efficiency of internal procedures, improve morale, and save your clients some cold, hard cash.

According to a recent intranet usability study published by the Nielsen Norman Group (NNG), when salaries and overhead costs are taken into account, a company with a very poorly designed intranet spends $3,042 per employee annually on the 16 tasks the study measured, compared with $2,069 per employee at the average company and $1,563 per employee at companies with the best intranets.
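
To put those figures in concrete terms, a quick back-of-the-envelope calculation shows what moving between those tiers is worth at scale. A minimal sketch in Python, using the study’s per-employee costs; the 5,000-person headcount is hypothetical:

  # Annual per-employee intranet costs from the NNG study (16 measured tasks).
  COST_WORST = 3042    # very poorly designed intranet
  COST_AVERAGE = 2069  # average intranet
  COST_BEST = 1563     # best-designed intranets

  employees = 5000  # hypothetical headcount

  print(f"Worst to average design: ${(COST_WORST - COST_AVERAGE) * employees:,} saved per year")
  print(f"Average to best design:  ${(COST_AVERAGE - COST_BEST) * employees:,} saved per year")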

Consultants tasked with creating, maintaining, or evaluating an existing intranet may not have the resources to launch a full-scale study like NNG’s. But an intranet usability study that provides very valuable feedback can be conducted quite simply and economically, according to Kara Pernice Coyne, NNG’s director of research. We spoke with Coyne about the value of intranet testing and the best and easiest way to conduct simple usability studies. Here’s the advice she provided.

NNG’s usability study
NNG studied 14 corporate intranets—10 based in the United States, three in London, and one in Hong Kong—using a diverse group of employees from each company as test participants. NNG measured the participants’ ability to successfully complete a number of tasks, the time it took to complete them, and the participants’ level of confidence, satisfaction, or frustration regarding the tasks. The results of the study were recently published in a report, which may be purchased for $248.

According to the study, Coyne said, the major time-wasting traits of most corporate intranets were:
  • Inconsistent design
  • Stale news, or news relevant only to one specific group
  • Poor search capabilities
  • Multiple logins
  • Unnecessary or duplicated features

Troublesome traits of corporate intranets
To find out more about the major time-wasting traits of corporate intranets, read “Five ways to improve your intranet’s usability.”

Coyne said that consultants could duplicate the major steps in NNG’s procedures to conduct a simple, cost-effective usability study for their clients and find out if any of these issues are wreaking havoc on their bottom line.

Choose your participants
Coyne said the first step is to identify people within the client’s company who are already using the intranet. She cautioned that it’s important to get a good mix of employees from different areas of the company.

“It’s really easy to just grab a person that sits near you or works with you or something like that,” she said. “But you don’t want to get people to participate who are just your coworkers or are in the IT group. You want to get people who are on the manufacturing floor, managers, administrative assistants, etc.”

Coyne recommended that you use four participants in each round of usability testing, or that you stay within a range of three to seven participants.

“There’s been a lot of research in this industry that shows that if you start testing more than eight people, you start having diminishing returns,” Coyne said. “You start to see the same issues over and over again and it just seems kind of a waste of time.”
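
The research Coyne refers to is commonly summarized by the Nielsen/Landauer problem-discovery model, in which the share of usability problems found with n testers is 1 - (1 - λ)^n, where λ is the share a single tester uncovers (about 0.31 on average in that published research). A minimal sketch of the curve, assuming that figure:

  # Diminishing returns in usability testing (Nielsen/Landauer model):
  # share of problems found with n users is 1 - (1 - lam)**n,
  # where lam is the share one user uncovers (~0.31 in published data).
  def problems_found(n_users: int, lam: float = 0.31) -> float:
      return 1 - (1 - lam) ** n_users

  for n in range(1, 11):
      print(f"{n:2d} users: {problems_found(n):6.1%} of problems found")

By five users the model predicts roughly 85 percent of problems found, which is why adding an eighth or ninth participant mostly surfaces issues you have already seen.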

Set up the test and the testing environment
Next, Coyne said you should decide on a few test tasks that employees should be able to accomplish using the intranet. For example, in most companies, users should be able to quickly and easily get reimbursement for a business expense.

It’s important when writing the tasks for testers that you don’t provide step-by-step instructions or otherwise lead participants through the activity, Coyne said. For example, you wouldn’t want to write, “Download the expense report form from the Finance area.” Instead, you’d write an anecdotal assignment in natural language, being careful to use as little of the terminology from the intranet as possible. For example, you might write, “You just went on a business trip. Here are five receipts. Get yourself reimbursed.”
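
One lightweight way to keep task wording consistent across participants is to script each task as a plain-language prompt paired with the outcome you will score against. The structure below is just a sketch, not a format NNG prescribes, and the second task is invented for illustration:

  # Hypothetical session script: natural-language prompts (no intranet
  # terminology) plus the outcome the facilitator will score against.
  TASKS = [
      {
          "prompt": ("You just went on a business trip. Here are five "
                     "receipts. Get yourself reimbursed."),
          "target_outcome": ("Expense report found, filled out, and "
                             "submitted to the proper place."),
      },
      {
          # Invented second task, again avoiding the intranet's own labels.
          "prompt": "You need next Friday off. Arrange to take the day off.",
          "target_outcome": "Leave request submitted for approval.",
      },
  ]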

Watch and wait
Test participants in a quiet area. You may choose to test participants at their own desks, Coyne said, but it’s best to test in an office with the door shut so that the testers don’t get distracted.

You should also prepare the person verbally, making sure they understand that it is the intranet being tested, not them.

“Tell them that what they’re doing is something that’s going to help you evaluate the system, and that they are not being evaluated,” Coyne said.

Next, present a task by reading it aloud, then hand the written task to the participant and have them read it aloud as well. Then ask the participants to “tell you what they’re thinking as they work,” Coyne said.

Coyne said the next part of the process is often the most difficult for the administrators of usability testing: Watch and wait.

“What you tend to want to do is ask a lot of questions,” Coyne said. “People think of it as an interview or like a focus group and that’s really not the point.”

Collecting data
Coyne said collecting data about users’ time, success rate, and level of confidence vs. frustration provides a good mix of objective and subjective measures. You’re most likely, however, to get the best information from simply watching how the user attempts to perform the task and listening as they “think out loud” while performing the task.

To collect information about users’ satisfaction with the tasks, Coyne said consultants might mimic NNG’s study by using a written feedback form to collect data. A short written questionnaire could include basic questions about users’ satisfaction, frustration, and confidence level—using a 1 to 5 or 1 to 7 scale—after completing a task.
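
If you would rather capture that questionnaire digitally than on paper, something very small will do. In this sketch the 1 to 5 scale mirrors the measures named above, but the question wording is purely illustrative:

  # Post-task questionnaire on a 1-to-5 scale (question wording illustrative).
  QUESTIONS = [
      "How satisfied are you with how the task went?",
      "How frustrating was the task?",
      "How confident are you that you completed the task correctly?",
  ]

  def ask_questionnaire() -> dict:
      """Prompt for a 1-to-5 rating on each question and return the answers."""
      answers = {}
      for question in QUESTIONS:
          while True:
              raw = input(f"{question} (1-5): ")
              if raw.isdigit() and 1 <= int(raw) <= 5:
                  answers[question] = int(raw)
                  break
              print("Please enter a whole number from 1 to 5.")
      return answers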

Recording timings
Recording the time it takes to complete a task can be tricky, Coyne said, because you’ve asked the user to “think out loud” while completing the task.

“If someone really goes off and is talking for a long time, I suggest you pause the time and then start it up again when they’re actually working,” she said. “If you really want strong numbers, you have to do a more quantitative study, where you don’t ask questions and you don’t have the user talk. But for internal purposes, this is a good way to do it.”
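
Coyne’s pause-the-clock advice translates directly into a small pausable stopwatch. One possible sketch:

  import time

  class TaskTimer:
      """Stopwatch that can be paused while a participant talks at length."""

      def __init__(self):
          self._elapsed = 0.0    # accumulated working time, in seconds
          self._started_at = None

      def start(self):
          # Start (or resume) timing; a no-op if already running.
          if self._started_at is None:
              self._started_at = time.monotonic()

      def pause(self):
          # Stop the clock while the participant digresses.
          if self._started_at is not None:
              self._elapsed += time.monotonic() - self._started_at
              self._started_at = None

      def elapsed(self) -> float:
          # Total working time so far, excluding paused stretches.
          if self._started_at is not None:
              return self._elapsed + (time.monotonic() - self._started_at)
          return self._elapsed

Start it when the participant begins the task, pause it during long digressions, and read elapsed() when they finish.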

Participants should be tested for a maximum of two hours, and preferably one hour, Coyne said. When the users are finished with their tasks or when the maximum allowed time has elapsed, ask them if they have any other comments. Once they’ve provided open-ended feedback, you can then ask them more pointed questions, Coyne said.

Verbal and visual data
“You can ask them if they realized what some terminology meant or things like that,” Coyne said. “But be careful not to say, ‘See this? Do you like it?’”

Most of the time, questions about whether participants “like” a feature or piece of the intranet will yield nothing helpful, she said. Again, the best data is gleaned from simply watching the participants.

Be careful, though: by talking too much, you can miss the most important information. Coyne said a good rule of thumb is that anytime you want to talk to the user or to move them on to the next task, count to ten before you do it.

“Most of the time, I find that right when I’m hitting nine the person says something that I really never would have noticed had I not waited,” Coyne said.

Often, when users are asked if they have any comments about a particular task they say no immediately. But if you wait a moment or two before speaking again, they’re likely to provide more comments, she said.

It may be helpful to use video or computer software to record the actions of the participants for later review. Coyne said that NNG is using Camtasia, a product that records both audio and actions on the screen. But even just recording a test by pointing a video camera at a user can provide valuable information, she said.

Recording success rates
For quantitative data such as time and success rate, you should decide on a rating scale prior to testing. Coyne said NNG often uses a 1 to 4 rating scale to measure success. For example, when attempting to submit an expense report, the ratings might go as follows:
  • 1 point: Found the expense report
  • 2 points: Found it and filled it out
  • 3 points: Found it, filled it out, and submitted it, but to the wrong place
  • 4 points: Found it, filled it out, and submitted it to the proper place
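
Scoring each participant against a scale like this keeps round-over-round comparison simple. A minimal tally, assuming the 1 to 4 scale above; the participant scores are invented:

  # One 1-to-4 success score per participant for a single task (invented).
  scores = [4, 2, 3, 4]

  average_score = sum(scores) / len(scores)
  full_success_rate = sum(s == 4 for s in scores) / len(scores)

  print(f"Average success score: {average_score:.2f} out of 4")
  print(f"Full-success rate:     {full_success_rate:.0%}")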

Test continually and frequently
The NNG intranet study calculated and reported an average time to complete the common tasks tested across all 14 intranets. While the averages provided a basis to compare the intranets tested, Coyne said that each intranet is different and doesn’t necessarily need to fit in with this average.

“Maybe you’ve got something really complex that people do, or maybe yours should be much quicker and easier,” she said. “It’s best to do your own benchmarks and track it over time.”

To get the most from intranet usability testing data, clients should be encouraged to test as often as they can handle making changes based on the data, Coyne said.

“Instead of thinking of it as a big event to do a usability test, think of it as you have a lot of shots at this and you would be doing lots of little tests,” she said. “So there isn’t as much risk involved when you think of it that way and it’s far less daunting.”

Comparing data
The data from your initial test of a client’s intranet can be compared to traditional methods of communication or processes. For example, if you’re trying to find out whether it would be beneficial for a client to put expense reports online, compare the time it takes to find and submit the form via the intranet with the time it takes people to do it using their former method or a competing system.

If any of the times or success rates for the tasks you assigned during the initial testing are unacceptable, you can retest after updates have been made, using the original data as a benchmark. The idea is to compete against yourself to see if your updates and design changes are truly improvements, Coyne said.
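
Comparing a retest against the original benchmark can be as simple as looking at the change in mean completion time per task. A sketch with invented timings:

  # Mean completion time per task, in seconds; all numbers invented.
  benchmark = {"Submit expense report": 410.0, "Find vacation policy": 95.0}
  retest = {"Submit expense report": 250.0, "Find vacation policy": 120.0}

  for task, before in benchmark.items():
      after = retest[task]
      change = (after - before) / before
      print(f"{task}: {before:.0f}s -> {after:.0f}s ({change:+.0%})")

A regression on one task, like the second line here, is exactly the kind of signal that warrants another design pass.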

However, she generally recommended using a different pool of users for each round of testing.

“If you’ve drastically changed the design and at least six months have elapsed, it might be okay to get a person in again that you’ve had before,” she said. “But you don’t want to just use the same pool over and over or you’re not getting enough of the feedback you want from your employees.”

Costs
Coyne said small, frequent usability tests can provide big value at little cost. Beyond the time of administrators and participants and the one-time cost of video equipment or recording software, the investment is minimal for something that can yield important insight and save a great deal of money in employee downtime.

Because the test participants are internal employees, it isn’t necessary to pay them for their time (the going rate for external testers is about $100 per test). Still, Coyne suggested that it’s nice to provide users with tangible thanks for their efforts.

“It’s nice to give them something, like a thank-you note, a T-shirt, mugs, or any kind of little treat you have,” she said.

Are you an expert at intranet usability testing?
Do you have tried-and-true advice for usability testing or want to share a story with the TechRepublic audience? Send us your suggestions or post your comments below.
