Rick O’Brien is the owner of Ontario-based Business Data Solutions. This is his first article for TechRepublic.
A few years ago, the manager of a commercial property management company asked me to automate a series of lease documents in Microsoft Word to duplicate the WordPerfect 5.1 macros that had been in use for years. When I suggested we set up interviews with the end users to gather their ideas and preferences, he said that wouldn’t be necessary. The new macros simply had to work “exactly like the old ones.”
I had already taken my own informal poll of the office staff while waiting for my appointment with the manager. They all questioned why they had to learn new software just to keep doing things exactly as before.
I turned down the job as politely as possible. Without the support and cooperation of the people who would be using it, the new automated documents project was doomed before it even started.
User involvement is invaluable during the requirements-gathering phase of a project. Users’ opinions and preferences help shape the look and flow of an application during the design and coding phases. And user training is easier when the application is based on workflow models and scenarios developed with the help of users. In the end, user acceptance helps guarantee the successful adoption of the finished application.
You can gather user input during application development in a number of ways: surveys (usually for large organizations); on-site workflow observation; one-on-one user interviews; focus-group sessions with multiple users; or some combination of these approaches.
I’ll concentrate on the one-on-one user interview and how to get the most value from the limited time you have with each interviewee.
Gathering user input
In a small organization, you may be able to spend time with every end user of the new application. In a large organization, this is impractical, and you’ll have to settle for a cross section of users. In either case, the clock is ticking for both you and the employee, so you must maximize the quality of the data you gather.
Be prepared
An interview is not a conversation; it has structure. While a survey makes use of yes/no or multiple-choice questions, an interview is more open and must grow out of a set of prepared and insightful questions. Working from a list of questions will allow you to demonstrate progress to the user, as in: “We are almost done now; just two more questions.” Working from a list will also let you skip more difficult questions until later in the interview and will help you recap near the end.
The interview can be a nerve-racking experience for users, so putting them at ease should be one of your prime concerns. Halfway through one interview with a user who had been described to me as having very few computer skills, my interviewee suddenly blurted out: “Why are you asking all these questions, anyway?” I assured her that her input was valuable and would be used to tailor the new software to her needs.
Who is the expert?
One way to put users at ease during the interview is to put your own ego aside and acknowledge the fact that they are the experts on the tasks and processes that make up their jobs. You are there to learn from them—not the other way around. You are going to use their experience and suggestions to design a software application that will make their jobs easier and remove barriers to productivity. The users will not only know the most about their current tasks but will also be able to tell you about the workarounds they rely on when things go wrong. You are, in effect, asking them: “How can I make your job easier/safer/less stressful?”
As a developer, it is easy to fall into the trap of designing the application from a developer’s point of view. I went into early interviews asking inane things like what color a screen should be or where the user wanted the OK button or which menu items should be included. Software must be designed around the tasks that users of the software need to perform; otherwise, the software becomes a barrier to productivity and a task in itself.
The fear factor
It may not have occurred to you, but many users may fear that their answers to your interview questions could lead to their being “replaced by a computer” or could reveal some real or perceived lack of computer skills that might jeopardize their jobs. Assure your interviewees that they aren’t being tested in any way, but rather that their input will be used to make the new or revised application as user-friendly and intuitive to learn as possible. Remind them that their company must keep up with or stay ahead of the competition in order to stay viable. Sharing their knowledge and experience helps their “team” win.
But be careful here. You cannot simply tell the interviewees that they are not going to lose their job. New applications can be part of a restructuring that could lead to eventual job losses. You don’t want to put yourself in the position of telling someone who may eventually be laid off that there is nothing to worry about. Rather, point out to the interviewee that being part of the software design effort will position them to play a continuing role as an expert trainer and mentor to any new hire.
Be open
Unlike survey questions, interview questions should be open and structured to encourage examples and scenarios from the user by way of explanation, rather than simple yes/no answers. If you’re trying to assess someone’s comfort level with computers, a question like “Do you have a home computer?” is just going to result in a yes or a no.
A more evocative question might be: “When you use a computer—whether at home, at work, at school, or at some public or business facility—what do you most commonly use it for?” Users who do not own a computer may still be heavy users of machines at libraries, schools, or even arcades. If most users say they surf the Web a lot, you may want to make your interface more browserlike.
Silence is golden, so listen up
During an ordinary conversation, there is a tendency to jump in during a lull. An interview is not an ordinary conversation. Your job is to listen while the person being interviewed speaks.
If you ask a question and there is a period of silence, it probably means the person being interviewed is carefully considering the answer. Let interviewees be the first to break the silence and let them answer later if they like. While waiting for responses, be aware of what your body language and facial expression are saying. Stay relaxed, and chances are the other person will too. When pondering a question, people often look up or away to one side. Don’t stare at them while they formulate their answer. Busy yourself by reviewing your interview notes or preparing for the next question.
I have sometimes had interviewees, after a prolonged period of silence, say they are stumped by a question or cannot come up with an answer. Simply tell them that you will come back to the question later; sometimes an answer to a later question will help with an earlier one.
Avoid leading questions
When planning and delivering your questions, avoid “leading” questions that may seem to hint at the answer you’re expecting. Try something like: “What aspect of using the current data entry system do you find the most difficult?” rather than: “The current data entry system is hard to use, isn’t it?” The interviewees may incorrectly sense a hidden agenda and begin giving you the answers they think you want to hear.
While gathering requirements for a replacement to an aging database application that I had designed, I had difficulty getting users to constructively criticize the old system. They knew I had designed it and felt obliged to talk about how good it was, thinking I was there to defend my earlier work. Users need to know that all software has a life cycle and is eventually replaced as user needs change and evolve.
If you’re working on enhancing an existing system, make sure users understand that you want to preserve the best features of the existing software while eliminating or improving the features that are not working properly.
Did I miss anything?
As a wrap-up to the interview, verify the answers you have recorded and clear up any questions that may have been skipped earlier. “When I asked ____, you answered ___. Did I understand that correctly?” After this review process, don’t forget what is possibly the most important question: “What did I leave out that you would like to add?”
What would you add?
How do you structure user interviews to make the results valuable? Share your interview tips or your interview questions with us, and we’ll feature your tactics in upcoming articles.