Internal customer satisfaction surveys need to be carefully crafted to get to the heart of the IT department’s productivity, two experts in the field told TechRepublic.

“You need to ask questions that are important to you, yes,” said Linda Keefe, CEO of Shared Results, a consultancy that helps overcome organizational indifference, communication problems, technological deficiencies, and other barriers to growth. “But you also need to ask questions that may be on their (your customers’) radar screens but are not on yours.”

Surveys should ask questions that give you a good idea of where your department's customer service is, where it's been, and where it's going, said Bruce Coughlin, VP of Portfolio Management for Siemens Business Services.

Coughlin said his clients use several approaches to develop internal IT surveys. The traditional customer satisfaction metric uses standard transaction questions, which are now often administered via Web-based services. There is also the more "holistic" approach, which makes respondents feel that they can say exactly what they want about how IT is delivering services.

Each approach has distinct benefits. Standard questions often don't require a lot of thought and provide a "strong hit back" from the respondent, Coughlin said. His examples of these types of questions include:

  • How long did problem resolution take?
  • Was the tech knowledgeable and professional?
  • Was the problem resolved appropriately and on the first try?
  • Overall, how satisfied are customers with IT services? (Gauge on a scale of 1 to 5, for example.)
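
To make the tallying concrete, here's a minimal sketch in Python of how a Web-based tool might encode these transaction questions and average the 1-to-5 satisfaction ratings. The question IDs and the averaging logic are illustrative assumptions, not taken from any particular survey product:

```python
# Illustrative sketch: encoding standard transaction questions and
# tallying a 1-to-5 satisfaction rating. Question IDs are hypothetical,
# not taken from any specific survey tool.
TRANSACTION_QUESTIONS = {
    "resolution_time": "How long did problem resolution take?",
    "tech_quality": "Was the tech knowledgeable and professional?",
    "first_try_fix": "Was the problem resolved appropriately and on the first try?",
    "overall_sat": "Overall, how satisfied are you with IT services? (1-5)",
}

def average_rating(responses, question_id):
    """Average the numeric ratings submitted for one scaled question."""
    ratings = [r[question_id] for r in responses if question_id in r]
    return sum(ratings) / len(ratings) if ratings else None

# Example: three respondents rating overall satisfaction on the 1-5 scale.
responses = [
    {"overall_sat": 4},
    {"overall_sat": 5},
    {"overall_sat": 3},
]
print(average_rating(responses, "overall_sat"))  # 4.0
```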

Coughlin also noted that many organizations customize their assessment tools based on the corporate culture or what an organization values most. For instance, some companies put a premium on customer satisfaction over speed of problem resolution. Such organizations might gear their questions more toward finding out how satisfied customers are. Example questions from such surveys include:

  • Did the user understand how to access the service?
  • How quickly did the support team respond?
  • Did the support technician know what kind of equipment the customer was using?
  • Did the customer feel the issues were resolved?

That last question can be critical because it gauges perception, which is often the heart of customer satisfaction, Coughlin said. “For instance, the agent might feel the issue was resolved, but the client may not,” he said.

However, Keefe cautioned that an IT manager shouldn’t become too focused on company priorities, which can lead to survey questions that drive answers. “Too often surveys are designed to reinforce whatever the person sending the survey feels is important,” said Keefe, whose clients include Xerox, Eastman Kodak, and the American Red Cross.

“In a survey, it’s a challenge to ask questions that cover the overall issue. You have to have questions in your survey that allow for open-ended responses,” she said. “Because if all the questions are multiple choice, you may not get to the big issues.”

This is why quantifying responses on a scale can become so important in a survey, Keefe said. Instead of asking how quickly a technician arrived, the survey could ask whether the client was satisfied with the response time on a scale of 1 to 10, then allow for follow-up comments. "If it's not a 10, what will make it a 10?" Keefe said. "Because only if someone writes down what will make it a 10 will you get any value out of the survey."

You can tally user responses from this model and evaluate them via a grading scale or scoring guide, as sketched below the list. Here's an example of a grading scale:

  • 90-100—Excellent
  • 75-89—Good
  • 60-74—Acceptable
  • 1-59—Inadequate
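
Here's a minimal sketch of how such a tally might work, assuming Python, 1-to-10 ratings like Keefe's, and a simple average-times-ten conversion to the 100-point scale (the conversion is an assumption made for illustration):

```python
# Sketch: map averaged 1-to-10 satisfaction ratings onto the grading
# scale above. Multiplying the average by 10 to reach the 100-point
# scale is an assumption for illustration.
GRADE_BANDS = [
    (90, "Excellent"),
    (75, "Good"),
    (60, "Acceptable"),
    (0, "Inadequate"),
]

def grade_responses(ratings):
    """Average 1-10 ratings, scale to 100, and return the grade label."""
    if not ratings:
        return "No responses"
    score = sum(ratings) / len(ratings) * 10  # e.g., average 7.5 -> 75
    for floor, label in GRADE_BANDS:
        if score >= floor:
            return label
    return "Inadequate"  # unreachable given the 0 floor, kept for safety

print(grade_responses([8, 7, 9, 6]))  # average 7.5 -> 75 -> "Good"
```

Pairing the numeric grade with Keefe's open-ended "what will make it a 10?" comments ties the score back to concrete actions.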

There can be a downside to this type of survey, Keefe cautioned. “You will get a lot of things you can do different,” she said. “You also have to be careful not to fall down in the dumps if they say something negative.”

However, the value IT managers get in return outweighs any sting from unfavorable responses; indeed, those are often the responses they can learn the most from.

Survey frequency
How often IT managers should give surveys depends on a number of factors. In general, a survey should go out at least once a year, or sooner if a major initiative has been completed.

One reason survey frequency varies is that organizations run more than one kind of survey. "The general survey that gauges the IT support capabilities and acceptance should generally be conducted two to three times a year in order to maximize participation and feedback into a continuous improvement effort," Coughlin said.

“More frequent surveys tend to decrease user participation, and less frequent surveys tend to give a less precise view of the actual climate.”

Surveys are a great way to stay in the loop on what end users think about IT's services. Use the information they provide to guide your department's continuing development.