We’ve all filled in customer satisfaction surveys, and I think you’ll agree that they are mostly designed to fit the company’s needs and not to allow the customer to express how they really feel.


I became aware of this some years ago when the help desk I was working on ran a survey to find out what was most important to its customers.

The result was that what the customer wanted most was to have his or her call answered within three rings. Nothing was said about quality, resolution, professionalism, or even talking to a real person. Getting the call picked up was the thing.

I was not surprised when an ACD (Automatic Call Distribution) system followed hot on its heels. The survey had been worded so that the desired result was obtained.

Think of the last survey you completed; have you seen anything like this?

“When you last called the help desk, were you:

1 Pleased

2 Delighted

3 Ecstatic

4 Overcome with joy

5 Speechless with Bliss”

If you weren’t totally delighted with the experience, you will have picked the poorest option available, “Pleased,” and the post-survey report will then claim that 98% of respondents were pleased with the experience.
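To make the arithmetic concrete, here is a small sketch using entirely made-up response data (the numbers are hypothetical, not from any real survey). It shows that when the lowest option on the scale is already “Pleased,” the headline figure cannot come out as anything but flattering:

```python
# The only options the survey offers -- note the floor is "Pleased".
options = ["Pleased", "Delighted", "Ecstatic",
           "Overcome with joy", "Speechless with Bliss"]

# Hypothetical responses: suppose most callers were merely unimpressed,
# but "Pleased" was the lowest box they were allowed to tick.
hypothetical_responses = ["Pleased"] * 49 + ["Delighted"]

# Every option on this scale counts as "pleased or better", so the
# reported satisfaction rate is guaranteed to look excellent.
satisfied = sum(1 for r in hypothetical_responses if r in options)
pct = 100 * satisfied / len(hypothetical_responses)
print(f"{pct:.0f}% of respondents were pleased or better")
```

Whatever the callers actually felt, the scale itself fixes the outcome: the report can only ever say that 100% of respondents were at least pleased.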

When the ACD supplier designed the survey for their product, their thought process probably went something like this:

“How can we make the survey produce the answer that best suits our needs?”

Answer: By creating an environment that forces the respondent to give the answer we want.

I will exaggerate to illustrate the point:

Question 1

When calling the help desk, would you prefer:

1. That the call be answered within three rings.

2. That a crazed psychopath hold you down and pull out your fingernails with rusty pliers.

3. That a thousand seal pups be cruelly slaughtered.

I know that exaggerates things, but you see what I mean. The point is this: if you want to gauge opinions accurately, you need to allow respondents to give their own views, not restrict them to a set script of answers.

Worse still, some of these surveys have required fields, so you can’t move on to the next question until you have ticked one of the boxes, even if none of them applies. Badly designed scripts mean that the data collected is not a true reflection of the customer’s real feelings or opinions.

Given the weight placed on the results of such surveys in IT service provision, wouldn’t it be better if the questionnaires were designed to capture genuine opinion?

If you have any horror stories about customer survey work, please let me know.