Understand the happy customer tipping point for insights into loyalty and purchases

A Temkin Group survey analysis uncovers significant gains in loyalty measures for tech vendors that succeed in being easy to work with from their customers' perspectives.



A positive customer experience builds customer loyalty for tech vendors. It isn't exactly news, right? Well, Bruce Temkin and his customer experience consultancy Temkin Group found a new angle on this topic in their Q1 2013 IT decision maker survey: they grouped and analyzed vendor-specific results on an "easy to work with" scale that ranged from one to seven (one being the hardest to work with). I interviewed Mr. Temkin about these survey results, as well as the company's November 2013 report Blueprint for a Successful CX Organization.

I first spoke with Mr. Temkin in September 2013 about his firm's Tech Vendors Benchmarking report, which showed satisfaction scores dropping in product and relationship areas from 2012 to 2013. During that interview, I gained an appreciation for how the Temkin Group conducts its research with clarity and transparency, and I was looking forward to another talk with Bruce. I was not disappointed.

TechRepublic: Could you take me through the process of going back to the survey, and how you arrived at the loyalty scale?

Bruce Temkin: Let me start with the survey. We have a random survey that goes out to IT decision makers. We buy panel access; we contract out to provide responses from 800 IT professionals at companies in North America. (They need) at least $500 million in revenue. Then we qualify IT professionals as those that are involved in making decisions about the vendors that the company does business with, so they end up being the IT decision makers. We also make sure there are no more than three people from any given company, and we make sure they don't come from the tech industry. We don't want a bunch of people from Microsoft surveying Microsoft.

TechRepublic: Tech vendors surveying themselves in other words.

Bruce Temkin: Yes. So we do that, and we end up getting responses from 800 or more IT decision makers. I think in this case we had 802. And so in this particular analysis, for that blog post, I looked at a few of the questions on the survey.

One of the questions was: "Thinking about your most recent interaction with each of these companies, how easy is it to interact with the company?" If you remember back to the Temkin Experience Ratings, it's one of the three elements of our experience ratings. I looked at one of the three questions, which is around being "easy." So we just plucked out that one, because we work with lots of tech companies and many of them struggle with this, and a few of them are starting to focus on being easy to work with. This is the accessible component of our Experience Rating. So we looked at just that one and had the respondents answer the question on a scale from one to seven, one being very hard to work with and seven being very easy to work with.

Individual respondents rated multiple tech vendors. If you think about it, the decision makers don't just deal with one tech vendor, they tend to deal with many of them. So we looked at the evaluations from each one of those answers from one to seven, from very difficult "ones" to very easy "sevens," and we bucketed all of those ratings. So in other words, if I am an IT decision maker and I rated Microsoft a three and Oracle a five, then those are considered separate ratings. I would then look at all the other things that I rated Microsoft on as well, and do the same thing with Oracle.

So the people who rated a company a three -- we then look at other things. What are their plans to purchase from that vendor? How likely are they to try a new product or service from that vendor? And what was their Net Promoter Score for that vendor? We end up grouping these together, and for our research we looked at every instance where someone rated a tech vendor a three on our easy-to-work-with scale: what their collective plans to purchase were, how likely they are to buy a new product, and the collective Net Promoter Score for the vendor. What you see in that graph is a collection. While there are 802 IT professionals, I think on average they might have rated 10 or so companies, so that represents more than 8,000 individual evaluations.
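The bucketing Temkin describes can be sketched as a simple group-by. The data below is invented for illustration, as is the 1-to-5 purchase-intent scale; only the 1-to-7 easiness scale comes from the survey:

```python
from collections import defaultdict
from statistics import mean

# Hypothetical evaluations -- one row per (respondent, vendor) pair:
# (easiness rating 1-7, likelihood to buy a new product, 1-5).
# Real values would come from the 800+ survey responses.
evaluations = [
    (3, 2), (3, 1), (3, 3),   # vendors rated a "three" on easiness
    (5, 3), (5, 4),           # vendors rated a "five"
    (7, 5), (7, 4), (7, 5),   # vendors rated a "seven"
]

# Bucket every individual vendor rating by its easiness score.
buckets = defaultdict(list)
for easiness, new_product_intent in evaluations:
    buckets[easiness].append(new_product_intent)

# Average purchase intent per easiness bucket.
for easiness in sorted(buckets):
    print(easiness, round(mean(buckets[easiness]), 2))
```

Each of the other loyalty measures (plans to purchase, Net Promoter Score) would be aggregated per bucket the same way.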

TechRepublic: For the Net Promoter Score, the three basic answers are promoter, detractor, or passive. From your survey, how do you calculate the Net Promoter Score?

Bruce Temkin: The Net Promoter Score has become a fairly common measurement within the tech sector -- it's even more prevalent there than for businesses at large. What we used is the standard version, in which you ask someone how likely they are to recommend a vendor on a scale from zero (not very likely) to 10 (very likely). Then you bucket people into one of three categories. If they give a zero to a six, they're called a detractor, from nine to 10 a promoter, and if they give a seven or eight they're called a passive. The score is the percentage of promoters minus the percentage of detractors, multiplied by 100 so it reads as a whole number rather than a percentage.
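The standard calculation Temkin describes can be written out in a few lines of Python; the score distribution here is illustrative, not survey data:

```python
def nps(scores):
    """Net Promoter Score from 0-10 likelihood-to-recommend answers:
    % promoters (9-10) minus % detractors (0-6); passives (7-8) are ignored."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores))

# 13 promoters, 2 passives, 85 detractors out of 100 answers:
print(nps([10] * 13 + [8] * 2 + [3] * 85))  # -> -72
```

Note that because passives drop out of the numerator, the score can range from -100 (all detractors) to +100 (all promoters).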

So for instance, on the scale we have, if you're very difficult to work with, getting a "one," you have a minus 72 Net Promoter Score. That might mean you have 85% detractors and 13% promoters. And for the high-scoring "sevens," you have a positive 79: 82% promoters and 3% detractors.

TechRepublic: I'm wondering if the decision to take the loyalty metric resulted from feedback you were getting from tech vendors. You mentioned that a lot of them are struggling with loyalty. Did you hear some chatter, or did you take part in conversations that led you to follow up with this particular study?

Bruce Temkin: We work with a lot of tech vendors, and I would say that many or most of them are using the Net Promoter Score as a loyalty metric, a customer experience metric. One of the things I always try and do is to help companies understand that the way to improve a score like that is not to focus on the score, but to focus on the things you are doing operationally that result in good or bad scores.

The one metric that I like a lot is the likelihood to try a new product or service, because what people don't think about is that it's an important component of loyalty for a tech vendor. If we have a bunch of people who are not willing to try our new products, we are not as likely to succeed as if we have lots of customers out there who are clamoring to try our new products.

I do other studies where we just look at sort of a correlation -- you know, what's the correlation between factors -- and the correlation between these is pretty high. But when you look at it on the easiness scale, you start to pull out things that are really interesting. Like the fact that pretty much once you get above neutral in being "easy to work with," you are rewarded incrementally with more and more loyalty. It doesn't top out.

Also, once you get to neutral and below, they are all pretty bad and it doesn't really fall off the charts. It's just all bad, and it just gets incrementally a little worse. But when you start getting good, the good stuff really grows rapidly.

For my purpose, doing studies like this where we look at individual responses, it's really to understand two things. One is the tipping point. The other is sort of the elasticity of good behavior. So if we found, for instance, that there's a huge gain from going from a 4 to a 5 on the easiness scale and then you didn't gain much after that, then the lesson might be that all we have to do is be decent; there's no reason to spend money on changing our organization to be really good. This really demonstrates the elasticity, showing that you are rewarded at each incremental improvement above neutral in being easy to work with.

TechRepublic: That's very key... I can see that.

Bruce Temkin: There are areas we have that are topping out earlier -- this doesn't.

TechRepublic: Yes, loyalty matters.

Bruce Temkin: I would say because loyalty matters, being easy to work with matters a lot.

TechRepublic: Yes, one drives the other, according to your results. I have a question about a different survey, your Blueprint for a Successful CX Organization. This report was released in November 2013. It has case studies on five organizations. What would be the value proposition to a tech vendor, specifically, in buying and analyzing this report?

Bruce Temkin: A lot of the tech vendors are committed to improving their customer experience, which doesn't come easy, right? To be able to do it you need to have some person, and usually some organization, driving change. So you see lots of tech vendors putting together, or already having, customer experience organizations -- we call them different things -- and they are driving change.

As a matter of fact, we just announced our customer experience excellence winners, and there are four tech vendors: Cisco, EMC, and two at Oracle. That shows the degree to which tech vendors are investing in transformation. For any organization that has a customer experience group to drive change, this report really helps them understand the best practices for how that group should operate, and the characteristics of success for a group like that.

In summary

The basic takeaway from the "easy to work with" analysis is that positive customer experience builds loyalty for tech vendors. Then, as tech vendors start earning higher grades on the easiness scale, going from four to five and higher, there is a definite uptick at each level in loyalty measures: customers' willingness to buy in the coming year and their openness to trying a new product or service.  

In Temkin's view, this makes a solid business case for investing in positive customer experiences. Toward the end of our call, Bruce summed it up this way: "Because loyalty matters, being easy to work with matters a lot."