Business leaders: How to interpret new IT studies

It can be difficult to know if a new IT research report is worth your time. Here's what you need to know when you encounter these reports.

Another day, another research report. Whether it's the ushering in of a new era of IT, or the death of the platform your company is built on, these reports are difficult to ignore.

The truth is, you shouldn't ignore them because many of these studies and reports offer some good insights that you could use to better your IT practices or grow your business. Still, it can be difficult to know which studies to trust and which ones could have a real impact on your organization.

However, there are some steps you can take to help you get more out of IT research. It all starts with understanding some of the basic principles of statistics.

Learning the language

To make real use of the information presented to you in a research report, it's important to review a few basic terms you're likely to encounter. The University of California, Berkeley has a good glossary to look over, but here are the basics:

Mean - The sum of a set of numbers or scores, divided by how many values are in that set. Most commonly associated with the concept of "average."

Median - The middle number in a set of numbers, with half of the numbers existing above it in the set and the other half existing below it.

Mode - The number that occurs the most in a set of numbers.

Sample size - The number of measurable elements in a sample from the population. In IT reports, the sample will be the number of business leaders or organizations that participated in the study. Sample size is often denoted by the letter "n."

Range - The difference between the highest number or score and the lowest number or score in a set. You simply subtract the lowest from the highest to get the range, and that helps you determine variability.

Standard deviation - A measure of how far data points typically deviate, or vary, from the mean.
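
These definitions map directly onto Python's built-in `statistics` module. Here is a minimal sketch on a made-up set of scores (the data is illustrative, not from any report):

```python
import statistics

# Hypothetical survey scores (illustrative data only)
scores = [4, 7, 7, 9, 12, 15, 23]

mean = statistics.mean(scores)           # sum divided by count -> 11
median = statistics.median(scores)       # middle value -> 9
mode = statistics.mode(scores)           # most frequent value -> 7
value_range = max(scores) - min(scores)  # highest minus lowest -> 19
stdev = statistics.stdev(scores)         # typical spread around the mean

print(mean, median, mode, value_range, round(stdev, 2))
```

Note that `statistics.stdev` computes the sample standard deviation (dividing by n − 1), which is the version most survey reports use.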

Bear in mind, the set of terms used in the reports you are looking at could vary from the list above. Make sure you use statistics term glossaries, like the one mentioned above, to fully understand the terms before proceeding.

Vetting the research

The first thing you want to make sure of is that you're looking at real research, not just opinion, said Kyle McNabb, senior vice president of product management at Forrester Research. What will be truly valuable to business leaders are ideas backed by research that mixes quantitative information (surveys, data) with qualitative information (interviews, anecdotes, examples), he said. Also, look for transparency and objectivity in how the research was conducted.

"Leaders should be able to see the plans behind the research so they have a good idea of what the early ideas and hypothesis were, how they were tested, and what led to the results," McNabb said. "That instills confidence you're getting something trustworthy."

Start by viewing all of the big takeaways in the context of one another, as viewing an individual statistic apart from the whole may give you an incomplete view.

For example, consider this 2015 survey on agile IT practices, in which 94% of the respondents said they have implemented agile IT practices in some capacity. Now, take into account that, of those same respondents, only 53% said their agile efforts were successful. If you want to dive even deeper, try to find out how these respondents, or the authors of the survey, define "success."
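
To see why context matters, combine the two figures. Assuming the 53% refers to the subset of respondents who implemented agile (one plausible reading of the survey), the share of all respondents with a successful agile effort is the product of the two rates. A back-of-the-envelope sketch, with variable names of my own choosing:

```python
implemented = 0.94  # respondents who adopted agile in some capacity
successful = 0.53   # of those adopters, the share reporting success (assumed reading)

# Share of ALL respondents with a successful agile implementation
overall = implemented * successful
print(f"{overall:.0%}")  # roughly half of all respondents
```

Read together, the headline "94% have gone agile" shrinks to about half of respondents actually succeeding with it.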

Another consideration when looking at a report is the sample size used in the research, or how many business leaders contributed to the findings. Unfortunately, there's no hard-and-fast rule but, generally, a larger sample yields more trustworthy results. However, the sample size you should expect depends on other factors.

"Sample size can be a little bit questionable, because if it's a niche problem you don't need a sample size of 1000 or so to have a relevant population," McNabb said. "But, if it's a bigger problem, you need a larger sample size."

It's also a good idea to take a look at the demographic information of those who participated in the research. What industries do the respondents represent? Are they dealing with similar problems that your organization is encountering? This is important because it shapes how they'll answer the questions presented in a survey.

Finally, make sure you check to see if the research was sponsored by a specific business or organization. It might not have any bearing on the objectivity, but it could play a role in how the questions were phrased in the survey.

Taking next steps

So, you've found a report, it seems to be legitimate, and the data looks like something you'll be able to use in your organization. What's next?

For starters, although it may seem obvious, you must determine if the research topic is relevant to your business. It should be easily relatable to the decisions you need to make or anticipate making in the future, McNabb said, and offer actions you can take to move your business forward.

This is where good research reports step out ahead of the others—they give readers actionable steps to take in their business to react to the data they present.

"Data on its own is interesting," McNabb said. "Data combined with all the qualitative as well, actually helps tell these leaders a story that they can learn from and act on."

The value of the research is contingent on what you can do with the information it gives you. If there's no action related to it, it's not useful to you as a business leader.

Do your own mind mapping. Involve key leaders in your organization and develop a plan. Ask whether these are steps your organization can realistically take, and figure out the best course of action to get started.

To further vet and validate the data, look for any additional interviews or qualitative data that might take you deeper. If you are still having trouble drawing conclusions about specific research relative to your business, go back to the source—call the author and ask.

"The value of research is not just seeing the report," McNabb said, "but actually knowing there's experts behind it that you can talk to."

About Conner Forrest

Conner Forrest is a Senior Editor for TechRepublic. He covers enterprise technology and is interested in the convergence of tech and culture.
