
Question the accuracy of job and salary surveys

There are plenty of job and salary surveys out there. But as Bob Weinstein points out in this Tech Watch column, much of the information is inaccurate or biased. Find out what to look for and what questions to ask to make sure the information is reliable.


Want to know where the jobs are? Who’s hiring and firing? What the average salary range is for a certain job? No problem. Various surveys will give you the lowdown. But whether the information is accurate is another question.

Each year, we are bombarded by surveys on thousands of subjects, job trends and salaries among them. The Google search engine lists 722,000 job-survey sites and 209,000 salary surveys in its database. Many are formal surveys conducted by trade associations, the government, think tanks, and PR agencies. Others are informal ones put together by consulting firms and headhunters. Associations alone—every industry has at least one—turn out thousands of surveys.

I’ll tell you why you shouldn’t believe everything you read and what questions to ask to determine whether the information you’re given is trustworthy.

Credibility is in the details
The biggest mistake that job searchers—and everyone else—make is assuming surveys are accurate. Inaccurate information most often comes from organizations with a bias rather than from a top-name survey company such as A.C. Nielsen, J.D. Power & Associates, or Yankelovich Partners, to name a few that demonstrate professionalism and integrity.

Most surveys are awful, according to Allan McCutcheon, professor of survey research and director of the Gallup Research Center at the University of Nebraska-Lincoln. Many distort the truth by focusing solely on information that supports a trend or issue.

Whether a survey is formal (i.e., uses sophisticated data-gathering and random-sampling tools) or informal (i.e., based on selected telephone interviews), the results are carefully worded to project certainty, subtly implying that the findings are definitive. Glowing adjectives such as “global,” “major,” or “top” immediately imply credibility. Another favorite survey word is “leading.” If I had a dollar for every time a survey used that word, I’d be filing this column from an island retreat off the coast of Greece.

A well-known, often-quoted headhunter just released an “informal” technology survey listing the top executive jobs most in demand. The information was gathered via telephone interviews. The carefully worded disclaimer at the bottom of the release said, “This list is not scientific and is only meant to be a leading indicator of senior jobs in demand.” Note the wording: the survey is “not scientific,” yet it’s “a leading indicator of senior jobs in demand.” Clever, right?

The new doublespeak
A deft wordsmith can make a case for an abundance of tech jobs in Tibet or Tasmania. A well-conceived survey in the hands of clever spin-masters could make you believe almost anything.

Most surveys are so bloated with doublespeak that readers can’t separate fact from fiction, says Chris McCarty, survey director for the Bureau of Economic and Business Research at the University of Florida in Gainesville. “Survey questions should be neutral,” he says. “But bogus surveys set them up in a certain way so respondents are led down a particular path. You can bias a survey virtually any way you want by asking questions in a certain pattern.”

Adds McCutcheon, “Innocent questions are often preceded by questions which skew people’s opinions.”

“A carefully created survey, however, will ask the same question three different ways so you don’t know you’re answering the same question,” explains William Lutz, trend-watcher and author of The New Doublespeak (HarperCollins; $23.00). “And most surveys are purposely short because they know people don’t have the time to answer a long survey. It’s all the more reason to carefully word questions so you get reliable information.”

Question the results
“Reader beware: Start by asking who is sponsoring the survey,” advises McCutcheon. That gives you a good idea of whether the information is biased.

Another important issue is the size of the sample, which tells you about its margin of error. “The bigger the sample size, the greater the accuracy,” McCutcheon says. “A well-drawn random sample of 500 yields a sampling error of plus or minus 5 percent, whereas a sample of 2,000 yields an error ratio of plus or minus 2.5 percent.” Don’t be put off by the length of time it takes to generate a survey. Surveys put together in hours can be just as accurate as ones that take several months to write—and vice versa.
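For readers who want to check such figures themselves, the rule of thumb behind them is the standard margin-of-error calculation for a simple random sample. The quick sketch below is only an illustration; it assumes a 95 percent confidence level and a worst-case 50/50 split in responses, and it is not drawn from any survey quoted in this column.

```python
import math

def margin_of_error(sample_size, p=0.5, z=1.96):
    """Approximate 95 percent margin of error for a simple random sample.

    Assumes the worst-case proportion p = 0.5 and ignores refinements
    such as finite-population or design-effect corrections.
    """
    return z * math.sqrt(p * (1 - p) / sample_size)

# The two sample sizes McCutcheon mentions:
for n in (500, 2000):
    print(f"n = {n}: +/- {margin_of_error(n) * 100:.1f} percentage points")
# n = 500: +/- 4.4 percentage points
# n = 2000: +/- 2.2 percentage points
```

Those results land close to the rounded 5 percent and 2.5 percent figures McCutcheon cites, and they illustrate his larger point: accuracy improves only with the square root of the sample size, so quadrupling the number of respondents merely halves the margin of error.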

Finally, question the accuracy of Internet and newspaper surveys. “One person can take a survey several times, biasing the results,” adds McCutcheon. That’s why an Internet survey boasting thousands of respondents can be almost worthless.

You’ve been warned.

Read any good research lately?
What’s your take on Bob Weinstein’s skepticism? Do you agree that many surveys are inaccurate? What about intentional bias? Or have you found survey information to be accurate and helpful? Share your opinion. Send us an e-mail or post a comment below.

 
