Anatomy of a survey gone bad

Surveys seem to be marketers' go-to sales device these days. But if you look closely, you'll find that they're not always on the up and up.

Sometimes, I get a press release that really irks me. Most of the time it has something to do with survey results. Either the survey bears out some obvious fact that makes me wonder why time, money, and effort were even wasted on it, or the "sponsor" of the survey, astoundingly enough, profits in some way from the positive results.

You've probably seen those emails whose subject lines scream "Football now America's favorite pastime!" and then in tiny letters in the body copy you see "...sponsored by the NFL."

So you can imagine how much eye-rolling I did upon getting what looked like the National Enquirer of emails that screamed "Americans willing to divorce in order to work from home!"

Upon opening the email, you see that 5 percent of the people surveyed said that they would divorce in order to be able to telecommute. Now, maybe it's me, but why would that be a choice? And here are some of the other survey "findings" of what people would sacrifice:

  • Social media - 34%
  • Texting - 30%
  • Chocolate - 29%
  • Smartphone - 25%
  • Shopping - 20%
  • A salary increase - 17%
  • Half of vacation days - 15%
  • Daily showers - 12%
  • Spouse - 5% (for what it's worth, respondents in the West were significantly more likely to say they would give up their spouse to telecommute (7%) than respondents in the Midwest (2%))

First of all, chocolate? In what nightmare (dare I say, apocalyptic?) scenario would one be asked to give up chocolate in order to telecommute? And how does that even make sense? (How is a Cadbury Egg related to telecommuting? Huh? Somebody tell me that!) Oh yeah, and the divorce thing is weird too.

Then you check out this survey's sponsor -- TeamViewer. TeamViewer is, coincidentally, a "provider of remote control and online meetings software." So, needless to say, they have every reason to want telecommuting to look like something people would measurably sacrifice for.

I'm bright enough to put a survey in perspective on that basis. But when the phrasing of survey questions is engineered to bear out some supposedly relevant fact that is actually meaningless, I get a little perturbed.

Case in point: here are some more of the survey's findings:

Most Americans believe that more people want the option to telecommute (62%) with an overwhelming percentage (83%) believing that telecommuting is on the rise.

That doesn't mean that most Americans want the option to telecommute. It means that most Americans think that more people want the option to telecommute. Got that? So of all the people asked, "Do you think more people want to telecommute these days?" 62 percent said yes. Big freakin' deal. That tells me nothing about whether telecommuting numbers are up or down, or whether the concept is becoming more popular among working folks.

Just be careful when looking at survey results.

About

Toni Bowers is Managing Editor of TechRepublic and the award-winning blogger behind the Career Management blog. She has edited newsletters, books, and websites pertaining to software, IT careers, and IT management issues.

16 comments
DFO_REXX

Toni, thanks for illuminating this bit. I'd like to think IT people are smarter than this... but they're only people, and even smart people fall for dumb things sometimes. Here is a short list of survey context questions (which echoes yours, in many respects) and what to know about each. The list is not all-inclusive, nor is it meant to be; these are just the biggest points. Please add others if you can think of some.

1) Who did the survey? Sometimes hard to find, but necessary.
2) Do they have an ulterior motive (why did they do the survey)? Often predictable; be skeptical of any survey whose sponsor is heavily invested in the subject matter and/or the survey's results.
3) What kind of questions do they ask? Leading questions do not produce usable data but are often used to "prove a point" (as in Toni's example). Be very skeptical of any article summarizing a survey if it doesn't print the questions in their entirety.
4) Where was the survey delivered, and where did the answers come from? Local surveys are always biased by local culture, and a widely circulated survey that receives most of its data from only a few places may as well have been a local survey.
5) Who got the survey? Directed distribution could be an indication of "begging the question." Be careful: asking only IT people about an IT scheduling product is probably OK, but it is different from asking only IT people about telecommuting, since many disparate jobs can be done by telecommuting.
6) Who answered the survey? "IT people" is simply too general a category; there are lots of different sub-types of IT people.
7) How many responses were there? 2,000 responses sounds like a lot, but if there are millions of people in the category, 2,000 isn't statistically significant.

AnsuGisalas

A hundred years ago, tilling the field from the living room was hardly a realistic option, was it? But the 5 percent of people willing to "sacrifice their spouses" (ominous wording) are obviously of the Homer Simpson School of Working from Home: bring on the muumuu and let the flab reign free!!! :^0

summerspa

I had a coworker who was working on his PhD. He administered a survey related to suicide to a group of post-high-school students. Four of them showed suicidal ideation the evening they took the survey. The committee killed the whole research project the next day.

father.nature

Once worked for a commercial survey firm. We loved the surveys that paid a fee for participation - we got all our friends, neighbors, and relatives to participate and pre-briefed them to lie about their prequalifications and say mostly positive things, so they'd get their money and the client would love our results and come back for more. Remember the luxo-slug "hand-built sports car" Buick Reatta? That's one of the reasons Buick fielded that turkey. Baloney in, baloney out, and Buick took a beating in the end.

C-3PO

I heard of a survey once that reported peas will kill you... the statistic was something like: 90% of people who died had eaten peas in the last week. Be careful what you eat!

kjrider

I do hate these stupid surveys - the ones that ask for your postcode and then ask what part of the UK you live in. The ones that ask what I do for work and, when I put "I am retired" in the "other" box, next ask where "I am retired" is located and how many people work there. The ones that ask my wife her age (60) and whether she is pregnant. The ones that try to push a new product at you (mobile phone, etc.) and then ask how many you will buy! I could go on... KJR

Robiisan

I've done some statistical sampling, although without the formal training JamesRL got working for the polling firm. How you write the questions, and how you introduce the questions and the survey to the target audience/sample universe, significantly affects the responses. Unfortunately, in our "fast-food, fast-everything" society, using related questions to eliminate question bias is frequently counter-productive to getting good completion and return ratios.

Unless the research is really significant, the best one can often do is attempt to write bias-neutral questions (not easy at all!) and keep the survey form short and to the point. In such a situation, yes/no and true/false questions are completely useless. Better bets are multiple-choice and "rank in order of preference" questions, as long as they include an "other" choice and room for the respondent to explain.

I also agree with Toni that most "surveys" out there now, from customer satisfaction to product preference to political opinion polls, are worthless to anyone, INCLUDING the perpetrator of the survey - they are so biased that the "results" are pre-formulated fiction designed to please the survey's employer/contractor. The worst of the lot, IMHO, is the political junk mail (from ALL sides) that asks you to buy into obviously slanted questions, then wants you to pay for the gathering of that irrelevant trivia! Most current surveys and polls are useful for little more than fire-starter if hardcopy, and not even that if electronic.

JamesRL

If you write it the right way.... During my university days, I took a stats course and used a survey for my research. This led to me working for a polling firm one summer. They taught me all about survey bias.

A good survey will try to ask you the same question in a number of ways, if it's an important question. It will be worded very neutrally. It will ask a number of related questions to check for question bias. If there is a list of responses, it will vary the order of the list every time. In short, it is hard work to design surveys that are unbiased.

You also have to eliminate sample bias. If you make more calls during the day, you bias towards retired people, people who work from home, and people at the extremes of the income scale. The proper technique is to get a demographic breakdown of the unit you are trying to survey, then oversample. Then you choose responses until you have a set that matches the profile of the unit (state, country, etc.). You also have to use a random sampling method to ensure that you aren't inadvertently introducing bias that way. Needless to say, surveys where participants choose to take part without being asked are also biased towards active people.

I've seen many surveys I consider biased in the way they were written published as scientific fact in the newspaper.
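For the curious, here is a minimal sketch in Python of two of the techniques described above: rotating the order of listed answers, and quota-sampling an oversampled pool against a known demographic profile. The option labels, the "region" field, and the function names are illustrative assumptions, not any real polling firm's system.

    import random

    OPTIONS = ["Strongly agree", "Agree", "Disagree", "Strongly disagree"]

    def presented_options():
        """Return the answer list in a fresh random order for each
        respondent, so no option benefits from always appearing first."""
        shuffled = OPTIONS[:]
        random.shuffle(shuffled)
        return shuffled

    def quota_sample(responses, target_shares, size):
        """Draw from an oversampled pool of responses (dicts with a
        'region' key) until the chosen set matches a known demographic
        profile, e.g. target_shares = {'West': 0.24, 'Midwest': 0.21}."""
        random.shuffle(responses)  # random draw order avoids order bias
        quotas = {k: round(v * size) for k, v in target_shares.items()}
        chosen = []
        for r in responses:
            if quotas.get(r["region"], 0) > 0:
                chosen.append(r)
                quotas[r["region"]] -= 1
        return chosen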

Sterling chip Camden

They're just a subset of the population who are considering giving their spouse the heave-ho anyway.

JamesRL

Sample size depends on how deep you are going with the demographic analysis of the question. 2,000 responses may be fine for a million people if you don't care to further stratify the demographic by sex, age, income, region, or whatever. Most polls, though, want to be able to look at some of those factors, so a larger size is required. I remember doing daily tracking polls where the universe was "Canadians," the question was "voting intention," and the sample size was fairly small. But if you want a regional breakdown, or to look at socio-economic factors, the needed sample size increases dramatically.
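To put rough numbers on that: for a simple random sample, the worst-case margin of error at 95 percent confidence is about 1.96 * sqrt(p(1 - p)/n), and it barely depends on population size once the population dwarfs the sample. A quick back-of-the-envelope sketch in Python, with illustrative figures:

    import math

    def margin_of_error(n, p=0.5, z=1.96):
        """Worst-case (p = 0.5) 95% margin of error for a simple random
        sample of size n, ignoring the finite-population correction."""
        return z * math.sqrt(p * (1 - p) / n)

    print(f"n=2000: +/- {margin_of_error(2000):.1%}")  # about +/- 2.2%
    print(f"n=400:  +/- {margin_of_error(400):.1%}")   # about +/- 4.9%
    # Splitting a 2,000-person national sample into five regions of
    # ~400 each pushes every regional estimate out to roughly +/- 4.9%,
    # which is why stratified analysis demands a bigger sample.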

Sterling chip Camden

Whenever a statistic pops up saying "90% of x also have y" we need to ask "what percentage of the general population have y whether or not x is present?"
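Applied to the pea "statistic" a few comments up, that check is a one-liner. Both rates below are hypothetical, chosen only to show that a conditional rate means nothing without the base rate:

    # Base-rate check applied to the peas joke above.
    # Both figures are made up purely for illustration.
    p_peas_given_died = 0.90   # the scary "90% of x also have y" statistic
    p_peas_overall    = 0.90   # assumed rate of pea-eating in everyone

    # If the conditional rate merely matches the base rate, x tells you
    # nothing about y; what matters is the lift above the base rate.
    lift = p_peas_given_died / p_peas_overall
    print(f"lift = {lift:.2f}")   # 1.00 -> no association at all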

rwidegren

Back in my school days I had a marketing research class, and while talking about eliminating bias, the textbook told this story: There were two monks from different orders who met at a conference while they were outside taking a smoke break. They got to talking about whether or not it was okay to smoke while saying their prayers. One said it was and one said it wasn't. They couldn't come to an agreement, so they decided that when they went home they would each ask the opinion of the head of their order and then compare notes later. The next time they saw each other, the first monk said, "Well, it looks like I was right. The head of my order said that it was wrong to smoke while you're praying." The other one said, "That's funny. I asked if it was okay to pray while I smoked, and I was told it was probably a pretty good idea if I did."

DFO_REXX

True, 2,000 responses might be enough; I wasn't trying to be specific, merely trying to make a point. However, I should point out that I wrote "2,000 of millions in a category," and that does make a definitive statement. Your criticism, while valid, downplays the ingroup and outgroup distinctions necessary to understanding what the survey was trying to "explain."

AnsuGisalas

"List ten reasons not to kill yourself." (These kinds of questions always bug me, no matter what the subject; what if I can't think of ten... am I then lacking?) Worse yet: "List ten things you think will happen in your life that you can look forward to." "List ten reasons to be hopeful about the future being better than today."
