Toni, thanks for illuminating this bit. I'd like to think IT people are smarter than this... but they're only people, and even smart people fall for dumb things sometimes. Below is a short list of survey context questions (which echoes yours, in many respects), along with what to know about each.
Please note the list is not all-inclusive, nor is it meant to be; these are just the biggest points. Please add others if you can think of some.
1) Who did the survey? Sometimes hard to find, but necessary.
2) Do they have an ulterior motive? (Why did they do the survey?) Often predictable; be skeptical of any survey whose sponsor is heavily invested in the subject matter and/or the survey's results.
3) What kind of questions do they ask? Leading questions are not indicative of usable data but are often used to "prove a point" (as in Toni's example). You should be very skeptical of any article summarizing a survey if they don't print the questions in their entirety.
4) Where was the survey delivered? Corollary: where do the survey's answers come from? Local surveys are always biased by local culture, and a widely-circulated survey which receives most of its data from only a few places may as well have been a local survey.
5) Who received the survey? Directed distribution can introduce selection bias. Be careful: asking only IT people about an IT scheduling product is probably OK, but it's different from asking only IT people about telecommuting, since many disparate jobs can be done by telecommuting.
6) Who answered the survey? "IT people" is simply too general a category; there are lots of different sub-types of IT people.
7) How many responses were there, and how were they gathered? 2,000 responses sounds like a lot, and for a truly random sample it would be plenty; but if the 2,000 are self-selected from a population of millions, the sample can be badly skewed no matter how large it is.
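As a rough illustration of point 7: for a simple random sample, the margin of error depends mostly on the sample size, not the population size, which is why the real worry with 2,000 responses is *how* they were collected rather than the raw count. A minimal sketch (using the standard 95%-confidence formula for a sample proportion; the function name and defaults are my own):

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Approximate 95% margin of error for a sample proportion,
    assuming a simple random sample (the key assumption surveys
    often violate). p=0.5 gives the worst-case (widest) margin."""
    return z * math.sqrt(p * (1 - p) / n)

# 2,000 random responses -> roughly +/- 2.2 percentage points
print(f"{margin_of_error(2000):.3f}")
```

So a random 2,000-person sample is statistically quite strong; a self-selected one tells you mainly about the kind of person who chose to respond.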