
Free Online Survey Tools Can Yield Costly, Useless Results if Not Used Carefully

June 15, 2010

Thanks to online survey tools like Zoomerang, SurveyMonkey, and SurveyPirate, the ability to conduct surveys has been greatly democratized. Small businesses, non-profits, and departments within larger firms can now conduct surveys that cost and lack of resources once put out of reach. Unfortunately, the greatest drawback of these free survey tools is the same as their greatest benefit: anyone can launch a survey. Launching an effective survey still requires a clear definition of the business problem at hand; careful thought about the information needed to address that problem, the audience of the survey, and how to reach it; determination of the sample size and how to select the sample; design, testing, and fielding of the questionnaire; and analysis of the results. Free online survey tools do not change this process.

Recently, a business owner from one of my networking groups sent me an online survey that he had designed with one of these free tools. It was a questionnaire about children’s toys – the business he was in. He wasn’t sending me the survey for feedback; he sent it to me as if I were a prospective customer. Unfortunately, I’m not married and don’t have kids, and all my nieces and nephews are past the age of toys. The survey was irrelevant to me. The toy purveyor needed to think about who his likely buyers were – and, based on his past sales, he should already have a good idea of who they are. He could then have purchased a list of such people to whom to send the survey. Even if that meant using a mail or phone survey, which could be costly, he would have gotten more meaningful results. Imagine how many other irrelevant or uninterested recipients received his survey. Most probably didn’t respond, but others might have responded untruthfully, giving the owner bogus results.

The “toy-preneur’s” survey questions were also poorly designed. One was a classic double-barreled question: “Does your child like educational or action toys?” What if a respondent’s child liked both? The owner should have asked two separate questions: “Does your child like educational toys?” and “Does your child like action toys?” Or he could have asked a multi-part question such as, “Check the box next to each type of toy your child likes to play with,” followed by a list of the different types of toys.

The survey gets worse. One question asked, “How much does your child’s happiness mean to you?” How many people are going to answer that negatively? Another asked the respondent to rank-order various features of a toy for which no prototype was pictured; if that weren’t bad enough, there were at least nine items to rank. Most people can’t reliably rank more than about five items, especially for an object they cannot visualize.

We also don’t know how the toy manufacturer selected his sample. My guess is that he sent the survey to everyone whose business card he had collected, which means most of the people he surveyed were the wrong people. Beyond producing unusable results, another danger of these online survey tools is that people are bombarded with surveys so frequently that they stop participating in surveys altogether. Imagine receiving five or more of these surveys in less than two weeks. How much time are you willing to give to answering them? And when a truly legitimate survey comes along, how likely are you to participate?

I think it’s great that most companies now have the ability to conduct surveys on the cheap. However, the savings can be greatly offset by the uselessness of the results if the survey is poorly designed or sent to the wrong sample. There is nothing wrong with reading up on how to do a survey and then executing it yourself, as long as the problem is well defined, the relevant population is identified, and the sampling, execution, and analysis plans are in place. “Free” surveying isn’t a bargain if it costs you money and time in rework, or in faulty actions taken based on your findings.

Do you have trouble deciding whether you need to do a survey? Do you struggle to pin down what you’re trying to learn from a survey, how many people to survey, which people to survey, or what questions to ask? Let Analysights help. We have nearly 20 years of survey research experience and a strong background in data analysis. We can help you determine whether a survey is the best approach for your research needs, craft the questions that will get you the information you need, and understand what the findings mean. Feel free to call us at (847) 895-2565.


Beware of “Professional” Survey Respondents!

April 3, 2009

Thanks to the Internet, conducting surveys has never been easier.  Being able to use the Web to conduct marketing research has greatly reduced the cost and time involved and has democratized the process for many companies.

While online surveys have brought simplicity and cost savings, they have also given rise to a dangerous breed of respondent – the “professional” survey-taker.

A “professional” respondent is one who actively seeks out online surveys offering paid incentives – cash, rewards, or some other benefit – for completing the survey.  In fact, many blogs and online articles point people to sites where they can find paid online surveys.

If your company conducts online surveys, “professionals” can render your findings useless.  In order for your survey to provide accurate and useful results, the people surveyed must be representative of the population you are measuring and must be selected randomly (that is, everyone in the population has an equal chance of selection).

“Professionals” subvert the sampling principles of representativeness and randomness simply because they self-select into the survey.  The survey tool does not know whether they belong to the population being measured, nor does it know their probability of selection.  What’s more, online surveys exclude members of the population who lack Internet access.  The result is a survey-bias double-whammy.
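
To make the randomness requirement concrete, here is a minimal sketch in Python, assuming a hypothetical customer file named "customers.csv" with an "email" column: a simple random sample gives every member of the list an equal chance of being invited, whereas self-selected “professionals” bypass that draw entirely.

```python
import csv
import random

def draw_simple_random_sample(path, n, seed=42):
    """Draw n customers from the file, each with an equal chance of selection."""
    with open(path, newline="") as f:
        population = [row["email"] for row in csv.DictReader(f)]
    random.seed(seed)                      # fixed seed so the draw is reproducible
    return random.sample(population, n)    # every member is equally likely

# Hypothetical usage: invite 500 of your own customers rather than
# whoever happens to find the survey link and opt in for the incentive.
invited = draw_simple_random_sample("customers.csv", 500)
```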

In addition, “professionals” may simply go through a survey for the sake of the incentive.  Hence they may speed through it, paying little or no attention to the questions, or they may give untruthful answers.  Now your survey results are both biased and wrong.

Minimizing the Impact of “Professionals”

There are some steps you can take to protect your survey from “professionals,” including:

  • Maintain complete control of your survey distribution.  If possible, use a professional online survey panel company, such as e-Rewards, Greenfield Online, or Harris Interactive.  There are many others, and all maintain tight screening processes for their survey participants and tight controls over distribution of your survey.
  • If an online survey panel is out of your budget, perhaps you can build your own controlled e-mail list (following CAN-SPAM laws, of course).  E-mailing your survey is less prone to bias than posting it on a Web site where anyone can opt in.
  • Have adequate screening criteria in your survey.  If you require respondents to sign in with a passcode and/or ask screening questions at the beginning that terminate the survey for people whose responses indicate they are not part of the population, you can reduce the number of “professionals.”
  • Put “speed bumps” into your survey.  For example, include a dummy question that simply says, “Select the 3rd radio button from the top.”  Put two or three such bumps in your survey; a respondent who answers two or more of them incorrectly is likely a speeder, and the survey can be set to terminate.
  • Ask validation questions.  That is, ask a question one way and then, later in the survey, ask it in another form and see whether the responses are consistent.  If they’re not, the respondent may be a “professional” or a speeder.  (A rough screening sketch follows this list.)
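
As a rough illustration of the last two bullets, here is a minimal sketch, in Python with hypothetical column names ("bump_1", "likes_brand", and so on), of how collected responses might be screened afterward: respondents who miss two or more speed-bump items, or whose paired validation questions disagree, get flagged for removal.  This is not tied to any particular survey tool; it simply shows the logic.

```python
import pandas as pd

# Hypothetical column names and correct speed-bump answers (radio-button positions).
BUMP_ANSWERS = {"bump_1": 3, "bump_2": 2}
VALIDATION_PAIRS = [("likes_brand", "likes_brand_repeat")]

def flag_suspect_respondents(df: pd.DataFrame) -> pd.Series:
    """Return True for respondents who look like speeders or 'professionals'."""
    # Count how many speed-bump questions each respondent missed.
    missed_bumps = sum(
        (df[col] != correct).astype(int) for col, correct in BUMP_ANSWERS.items()
    )
    # Count validation pairs answered inconsistently.
    inconsistent = sum((df[a] != df[b]).astype(int) for a, b in VALIDATION_PAIRS)
    # Flag anyone who missed 2+ bumps or gave any inconsistent validation answers.
    return (missed_bumps >= 2) | (inconsistent >= 1)

# Hypothetical usage, assuming responses were exported to a CSV file:
# responses = pd.read_csv("survey_responses.csv")
# clean = responses[~flag_suspect_respondents(responses)]
```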

The Internet may have made marketing research easier, but it has also made it more susceptible to bias.  The tools for conducting marketing research have become simpler and more user-friendly, but that doesn’t change the principles of statistics and marketing research.  Online surveys, no matter how easily, quickly, or cheaply they can be implemented, will waste time and money if those principles are violated.