Survey Length Can Impact Findings

Last week, I talked about how it might be better to conduct a few short surveys in place of one longer survey. Whether more frequent, shorter surveys are better or even feasible depends largely on your business problem, urgency, budget, and target respondents. But in survey research, shorter is almost always preferable to longer.

With more surveys being conducted online, respondents’ attention spans are short and their patience is in short supply. According to a September 2002 InsightExpress study, about 53% of online survey respondents say they will devote 10 minutes or less to a survey. Dropout rates also tend to rise as surveys get longer: Karen Paterson of Millward Brown found that beyond the 10-minute mark, each additional minute a survey takes lowers the completion rate by 2%.
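
To see what Paterson’s rule of thumb implies, here is a minimal Python sketch. Reading her 2% figure as two percentage points per minute is an interpretation on my part, and the 80% baseline completion rate is a hypothetical figure for illustration; the source gives no baseline.

```python
# Illustrative sketch of Paterson's rule of thumb: beyond 10 minutes,
# each extra minute of survey length costs roughly 2 percentage points
# of completion rate. The 80% baseline is a hypothetical assumption.

def estimated_completion_rate(minutes, baseline=0.80):
    """Estimated completion rate for a survey of the given length in minutes."""
    if minutes <= 10:
        return baseline
    penalty = 0.02 * (minutes - 10)      # 2 points per minute past 10
    return max(baseline - penalty, 0.0)  # a rate can't fall below zero

for length in (5, 10, 15, 20, 30):
    print(f"{length:>2} min -> {estimated_completion_rate(length):.0%}")
```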

Moreover, the number of survey screens (that is, how many times a respondent clicks a “Next” or “Forward” arrow in the Web survey) can greatly fatigue respondents, especially in business-to-business (B2B) research. In the July/August 2000 issue of Quirk’s Marketing Research Review, Bill MacElroy demonstrated that the dropout rate of B2B respondents increases exponentially with the number of survey screens. According to MacElroy, the dropout rate is 7% for an online survey with 10 screens and 9% with 15 screens. But at 30 screens, the dropout rate is 30%, and at 45 screens, a whopping 73%!
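
MacElroy reports only those four data points, but a quick log-linear fit shows what “exponential” growth looks like here. The functional form (dropout = a·e^(b·screens)) is my assumption, not MacElroy’s published model:

```python
import math

# MacElroy's reported B2B dropout rates by number of survey screens.
screens = [10, 15, 30, 45]
dropout = [0.07, 0.09, 0.30, 0.73]

# Log-linear least-squares fit of dropout = a * exp(b * screens).
# The exponential form is an assumption; only the four points above
# come from the source.
n = len(screens)
log_d = [math.log(d) for d in dropout]
mean_s = sum(screens) / n
mean_y = sum(log_d) / n
b = (sum((s - mean_s) * (y - mean_y) for s, y in zip(screens, log_d))
     / sum((s - mean_s) ** 2 for s in screens))
a = math.exp(mean_y - b * mean_s)

for s in (10, 20, 30, 40, 45):
    print(f"{s:>2} screens -> predicted dropout {a * math.exp(b * s):.0%}")
```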

The question that should enter all of our minds, then, is: what impact does the dropout rate have on the integrity and the findings of the survey? Generally, respondents who terminate a survey are lumped together with the non-responders. Non-response error has always been a concern of the most dedicated researchers, but it is quite often ignored in practice. With termination rates growing as surveys move online, however, ignoring non-response error can produce misleading results.
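
To make the risk concrete, consider a toy example: if the people who drop out would have answered differently from the people who finish, the survey’s estimate can be badly off. All three figures below are hypothetical.

```python
# Toy illustration of non-response bias. Suppose 30% of invited
# respondents complete the survey, completers report 25% purchase
# intent, and dropouts (whom we never measure) would have reported
# only 5%. All numbers are hypothetical.
completion_rate   = 0.30
intent_completers = 0.25
intent_dropouts   = 0.05  # unobserved in practice

# True population intent is the weighted average over both groups.
true_intent = (completion_rate * intent_completers
               + (1 - completion_rate) * intent_dropouts)

print(f"Survey's estimate: {intent_completers:.0%}")
print(f"True population  : {true_intent:.0%}")   # 11%
print(f"Overstatement    : {intent_completers - true_intent:.0%}")
```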

At the American Marketing Association’s November 2001 EXPLOR Forum, Karl Irons pointed out that the longer the survey, the more inclined completing respondents were to check the top two boxes on a purchase-intent question. When the survey took 14 minutes or more, nearly half of the respondents who completed it chose the top two boxes, indicating they would likely or definitely buy, compared with just one-quarter of respondents when the survey took less than 6.5 minutes.

In addition, in Issue #11 of Today’s Insights, InsightExpress compared two surveys – a six-minute, 12-question survey and a 21-minute, 23-question survey. The completion rate was 31.4% for the shorter survey but only 11% for the longer one. The demographics of the completing respondents weren’t dramatically different, but the results were: just under 9% of respondents to the shorter survey expressed intent to purchase, compared with almost 25% of those who completed the longer survey. And only 4% of those completing the shorter survey said the product concept appealed to them, versus nearly 14% for the longer one!
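
If you want to check whether a gap like 9% versus 25% could plausibly be noise, a two-proportion z-test is one quick way. The sketch below assumes 300 completes per survey, a made-up sample size, since the source doesn’t report one:

```python
import math

def two_proportion_z(p1, n1, p2, n2):
    """Two-proportion z-statistic for the difference between
    two independent sample proportions."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Intent figures from the InsightExpress comparison; the sample sizes
# of 300 completes per survey are assumptions for illustration.
z = two_proportion_z(0.25, 300, 0.09, 300)
print(f"z = {z:.1f}")  # |z| > 1.96 -> significant at the 5% level
```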

Why is this? First, when a survey is long, the people who stay to complete it likely have some vested interest in its subject. If the survey is about premium chocolate, for example, a chocolate lover might stick it out for the duration, and a chocoholic is more likely than the average respondent to purchase premium chocolate. Second, some respondents don’t want to abandon a survey partway, either because of the incentive offered or because they want to “be polite.” They might instead speed through the survey, simply marking the top boxes. In either case, the researcher ends up with biased results.

So how do we rectify this? First and foremost, if you must field a long survey, tell respondents up front how long it is expected to take – both over dial-up and over high-speed broadband connections. Second, make sure there is an appropriate incentive for participation. Use a progress bar to let respondents know how far along they are. Keep the questions as short, as easy to understand, and as simple as possible. And always test the questionnaire before administering it: have someone else read the questions, paraphrase them, and try to answer them. Of course, if you have the time and money to run a couple of shorter surveys instead, by all means do so.

*************************

Analysights is now on Facebook!

Analysights is now doing the social media thing! If you like Forecast Friday – or any of our other posts – then we want you to “Like” us on Facebook! By liking us on Facebook, you’ll be notified every time a new blog post is published, or whenever other news comes out. Check out our Facebook page!
