One of the big issues when running a survey is a low response rate. The internet has made creating a survey and collecting feedback easier than at any other time in history. The result is a massive proliferation of surveys and forms, which means you’re competing for the increasingly limited time and attention of your desired respondents.
In this post, I look at three elements that affect response rates: time, complexity and trust, and how you can use each of them to help maximise your survey response rates.
There are a number of factors that can increase or decrease the likelihood of a person completing a survey. When you ask someone to complete a survey you are asking for something from them, and the respondent will subconsciously undertake a cost/benefit analysis: “what will it cost me to complete this survey, and what are the potential benefits?” One of the biggest costs to the respondent is time, and I am sure many of you have heard “I’m sorry, I don’t have the time” when conducting field surveys, and may even have said it yourself.
The same applies to online surveys, telephone surveys and paper surveys. You are asking respondents for some of their time, during which they could be doing other things like preparing dinner, reading a book or watching TV. To reduce the time cost to the respondent, best practice is to reduce the length of the survey. There is a lot of debate about how long a survey should be; however, the GreenBook Research Industry Trends Report 2017 suggests that the ideal length for a survey is between 6 and 10 minutes. The report found that 18% of people preferred to spend no more than 5 minutes on a survey, 27.6% were prepared to spend up to 15 minutes, while only 2.8% of people would be prepared to spend over 20 minutes on a survey. Another study found that completion rates started dropping by between 5 and 20% once a survey took more than 7 or 8 minutes to complete. Best practice is to tell respondents at the start of the survey roughly how long it will take. As a rough rule of thumb, a respondent will be able to answer about ten questions in that 6 to 10 minute window.
Often surveys deal with complex issues. We have just completed several surveys for Councils on their Long Term Plans, which ranged from basic issues like whether another dog exercise area is needed through to complex issues like addressing coastal erosion, planning for climate change and strategic Council investments. A respondent may feel that they are not the “right person” to answer the survey, or that they “don’t know enough” to provide a response, and either not attempt the survey or abandon it, resulting in a partial response. It can be helpful to explain how the survey results will be used and how they will benefit the respondent, others and the community.
Survey design can remove a lot of this complexity. George Orwell’s rules for writing apply here, especially Rules 2, 3 and 5:
Rule 2. Never use a long word where a short one will do
Rule 3. If it is possible to cut a word out, always cut it out
Rule 5. Never use a foreign phrase, a scientific word, or a jargon word if you can think of an everyday English equivalent
The style of questions we ask can also add to the complexity of the survey by increasing cognitive burden. Cognitive burden is the amount of thought a person must apply when answering a survey. The more complex the question or the answer, the higher the cognitive burden; this can result in survey fatigue and the respondent not completing the survey.
Think of how much thought one has to put into an open text question versus a checkbox question, then multiply that by ten questions: which survey are you more likely to respond to and complete? Questions with checkboxes, drop-down menus, Likert scales and multichoice options not only reduce the complexity and cognitive burden for the respondent but also shorten the survey. By using survey logic you can also use these question types to trigger follow-up questions, but note that open-ended questions such as the common “why did you choose this option” add complexity and burden.
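To make the branching idea concrete, here is a minimal sketch of how survey logic might route respondents. The question text, options and function name are purely illustrative assumptions, not taken from any particular survey tool; the point is that only some answers trigger the higher-burden open-ended follow-up.

```python
# Illustrative sketch of survey branching logic (hypothetical example,
# not from any real survey platform).

def follow_up_for(answer: str):
    """Return an open-ended follow-up question for a checkbox answer,
    or None if the respondent can skip straight to the next question."""
    # Only dissatisfied respondents see the higher-burden open question.
    follow_ups = {
        "Dissatisfied": "What could we improve?",
        "Very dissatisfied": "What could we improve?",
    }
    return follow_ups.get(answer)

print(follow_up_for("Satisfied"))       # None: no follow-up, survey stays short
print(follow_up_for("Dissatisfied"))    # the open-ended follow-up question
```

Most survey platforms expose this kind of conditional display through their own configuration rather than code, but the principle is the same: keep the default path short, and reserve open-ended questions for the respondents whose answers make them relevant.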
There is a growing distrust of surveys, polls and their results, driven partly by the sheer growth in the use of surveys and partly by the rise of rogue surveys.
One way of improving public trust in your survey, and so increasing response rates, is to explain who you are, why you are collecting the data and what it will be used for. A number of organisations now contract out their research, including consultation and engagement surveys. Trust can be established by identifying both the company undertaking the survey and who the research is being done on behalf of, e.g. the local Council. It is also important to be prepared for questions from respondents; common questions include:
- Why are you asking these questions?
- How will the information collected be used?
- Are my responses anonymous?
- Will my response make a difference? I am just one person.
If you are able to answer these common questions you will increase the likelihood that the respondent will trust you, which has positive flow-on effects. If respondents trust the researcher, they are not only more likely to respond to the survey but also to consider their answers carefully, improving the quality of the data, instead of rushing through just to finish the survey or get rid of the researcher.
Being able to tell the respondent how the information collected will be used is very important, especially in today’s environment with some telemarketing companies buying and selling personal information. The collection and use of personal information is covered by the Privacy Act 1993, and while we haven’t had a data breach on the scale of the recent Cambridge Analytica scandal, New Zealand Post was criticised by the Privacy Commissioner in 2011 for selling information from a survey conducted in 2009 to marketing companies in a “systematic and large-scale breach of privacy principles.”
By building trust with respondents you also increase the likelihood that they will respond to future surveys.