Asking your customers to complete a survey is like starting a conversation with them, and you should approach your survey design with that in mind.
When your customers accept an invitation to participate in your survey, they volunteer their time to discuss their needs and expectations with you, and to share what they thought of their experience with your brand.
But just like in any conversation, it's hard for survey respondents to stay engaged if they feel the discussion is dragging on, or if something smells fishy. Over the course of a long survey, survey fatigue can also set in, causing respondents to quit the survey prematurely ("drop out") if they no longer see value in providing their feedback.
So how can you keep your survey respondents engaged and increase your survey response rates, all while ensuring that you still collect customer feedback relevant to your current business objectives?
In this post, we look at 16 useful tips you should consider so you can offer a positive survey experience and increase your survey response rates today:
Make your survey invitation, well, inviting
- Make your survey invitation reflect your brand
- Be transparent as to what you are asking from people
- Use the right type of survey invitation for the right context
- Regularly refresh the survey invitation
Manage the number of screen interactions
- Leverage an auto-advance feature
- Consider the question types you are using
- Don’t ask questions you already know the answer to
- Include a progress bar
Keep your research relevant and to the point
- Program Skip Logic whenever applicable
- Consider conducting multiple, focused surveys
- Regularly review the relevance of your questions
Make your survey easy on the eyes
- Consider differences in Desktop and Mobile survey experiences
- Manage the amount of text in your questions and answers
- Show only one question at a time
Make the survey experience secure and accessible
- Be upfront about how customer feedback is handled
- Leverage an accessibility-friendly survey collection interface
Before we start
- Survey response rate can be defined as the percentage of those who were invited to participate in the survey and completed it (# of completed surveys / # of invitations sent).
- It can be difficult to determine what an ‘ideal’ survey response rate should be, since many different factors, including (but not limited to) your visitors’ level of engagement with your website or brand, can impact it.
- The method you use to invite visitors to participate in your survey can also impact your survey response rates. While this is not examined in detail in this post, you can learn more about this in our whitepaper, "Voice of the Customer Methodologies".
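The definition above boils down to a one-line calculation. Here is a minimal sketch (the numbers used are purely illustrative):

```python
def response_rate(completed: int, invited: int) -> float:
    """Survey response rate: completed surveys / invitations sent, as a %."""
    if invited == 0:
        raise ValueError("invited must be greater than zero")
    return completed / invited * 100

# Illustrative numbers: 180 completed surveys out of 1,200 invitations
print(f"{response_rate(180, 1200):.1f}%")  # 15.0%
```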
Make your survey invitation, well, inviting
Naturally, your survey invitation is the stage of the survey experience where you will see the most significant drop in your survey response rate. This is mainly because people are simply not interested in participating, so they click “No” or ignore the invitation altogether.
However, another factor that can play a role in whether people accept your survey invitation is uncertainty or uneasiness about what is being asked of them, and by whom.
Let's look at four ways you can address this:
Your survey invitation should feel like an extension of your brand. It should reflect the branding of your website, mobile app or whatever channel through which you are sending the invitation.
Not to mention, with increased concerns about information security (more on that later), brands must quell any doubts their customers may have and design an inviting, positive survey invitation experience.
As such, avoid generic, non-branded survey invitation creatives. Instead, customize everything from your logo and font type to your brand's verbiage. A survey invitation that reflects your branding helps reassure potential respondents that you are indeed the one conducting the survey and, in turn, encourages them to share their thoughts about their experience with your brand.
Example of a branded survey invitation (fictional brand used in this example)
Like most things, people want to know what they’re signing up for. As someone who completes many customer satisfaction surveys myself, I want to know what I'm being asked to volunteer my time for, and how much time I should expect it to take.
Perhaps most importantly, I want to know what my feedback will be used for, and who will see it. This is important information especially in the age of the GDPR (General Data Protection Regulation), with people taking extra steps to manage the information they share online.
The text you include in the survey invitation should answer these questions in a clear, concise manner. Avoid generic text like “Let us know what you think” or “Take our survey”. Instead, clarify why you want them to answer your survey, and how much time it will take:
“Would you help us design a better website experience for you by answering this short 2-minute survey?”
“Help us make our website better by completing our short survey.”
“We need your help to improve our shopping cart feature. Do you have 2 minutes to answer a quick survey?”
Not all survey types and invitations are created equal.
What may be a good approach for a certain type of survey may not be appropriate for another. For example, if you are conducting a more general website satisfaction survey, using a layered invitation (like the one below) to engage customers for their feedback at the beginning of their visit (asking them to provide their feedback after their visit) is appropriate. However, using this invitation while a customer is in the middle of the checkout process? Not so much.
Example of a Layered Survey Invitation
Today, there is a plethora of ways you can engage your visitors for their feedback, each with its own advantages and degree of intrusiveness suited to specific contexts. Some contexts call for more visible invitation methods, while others call for more discreet, passive approaches, like a comment card (activated by clicking a "Feedback" button) or a Slider Survey, like the example below.
Example of a Slider Survey
Check out our whitepaper, "Voice of the Customer Methodologies" to learn more about the right approaches for the types of insights you want to get. Choosing the right approach based on your business and research objectives can help ensure you engage your customers for their feedback the right way, and help increase your likelihood of collecting the type of feedback you need.
Like your website or mobile app, it’s always a good idea to give your survey invitation a breath of fresh air once in a while. This is especially true for longitudinal surveys that you are running for months, and that visitors might potentially see on more than one occasion over time.
If a survey respondent sees the same invitation again in the future, they may just choose to disregard it as a reflex (I've been guilty of doing this myself). However, the feedback a customer gives you today is as valuable as the feedback they may provide you in the future.
Refreshing your survey invitation can be useful to attract the eye of those who may have already seen the invitation in the past. It also gives you the chance to update the text on the invitation to match any changes in your research objectives, or even just to try something new.
Manage the number of screen interactions
The number of questions you ask plays a part in the drop-out rate for survey respondents. However, something that is just as important is the number of screen interactions that the respondent is asked to perform to complete the survey.
A screen interaction is any mouse click or scroll required of the respondent to answer a question. A survey that requires 20 screen interactions takes less effort to fill out than one that requires 40.
Here are four ways that can help increase your survey response rates by managing the number of required screen interactions in your survey:
If you know ahead of time how many screen interactions are expected from the respondent to answer a question (such as Single-Select or Set-of-Rating questions), ensure your customer feedback solution allows respondents to advance to the next question whenever they have reached this limit of screen interactions.
Example of a survey with an auto-advance feature enabled
Over the course of a longer survey, these saved screen interactions can add up and can help reduce survey fatigue.
A single-select question requires only one click to move on to the next question and is ideal for qualification and demographic purposes (assuming you have an auto-advance feature enabled).
On the other hand, an open-ended question requires the respondent to type their answer, which can demand a relatively large amount of effort on the respondent’s part (especially if they have a lot to say). That said, qualitative feedback is often a very potent source of insights and helps identify new trends that you would otherwise be unable to spot with close-ended data alone.
Example showing the difference between low and high effort survey question types
A variety of question types can help keep things engaging for your respondents. However, you want to make sure that you manage the number of “high interaction” questions whenever possible.
For example, if you always see the same themes mentioned for one of your open-ended questions, you could consider converting it into a close-ended question and programming these themes as answer choices, with an ‘Other, please specify’ answer choice included to still allow for user-defined responses.
Marketers can already automatically detect certain information about their website visitors or mobile app users. Whether it’s their browser information or whether they performed a certain action during their session, there is useful segmentation data that marketers already know about their visitors and should not need to ask for in their surveys.
A robust customer feedback solution should provide the ability to program hidden (“auto-fill”) questions that automatically pull in this information about your survey respondents. You can also integrate your customer feedback with your web analytics or session replay solution so that you get additional context as to how your visitors interacted with your website or mobile app during their session.
These options save you from asking survey respondents to confirm certain details about their session, and reduce the number of screen interactions they need to complete your survey.
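To illustrate the idea, a hidden “auto-fill” question can be thought of as a simple mapping from session attributes you already track onto survey fields. This is only a sketch; the field names and session structure are hypothetical, not any particular vendor's API:

```python
# Sketch: pre-fill hidden survey fields from data you already have,
# so respondents are never asked for it. All names are illustrative.

def prefill_hidden_fields(session: dict) -> dict:
    """Map known session attributes onto hidden survey fields."""
    return {
        "device_type": session.get("device", "unknown"),
        "browser": session.get("browser", "unknown"),
        "visited_checkout": "checkout" in session.get("pages_viewed", []),
    }

session = {"device": "mobile", "browser": "Safari",
           "pages_viewed": ["home", "product", "checkout"]}
hidden = prefill_hidden_fields(session)
print(hidden["visited_checkout"])  # True
```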
The psychology behind progress bars typically revolves around giving people a sense of how their efforts are bringing them closer to their goal. After reading this post, make sure to check out the short-and-sweet TED Talk from Daniel Engber, aptly titled “How the progress bar keeps you sane”, which gives an interesting perspective on this subject.
In the context of surveys, showing a dynamic progress bar gives respondents a sense that they are advancing in the survey, which can especially come in handy for longer surveys. Some respondents may feel more compelled and engaged to finish a longer survey if they can see the finish line.
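As a small illustration, a dynamic progress bar typically reflects the share of questions answered so far. The sketch below assumes progress is computed over the questions the respondent will actually see (for example, after Skip Logic is applied); the function name is illustrative:

```python
def progress_percent(answered: int, total_visible: int) -> int:
    """Percent of the survey completed, based on the questions this
    respondent will actually see (e.g., after skip logic is applied)."""
    if total_visible == 0:
        return 100
    return round(100 * answered / total_visible)

# 5 questions answered out of 20 visible
print(progress_percent(5, 20))  # 25
```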
Keep your research relevant and to the point
Another effective way to keep your survey respondents engaged and increase survey response rates is to ask only questions that are relevant, and that reflect behavior or answers they have already confirmed. You can do this in the following ways:
By default, your survey respondents will see all the questions you program in your survey. However, programming Skip Logic whenever applicable ensures that respondents will only see certain question modules if they meet the criteria for these questions.
For example, if you manage an automotive website and want feedback about a car configuration feature on your site, you would only want to show car configuration-related questions to those who stated earlier in the survey that they used this feature. For those who did not, Skip Logic prevents them from seeing these questions and being asked for feedback about something they never used.
Programming Skip Logic provides the ability to cover a wide range of topics in your survey while minimizing the number of required screen interactions by keeping questions relevant to the individual respondent.
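Conceptually, Skip Logic attaches a display condition to a question that is evaluated against earlier answers. Here is a minimal, hypothetical sketch of that mechanism; the question IDs and structure are illustrative, not a specific product's syntax:

```python
# Minimal skip-logic sketch: a question may declare a condition on
# earlier answers; questions whose condition fails are skipped.

questions = [
    {"id": "used_configurator", "text": "Did you use our car configurator?"},
    {"id": "configurator_rating",
     "text": "How would you rate the configurator?",
     "show_if": lambda a: a.get("used_configurator") == "yes"},
    {"id": "overall_satisfaction", "text": "How satisfied are you overall?"},
]

def visible_questions(answers: dict) -> list:
    """Return only the questions this respondent should see."""
    return [q for q in questions
            if q.get("show_if", lambda a: True)(answers)]

# A respondent who did not use the configurator skips its rating question.
answers = {"used_configurator": "no"}
print([q["id"] for q in visible_questions(answers)])
# ['used_configurator', 'overall_satisfaction']
```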
Sometimes, you just need to go into more detail on specific features. Depending on how in-depth you go (i.e., make your survey longer), the respondent may potentially reach a point where they no longer wish to continue in the survey and drop out as a result.
This is why an option to consider is running more than one survey, each focusing on specific business objectives, as opposed to fitting all of your research in one comprehensive survey. For example, if you would like to collect feedback about the shopping cart feature of your website, you could invite respondents who have finished purchasing an item on the site to participate in a survey focused on conversion optimization.
Conducting multiple surveys provides the opportunity to obtain insights on key experiences throughout the customer journey, which is essential to better inform your Customer Experience Management efforts. Note, though, that a strong customer feedback solution should provide the tools to ensure visitors do not receive invitations for more than one survey during a given session, so as not to negatively impact their website experience.
Business and research objectives change, and so should your survey to reflect these changes.
There is no point asking respondents to expend effort providing feedback that is not relevant to your current priorities and that you are not planning to reference in your decisions. There will always be evergreen questions you need to include in your survey (e.g., KPIs you are trending over time), but you should regularly perform spring cleaning on your research to make sure all the questions are still relevant to your current needs.
This will help keep your research from ballooning to the point where it results in unnecessary screen interactions and potential survey fatigue.
Make your survey easy on the eyes
Your survey shouldn’t feel like an exam. Offering a visually pleasing and straightforward survey experience is vital to keeping your survey respondents engaged and preventing them from dropping out before they reach that "Thank You" page.
Here are some design-related aspects that can make it easier for your respondents to process your survey.
It is imperative that you consider the differences in how respondents experience your survey based on the device they are using.
Those answering surveys on a Desktop / Laptop are most likely sitting down (unless they are much more agile than I am), and using larger screens. On the other hand, Mobile respondents are taking the survey on smaller screens. While they may also be sitting down when taking the survey, it is also possible that they are standing or walking while answering this survey. As a result, Desktop respondents may be able to tolerate a higher number of required screen interactions than those on Mobile.
It's important to remember the potential nuances and differences in how questions will look on Desktop and Mobile, whether it's how a wordy question will appear or the number of answer choices you present to the respondent.
So, whether you choose to program skip logic or launch mobile-only versions of your surveys, you should always test drive the survey experience on different devices to better understand just what your respondents will expect, and make any necessary changes accordingly.
Including a lot of text in your questions requires more effort on the part of the respondent to process, which can lead to greater respondent fatigue as they advance through the survey.
Understandably, some questions may require a higher level of detail to ensure respondents fully understand what you are asking. However, aim for brevity whenever possible. An answer list of 5 options is much easier to process than one with 25 options (like the example below):
Example of a survey question that includes too much text
In addition to managing the amount of text you use for your questions and answers, you should also consider a customer feedback solution that only shows one question at a time to further help make it a visually-pleasing survey experience for your respondents.
Having to answer one question at a time is easier to process than seeing a long list of questions on a single page. Plus, it makes it easier to program tactical Skip Logic, and it can be another way to increase your survey response rates.
Make the survey experience secure and accessible
We’ve looked at some of the more tangible ways you can ensure a simple, inviting survey experience that guides respondents to the end of your surveys. However, you should also have measures in place that tackle two key intangible aspects of the survey experience, both of which can impact your overall survey response rates: security and accessibility.
Data security has never been a hotter topic. People are more careful than ever about the information they share online, and with whom they share it.
Customer feedback is no different. If you don’t take the necessary measures to ensure a secure survey experience, and reassure potential survey respondents of how their feedback will be handled, then you could risk losing potential survey respondents.
Anyone should be able to interact with your website or mobile app the way they want. The same is true for your surveys.
To truly get a better understanding of how all your customers perceive their experiences with your brand, you must ensure that your surveys can easily be accessed and completed by all of your customers. This includes ensuring that your survey collection interface is accessible for those with disabilities that may potentially interfere with their ability to have a positive survey experience.
Ensuring that your customer feedback solution is at least WCAG 2.0 Level AA compliant, and offers a survey interface that is compatible with assistive technologies (AT) such as screen readers, is critical to offer everyone the ability to provide feedback about their experiences with your brand.
Designing a Voice of the Customer project is a science, and there is no magic formula for creating the perfect survey.
Every website is different, and every company has its own set of business objectives. It is just as essential to keep in mind that each survey respondent is unique. While it may be easy to determine the type of feedback you want to collect, the tricky part is finding the balance between what works for you and what works for your customers, so that they stay engaged and you maintain a good survey response rate.
While everyone is unique, it is hard to deny that people prefer to complete something that is easy and requires minimal effort. So when figuring out how to increase survey response rates, keep the respondents’ experience in mind: keep everything engaging and easy, while ensuring that your research still reflects your current business needs.
VoC programs, especially ones that include surveys for which you are trying to get a representative view of your customers, rely on healthy survey response rates. With strong response rates fuelling valuable insights from your survey efforts, you can better inform your Customer Experience Management (CEM) strategies.
This post was originally published on November 20, 2015
Image source: Unsplash