Surveys are one of the most important parts of your customer experience strategy. But the hardest part of running surveys is arguably writing questions that customers want to answer, and that result in a dataset you can actually pull insights from.
Everyone thinks they are good at writing survey questions.
But the truth is that collecting useful responses is harder than it seems.
TL;DR:
- Don’t use surveys when you can get the data from other tools
- Use simple, sharp, and clear language
- Don’t use double-barrel questions
- Keep survey questions to a minimum
- Don’t use leading questions
- Offer incentives
- Target your surveys to the right audience
- Let people know how many questions there are and how long it will take
- Use the same scales in all questions
Let’s dive right in!
Don’t use surveys when you can get the data from other tools
Avoid asking questions that you can get the answer to using other tools. If you want to learn how your customers use your website or product… use a tool like Clarity where you get session recordings and can uncover the answer yourself.
Surveys are better used to find out what your users feel when using your product or if they feel something is missing.
“Is the search bar valuable to you?” is a question that caters to your (or someone in your organization’s) biases and personal beliefs.
You can easily see if the search bar is used by using free analytics tools. And survey questions like these usually end up in a survey after a discussion between colleagues or teams with different opinions. Don’t let your customers be the mediator in your internal politics.
If you are looking for improvement suggestions, a better question to ask customers is:
“If you had a magic wand that could make any change to [Website/product], what would it be?”.
This type of question gives the participant complete freedom to answer based on their current experience with your website or product. It also allows them to draw on their experience with similar products or websites.
Use simple, sharp, and clear language
Don’t use overly complex words, confusing language, or sentences that can be misinterpreted.
Example of a bad survey question: “Imagine you have just come home from work after picking up your children from preschool, and in the middle of cooking dinner, you receive a 25% off SMS discount. How likely are you to make a purchase within 7 days?”
The complexity of this question is exaggerated for the purpose of this article, but it’s not that far off from how some organizations write their questions.
If all you wanted was an answer to whether or not your customers are likely to purchase something when they receive an SMS promotion, you should use a simple question like this:
“On a scale of 1-5, how likely are you to make a purchase from an SMS promotion?”
Cut unnecessary qualifiers, don’t ask participants to imagine a scenario, and get straight to the point.
Don’t use double-barrel questions
Double-barrel questions ask a participant to provide one answer to two (topically related) questions.
Example of a double-barrel question:
“How would you rate our website and checkout experience?”
The insights from a question like this would be invalid, because you have no idea whether participants rated your website or your checkout experience.
With a double-barrel question like this, participants will most likely answer based on their best or worst experience with one or the other. Or worse, with a made-up “average” of the full experience.
Another thing to consider: when you present your results and someone asks, “So, based on your results… what are the next steps?”, you won’t have a clear answer.
A question like this should be split into multiple questions. Remember: one question at a time, each with a clear answer.
Keep survey questions to a minimum
Never ask questions just to hit a certain question count. There is no minimum or maximum number of questions you need in order to get good results.
Don’t use leading or loaded questions
You don’t want your customers to be swayed to respond based on your opinions. Adding words like “awesome” when asking participants to rate their experience with support, or using a word like “fast” when asking participants to rate your delivery times will make them respond in a certain way.
The last thing you want when running a survey is to persuade your participants to respond in favor of how you want them to respond.
Here are a few examples:
❌ Leading question: “How was your experience with our awesome support team?”
✅ Neutral language: “Rate your experience with our support team”
❌ Leading question: “Rate how fast our delivery times are on a scale of 1-5”
✅ Neutral language: “On a scale of 1-5, rate the delivery time for your last purchase”
❌ Leading question: “How much did you enjoy your shopping experience?”
✅ Neutral language: “Rate your shopping experience”
Offer incentives
Most people have nothing to gain from helping you improve your business. Sorry, but it’s the truth.
If you offer an incentive on the other hand… Your survey suddenly becomes a lot more interesting.
But keep in mind that the incentive should be attractive enough for your target audience to want to participate, but not so attractive that you draw in people who only want the reward.
A classic example of a “bad” incentive is giving away an iPhone to 1 or more randomly selected participants.
It’s better to use a 10% or 20% discount that participants receive in your store.
Target your surveys to the right audience
This is a mistake we see a lot of organizations make. Especially the large ones.
Most survey requests should not be sent to your full list. Targeting is key to high-quality responses, and even more so if you are using website surveys to evaluate your shopping experience.
If you are using a solution like Triggerbee surveys you can use website behavior, CRM data, location and 37 more criteria to target your surveys to individual customers visiting your website.
Below you’ll find some common surveys and their optimal audience.
- Micro surveys. Targeting: Visitors who interact with a specific feature on your website or perform a specific action.
- NPS surveys. Targeting: Customers or end-users. NPS surveys measure loyalty and your relationship with your customers. Make sure you target existing customers or end-users rather than anonymous visitors.
- CSAT surveys. Targeting: Immediately after purchase. CSAT surveys are great for measuring how your customers experience your website, and are commonly used as a quick post-purchase survey on the thank-you page.
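For NPS specifically, the score itself follows a standard formula: the percentage of promoters (ratings 9–10) minus the percentage of detractors (ratings 0–6) on a 0–10 scale. A minimal sketch in Python (the `nps` function name is ours):

```python
def nps(scores):
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6).

    Scores of 7-8 are passives: they count toward the total but
    neither add to nor subtract from the score.
    """
    total = len(scores)
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / total)

responses = [10, 9, 9, 8, 7, 6, 3, 10]
print(nps(responses))  # 4 promoters, 2 detractors, 8 total -> 25
```

Note that the result ranges from −100 (all detractors) to +100 (all promoters), which is why an NPS of 25 from the sample above is already considered a positive signal.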
Let people know how many questions there are and how long the survey will take to complete
It’s good practice to be upfront with how many questions are in your survey along with how long it takes to go through them.
This lets users know what to expect and can increase your response rate.
Use the same scales in all questions
If you use rating- or scale-based survey questions, you should aim to use the same scale for all rating questions.
It’s totally OK to mix different types of questions as long as the response scale is consistent across all of them.
For example, don’t use a 1-5 scale question immediately after a 1-7 or 1-10 scale question.
Using the same scale across all similar questions also lets you include more questions in your survey. The human brain loves what it recognizes, and the effort required to rate a statement goes down the more you rate on the same scale.
Basically, rating two statements on different scales takes more cognitive effort than rating three or four statements on the same scale.