A Customer Experience Catch-22 – CX Surveys
When it comes to market research and CX surveys, would you rather ask 100 people 20 questions or 2,000 people a single question each?
Is the end result the same? Or does one approach provide better results?
Historically, researchers have tended to focus on finding a small sample of consumers willing to answer multiple questions. Once ‘chosen’, this group is milked for all it’s worth.
From a research standpoint, this makes sense. It’s just not that easy to get people to talk to you. But as the question list gets longer, the number of people willing to engage shrinks. And here we meet a Customer Experience Catch-22.
The more you ask, the fewer people respond. But the smaller the response rate, the more you want to ask.
A tricky situation indeed.
Demographic Data – does it help?
One answer is to use the demographic data of survey respondents. By comparing this to census or customer databases, researchers believe they can weight the sample of respondents to create something more representative.
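The weighting step described above is often called post-stratification: respondents in under-represented groups are counted more heavily so the sample's mix matches the population's. A minimal sketch, using entirely hypothetical age-band figures:

```python
# Illustrative post-stratification weighting. All numbers are hypothetical.

# Share of each age band in the full population (e.g. from a census).
population = {"18-34": 0.30, "35-54": 0.40, "55+": 0.30}

# Share of each age band among the people who actually answered the survey.
respondents = {"18-34": 0.15, "35-54": 0.45, "55+": 0.40}

# Weight each band so the sample's age mix matches the population's:
# under-represented bands get weights above 1, over-represented below 1.
weights = {band: population[band] / respondents[band] for band in population}

print(weights)  # {'18-34': 2.0, '35-54': 0.888..., '55+': 0.75}
```

Here the youngest band answered at half its population share, so each of its responses counts double. The article's point stands, though: the weights fix the demographic mix, not the attitudes of the kind of person who opts in.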
While this might be true demographically, is it – can it ever be – true attitudinally?
You might show that your sample covers all age groups, income, and geographies – but you’re still left with a group willing to give up 10-20 minutes to enter a prize draw. What unites the group is stronger than what divides them.
Is this really the best way to make strategic decisions for your business?
The Move to Micro CX Surveys
In the end, researchers have to set a limit beyond which they agree not to expand – a one-in-one-out policy on questions. This is where the trend towards micro CX surveys comes in. It is based on the idea that busy people will give up some of their time if you make it simple enough. Think of the kind of one-touch reviews that fuel Uber or food-delivery apps.
The ‘ask as you act’ model is a step in the right direction. But does it go far enough in respecting customers’ time? Why not take it to its logical endpoint: only ever ask one question. If survey respondents know they are only going to be asked a single question, the proportion willing to answer will increase – and the overall number of question responses will grow to be far more representative.
One way for market researchers to mitigate the danger of lengthy surveys and the diminishing ‘gene pool’ of respondents is to look for questions whose responses are highly correlated, then choose one question from the list and drop the others.
The problem with using this approach in isolation is that the temptation remains to replace every dropped question with a new question (or two) – it is always easier to think of something else you want to ask than something you are willing to let go.
The Wisdom of the Crowd – the Power of Using Customer Groups
One question per person doesn’t mean the same question for everyone. The original list of 10-15 questions can still be asked in parallel (in rotation); the only difference is that the workload is spread across more individuals.
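The rotation described above amounts to round-robin assignment: each respondent sees exactly one question, and the list cycles so every question is asked equally often. A minimal sketch with made-up question text:

```python
# Sketch of one-question-per-person rotation. Question wording is hypothetical.

questions = ["How was the service?", "Was the store clean?", "Were prices fair?"]

def question_for(respondent_index: int) -> str:
    # Round-robin: respondent 0 gets question 0, respondent 3 cycles back to it.
    return questions[respondent_index % len(questions)]

assignments = [question_for(i) for i in range(6)]
print(assignments)  # each question asked exactly twice across 6 respondents
```

With 2,000 respondents and 10 questions, each question still gathers around 200 answers – the spread-the-workload effect the article is pointing at.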
There are some potential downsides to this one-question approach. The ability to target specific follow-up questions based on an initial response is lost, at least in that moment and to that individual, as is the understanding of the relationship between responses to two different questions at the individual level.
There are, however, ways to address these points to at least some degree by finding a larger entity than the individual around which to collect responses.
One example might be to ask those follow-up questions of a similar-looking cohort of customers who bought a similar set of products or visited the same store. Likewise, comparing the responses to two questions across a cohort of customers can reveal close relationships or key drivers.
There are pros and cons to both approaches, but at TruRating we are passionate about giving every customer a voice. We believe the only way to do this is to limit each interaction to one simple, easy question. As attention spans drop and the requests made of customers grow, we believe that to have any chance of gathering enough responses to make truly representative decisions, one question really is enough.
For more from the blog, why not check out the key takeaways from our latest insight report: