The Good, Bad and Ugly of Customer Satisfaction Surveys

January 17, 2012
Customer Satisfaction

Customer Satisfaction Surveys. Nearly every business uses them. They have to, in some form or fashion, ascertain how happy (angry or apathetic) their customers are with their products. But are they always beneficial? I am going to tackle this topic from the customer's point of view, not the brand's. It's all a part of my guerrilla customer service research.

I am not knocking customer satisfaction surveys at all. What I am knocking is the process by which many surveys are deployed. I know businesses want and need to know that their customers are Very Satisfied vs. Satisfied or (Yikes!) Very Unsatisfied; that their customers are Very Likely to Recommend; that their customers are Promoters (9 or 10), rather than Passives (7 or 8) or Detractors (0-6). However, I believe that the traditional means of getting that information might be becoming antiquated.
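Since the post leans on those NPS cut-offs, here is a quick sketch of how the score itself is computed: the percentage of Promoters minus the percentage of Detractors, with Passives counted in the total but not in either group. (The function name and sample ratings here are mine, for illustration only.)

```python
def nps(ratings):
    """Net Promoter Score on a 0-10 'likelihood to recommend' scale:
    % promoters (9-10) minus % detractors (0-6)."""
    if not ratings:
        raise ValueError("no responses")
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return 100.0 * (promoters - detractors) / len(ratings)

# 5 promoters, 3 passives, 2 detractors out of 10 responses:
# (50% - 20%) gives an NPS of 30
print(nps([10, 9, 9, 10, 9, 7, 8, 7, 5, 3]))  # 30.0
```

Note that a survey full of Passives scores zero, which is one reason brands obsess over nudging 7s and 8s up to 9s.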

Paper, phone calls, email, web, IVR… any market research firm worth its salt will deploy these techniques on behalf of its clients. My firm does it. But the times they are a-changin'. People are busy. Why should I take time out of my day to tell you about you, just because you asked me to on your receipt? What are you going to do about my comments if they are negative? And frankly, my chances of winning $1000, $500, a gift card or an iPad are slim to none.

So, I’ve decided to investigate the surveys that we customers are asked to complete.  My biggest beefs so far are as follows:

  • Length of time it takes to complete.  Are you kidding?  If your survey is 50+ questions, the average Jane on the street isn’t going to finish.  I would love to see drop rates!
  • Making every question an NPS (Net Promoter Score) type question.
  • Asking me the same question twice using two different scales (like Highly Unsatisfied-Very Satisfied and then 0-10)
  • Asking me questions about everything, as opposed to trying to pinpoint a specific issue that I had. (By the way, customers don't remember very much about who did what. They remember satisfaction. If you want to know who did what, use a mystery shopping program.)
  • Showing a progress bar that barely moves (2 pages in and I am 8% complete?)
  • Asking questions that populate new ones. (Who wants to bet that people change their answers just to make those go away? Not to mention it makes that progress bar move even more slowly.)
  • Non-responsiveness.  Why on earth are you asking me about how happy I am with you if I tell you that I am unhappy and you don’t do anything about it?

In my other life, I have a client that performs mobile surveys. They push a text message to every customer that goes through a transaction in the store. The survey is short and sweet (4 questions). Every negative response gets a follow-up, because District and Regional managers are instantly notified of dissatisfied customers. It's not a perfect system, and we do see drop-off (a 24% response rate on the first question falls to 15% by the 4th). But the feedback we get from unhappy customers who knew that their issue was listened to and acted upon is amazing.

So, with that, I will share with you my Walgreens survey. What do you think they hit on and missed? Are there opportunities for improvement by Walgreens to create a better survey? In the very near future, I am going to discuss some alternatives that are changing the landscape of customer satisfaction surveys, as well as offer some suggestions. I just had to rant a little first.

The Service Witch roams around with hidden cameras, and captures the good, bad and ugly of customer service. Then she discusses it. She also has minions out there doing the same thing. All over. You never know what they will catch. (@theservicewitch, @measurecp)

27 replies
  1. Edward04 says:

    Great post, Kimberly. I think that with quantitative surveys – Customer Sat is a good example area – we often prescribe what we would like respondents to talk about (because we have to fulfil objectives), whereas in fact their "hot buttons" may well be elsewhere. In an ideal world, asking folk what they would like to get off their chests or focus on might be a good way to start any survey on any topic – in workshops that's a technique that encourages collaboration and engagement. Just a thought ;) Fascinating journey that MR is undergoing – it's an exciting time to be in CI.

    Reply
    • ServiceWitch says:

      Edward, I agree. It is a fascinating time to be in mrx. I think that the next couple of years will see an evolution or revolution (not sure which); I think it will be driven by the consumer rather than by the brands.

      Reply
  2. Alex Engelman says:

    You raise important points about the lack of user-friendliness in many surveys. Perhaps customer satisfaction surveys could benefit from taking a lesson out of the brand-attribute-association playbook. Since even the briefest and most convenient method of survey-taking (the mobile survey mentioned in your post) experiences significant drop rates, we can assume that a large percentage of survey takers are not motivated or intellectually engaged enough to provide meaningful feedback. Solution: provide customers with a brief attribute association survey that measures the overall brand experience.

    For example, the survey could ask: how much would you associate the following 4 attributes with Walgreens after your visit? The 4 attributes should be ones that have the highest impact on Walgreens' NPS: speed of service, product availability, good value, comfortable atmosphere. This brief survey could end with an open-ended question asking 'What, if anything, could we do to make your experience at Walgreens more enjoyable?'

    If the customer had a positive brand experience, they are more likely to recommend Walgreens to friends. If Walgreens consistently sees a low score on one of the questions, they know where to focus their improvements efforts. Drop rates would decrease given the brevity of the survey, while usefulness of information for Walgreens would increase because they would have a better understanding of the lasting impressions they are leaving with their customers.

    Thoughts, comments, suggestions? Please discuss.

    Reply
    • ServiceWitch says:

      I think that crowdsourcing customer satisfaction is going to play a larger and larger role. I believe that brands can engage customers for 1-2 questions far more easily than they can engage them for 5+ questions. Crowdsourcing will bring higher response rates by having far more people complete smaller segments. My two cents, but I think that we will continue to see a decline in customers spending large amounts of time responding to lengthy surveys. Brands that need in-depth insights may start to look at the crowdsourcing model as a way to get all of the answers needed by asking more and more people fewer and fewer questions.

      Panels will continue to succeed because panelists are compensated or incented. However, panelists themselves are essentially “professional” survey takers. I have spoken to many panelists who actually state that they have taken so many surveys that they know how to answer questions in order to get to the incentive at the end of the “survey rainbow.” Which is problematic in and of itself. I think I see a new topic to write about.

      Reply
  3. Scale of Ten says:

    Kimberly, on a scale of ten, how well is that survey designed? (giggle)

    Reply
  4. Eric Welch says:

    Interesting post. My criticism of many surveys is that their motive seems less to improve what they are doing than to collect data so they can brag about how great they are, and presumably have data to prove it. The way questions are posed often prevents anyone from providing a meaningful reply. Before even thinking about creating a survey, the company should really ask itself what it wants to learn from the survey, and then design it. This may require the use of many open-ended questions, a rarity on the surveys I've seen. I used to do a lot of YouGov surveys but finally tired of their repetitive nature and questions that seemed totally inconsequential to me.

    Reply
    • ServiceWitch says:

      Eric, agreed. Conducting surveys for the sake of surveys, or to take back to a board of directors as proof of success, isn't the right model. Conducting surveys and doing something about the negative is. Surveys should be brief. They should be a change agent for the brand. And now, are surveys even as consequential when people are getting better responses from brands by going through social networks (for no incentive or return on time) than by filling out the surveys?

      Reply
  5. David Lewenz says:

    The ability to interact with consumers should be based on what the consumer's end game was when they purchased a product. That means if you spent a night at the Hilton Hotel: were you happy with your purchase? In five questions, that is it! Do not focus on the inner workings of the hotel; let the client make comments in question five on what they liked or did not like. Time is money, and no one can afford to waste it in today's environment. A professional consumer will not waste more than a minute on an online survey; expecting more is just crazy.

    Reply
  6. Dan says:

    Where is the credit to Reichheld given on this post? Your ideas are ripped, word for word, from his Harvard Business Review article dated December 2003: "The One Number You Need to Grow."

    plagiarist

    Reply
    • ServiceWitch says:

      In the spirit of transparency, I decided to post your accusation. However, I hate to tell you, but I haven't read that article. After Googling it, and coming upon the Wikipedia article that is referenced in the post, I see it relates to NPS. In market research, everyone pretty much knows what NPS is. The thoughts, sentiments, opinions and critiques are my own, for better or worse.

      Reply
  7. Ari says:

    Hi,

    Great post on why surveys have become so bloated and awful. We created SquidCube.com for that particular reason: to deliver instant, meaningful feedback.

    We hope we can lead the charge to solve this problem. I have no idea why companies seem at home with the radio button when it is so awful.

    Oh well. Great blog!

    Ari

    Reply
  8. Jacquie says:

    Another thing to think about — How many survey takers just select “Very Satisfied” (or unsatisfied), etc. for every response without even reading the questions thoroughly just to quickly get to the “10% off Your Next $25+ Purchase!” coupon at the end? I know I can be counted among them. That makes for many skewed and inaccurate survey results.

    Reply

Trackbacks & Pingbacks

  1. [...] fills out customer satisfaction surveys out of professional interest. She recently wrote a screed on her blog, Service Witch, about the excessive length and lack of focus in most online [...]

  2. [...] blog’s list of survey sins included my pet peeve “Asking me questions about everything, as opposed to [...]

  3. [...] The Good, Bad and Ugly of Customer Satisfaction Surveys (servicewitch.com) [...]
