How we got our NPS mojo back

We had set out to change how companies and their employees manage their expenses at Fyle and were very focused on providing a great user experience. A grave mistake we made was that we didn't systematically measure what users thought of the product. We did a proper Net Promoter Score (NPS) survey in June 2018 and were dismayed at the results - it was +31.

In case you aren't familiar, an NPS survey asks users of a product or service a very simple question: on a 0-10 scale, how likely is it that you would recommend the product/service to a friend or colleague? Respondents are grouped into three buckets:

  • Promoters (score 9-10) are your champions who will keep buying and refer others, fueling growth.
  • Passives (score 7-8) are satisfied but unenthusiastic users who are vulnerable to competitive offerings.
  • Detractors (score 0-6) are unhappy users who can damage your brand through negative word-of-mouth.

To get your Net Promoter Score, you add 1 for every promoter and subtract 1 for every detractor; passives count for 0. You then divide the sum by the number of respondents and multiply by 100. If every respondent hates your product, you end up with -100, and if every user loves you, you end up with +100.
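The calculation above is easy to sketch in code. Here's a minimal example (function name and sample data are mine, not from any NPS library) that reproduces our June 2018 result from a hypothetical set of 100 responses:

```python
def nps(scores):
    """Compute Net Promoter Score from a list of 0-10 ratings."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    # Passives (7-8) contribute nothing to the numerator,
    # but they do count in the denominator.
    return round(100 * (promoters - detractors) / len(scores))

# 57 promoters, 17 passives, 26 detractors out of 100 responses
scores = [10] * 57 + [7] * 17 + [3] * 26
print(nps(scores))  # 57% - 26% = +31
```

Note that passives still matter: adding more passive respondents shrinks both percentages and pulls the score toward zero.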

Different industries have different benchmark values for NPS - see this blog post by Delighted. For software, the mean score was +40 and top-class products were at +55. As you can imagine, we weren't thrilled about our score of +31. It was a kick in the cojones - pardon my Spanish.

We rolled up our sleeves and delved deeper into the numbers. The open rate for our NPS mailer via Wootric was 35%, and the response rate to our survey was 4.5%. We had 57% promoters, 17% passives and 26% detractors. We looked through the specific feedback we received from passives and detractors. One area we got dinged quite a bit was our mobile app's usability. We also found that many first-time users had trouble figuring out how to use the product.

The next time around, we changed our subject lines to try to improve the open rates. After all, if someone doesn't open the survey email, they're not going to rate us. We made the subject line more personal and used active voice instead of passive voice. The open rate shot up to 69%, and the response rate to our survey jumped to 9.7%. Some of our product changes had also gone out by this round, and the NPS came in at +38 - a little better, but still quite far from where we wanted to be.

We discussed our NPS in our monthly all-hands meetings and set ourselves a target of +60. In September, we switched to running our NPS survey automatically. We got around an 8% response rate and an NPS of +45. In October, we hit +50. We released a revamped version of the mobile app with massive improvements in usability. In November, we finally hit a score of +62 with 69% promoters, 24% passives and 7% detractors - and I've finally found the courage to write this post :)

If you work on a SaaS product and haven't yet run an NPS survey - I hope this post convinces you to do it as soon as possible. I'm immensely grateful to be working with a team that took on this challenge and delivered amazingly. If you would like to work with such a team, please do check out our openings on AngelList.

Siva Narayanan

I am known to be "the CTO of one, the father of two, and the roasting baba of many."
