How we reduced our cancellation rate by 87.5% (reemer.com)
331 points by kareemm on June 17, 2012 | 70 comments



Seriously, one of the best and most actionable articles you'll read this week. (n.b. This sort of thing prints money for companies of pretty much all sizes. Well, after you've got enough customers to worry about cancellations.)

I'll probably write something about this eventually. There are a lot of generalizable tactics which repeatably work well. (Email engagement is probably the highest bang for the buck, considering that you can implement it in about an afternoon and, coming from the starting point "We send no email", it will virtually immediately produce visible results.)


On the topic of emails and account cancellations: When I started sending out "Your Tarsnap account will be deleted soon" emails, the account attrition rate dropped by 50%. The change wasn't from people who didn't realize their account was going to be deleted -- they had received two emails already, 1 week and 2 weeks earlier -- but getting that last prompt seems to have shifted people's default action from "do nothing" to "pay some money to make the account stay around".

Some time later, I cut Tarsnap's account attrition rate by another ~50% by adding a line to the "account will be deleted soon" email: "If you've decided to stop using Tarsnap, I'd love to know why." This wasn't deliberate -- I added it for the simple reason that I really do want to get that information -- but it seems to be causing people to stop and say "hmm, I can't think of any good reason to not use Tarsnap, so maybe I should keep using it after all".


Regarding that last bit:

I wonder if they feel a bit of social pressure to stay. Basically when you say "I'd love to know why" all of a sudden there is an actual person who would be clearly harmed by their decision to leave.


Would like to see some A/B testing with "I'd love to know why" vs. "We'd love to know why."


Fortunately, the number of people who let their Tarsnap accounts expire is small enough that unless there was a huge signal it would take a very long time to get a statistically significant result.


Thanks for the kind words, Patrick.

Your writing and approach (viz. "engineer who thinks like a marketer") has changed how I look at what I do from "write code" to "scalably reach profitable customers so they'll pay me for a technological solution to a painful problem."

And agree re: email engagement!


Thanks, glad to know I've helped someone.


Thanks for writing. Now, how hard was it to ask the gyms for their customers' email addresses? I would assume the instant answer would be "sorry, no - privacy policy", but since you pulled this off, I'd love to know how!

Thanks.


As someone who goes to a CrossFit gym, I'm guessing privacy is the least of their concerns. Most are focused on user retention and renewal rates. Sadly, many of these gyms get sucked into Groupon cycles.


I never understood why CrossFit gyms were so much more expensive than mass market gyms -- the mass market gyms had more equipment, more expensive equipment, and larger, more expensive spaces. I also know that the big gyms depend on people not using their memberships for a large portion of their profits, but that couldn't be everything, could it?

I understand a premium, but a mass market gym could be 40 to 60 dollars, and a CrossFit gym was 140 to 200 dollars.


Crossfit gyms are very labour-intensive compared to normal gyms. You're getting something that's close to training with a personal trainer every time you show up.


As a CrossFitter, I can tell you that I compared the price to personal training. I feel like I get the results of a personal trainer + a community feeling -- and it costs $7/session instead of $30 or more (since I go 5 days/week). After a year, I feel like the results justified this decision.


Yes, as the owner of a CrossFit gym, I can verify that the trainer pay is the main difference in costs and the reason why CrossFit is more expensive.


CrossFit gyms probably have more people who actually show up, and they may also have license fees to pay.


As the owner of a CrossFit gym, I can also verify this. Our business model is built around getting people to show up to the classes, and in any given week we have around 90% of members showing up. In one of the larger gyms (24hr, Lifetime, Gold's, etc.), you're looking at closer to 10%. The main difference in costs is the trainers though, since every class is closely coached and led by experienced instructors.


CrossFit-branded ones pay a license fee of several thousand dollars a year (3k, if I recall) to the CrossFit brand owner.


We just asked, and gave them the stats as to why ("Gyms who use this feature have X% more athletes claim").

Worst they could do was say "no".


Great analysis, and I really want to believe this, but I am a sceptic, and this:

Interested in learning how a cohort analysis can help your business grow? Get in touch – I work with select clients to help identify growth and retention opportunities, and build features to realize those opportunities.

Kind of killed it for me. Now I am not sure whether I should trust the results.


I agree, felt the same way when I hit that at the end.

In his defence though, this article says:

1) What the problem was.

2) What he did.

3) What he found out.

4) What actions he took as a result that led to the improvement.

That's a lot more than you get from other conversion experts I've seen (usually the write-ups I see stop at (2) or (3) and you're left with the pay-for-more-info).

Seeing exactly what actions were taken for each of the things they found makes me err on the side of, yeah, this was a cool post, not just a sales plug.

It's really good to see this sort of stuff turning up on HN.


Yeah I agree. I probably should just have kept my mouth shut but I was just so disappointed when I saw that last line.

I was in fact trying to go after the ball, not the man.


Do you routinely distrust code written by people who take money for coding, on the grounds that it might have been written with an eye to securing work?


That's a bit dishonest -- the scepticism is directed at the results, not the process itself.

I routinely distrust the gushing claims by paid TV advertising actors about how the Magic Bullet changed their entire life. That hardly implies I think the product is trash, but the alleged results are obviously designed to sucker less discerning consumers.

And the general answer to your question is yes: as far as I'm aware, employers routinely seek references to back up the claims made by potential hires.


I am not sure what that comparison is supposed to prove. Care to elaborate?

I have no doubt kareemm is fantastic at his job. I have simply seen too many cases where a case study is used to create sales. Again, nothing wrong with that. It just kind of makes me sceptical when I see fantastic results combined with a sales message.

That's just me.


Except that we probably don't have many gym owners here :-)


Surely you see the difference between the two cases?


I was just confused. So this is the owner of SocialWOD and a 'cohort analysis consultant'? After reading that I thought it was a guest post or something.


I'm bootstrapping SocialWOD by consulting with tech companies.


At the end of the day my results will be different from yours.

I'd say the best way to see if a cohort analysis can help you is to try it for yourself.

Happy to answer any questions I can for ya - gratis - if you do try it out. Email in profile.


kareemm

I am sorry, because I can't answer this without sounding like a naysayer, and I normally hate that.

It's just that when you have been around on the net as long as many of us have, you realize that if something seems too good to be true, it probably is.

For the record, I think there is nothing wrong with you trying to get work that way; I don't find it wrong or anything.

I just went from "wow" to "oh, he wants to sell me something".

Occupational hazard I guess :)


It's just that when you have been around on the net as long as many of us have, you realize that if something seems too good to be true, it probably is.

Yeah, technology isn't really revolutionising things that much. I mean, someone told me you could send an electronic letter around the world, instantly and FOR FREE! There's gotta be a catch. That's too good to be true.

And someone else told me that there's a phone company that does FREE online voice and video phone service all over the world. No way is that possible.

And then someone else told me about an online encyclopaedia that has waaaay more than my Encyclopaedia Britannica, but I don't believe them because they said it was free as well. Can't happen.

(To quote Bill Gates) Who can afford to do professional work for nothing? What hobbyist can put 3-man years into programming, finding all bugs, documenting his product and distribute for free?

The internet is changing things.


The only things online that are truly free are those where the creator is donating the time or the money. Very few people do that. Everything else is paid either with your money, your time, or your privacy.

That "free" email you talk about? It runs ads. That free online phone call? You're paying more than you should be for another product, so that they can give you this one for "free". That free encyclopedia? The "please donate" ads are ads!

So yeah, there's almost always a catch, but we either don't realize it's there (because it's indirect) or it's subtle enough to not bother us.


>> The only things online that are truly free are those where the creator is donating the time or the money. Very few people do that.

From Firefox to Linux to the hundreds of open source software developers, they are not very few. And their numbers are growing.

>> That free encyclopedia? The "please donate" ads are ads!

Donating is not mandatory. So it is free. It doesn't waste my time or compromise my privacy.


How on earth is that by any metric even tangentially related to what we are talking about here?

I reacted to a post that made great claims and then ended up with trying to sell me something.

The internet is changing things, but humans are often the same.


I agree - there was definitely a bit of deflation at the end when I hit the plug for his services. I'm not saying kareem did anything wrong either, but there's definitely a difference in feel between "this is some cool thing I did that helped me with my job, and I'm sharing it so you can check it out too" and "this is some cool thing I did that is my job, and if you found it cool you can engage my services". It's not that the advice itself becomes suspect in some way; it's more that some subconscious sales resistance kicks in, so you lose the "cool" factor.


I don't mind being downvoted for being wrong. So where am I wrong?


Personally, I didn't mind your initial post about being skeptical — that thought occurred to me too. But then you posted several more times about your skepticism without going to kareem's page to double check your skepticism, which I thought was unfair. On his site he makes it pretty clear what he does and how his consulting work fits into that: http://blog.reemer.com/about

Plus the guy blogs pretty regularly. My personal sense is that someone who writes an article sharing his process and results for free deserves the benefit of the doubt — and at least a few clicks around their website to check it out — before posting in a way that suggests that his integrity may be suspect.


I am not sure I understand.

I never questioned his integrity. In fact, I think I went to great lengths to say that it was not him but that post.


Integrity is more than soundness of moral character; it also refers to someone's honesty.

When you say something like, "Now I am not sure whether I should trust the results"... I think you're implicitly questioning the author's honesty.

Same goes for this: "It's just when you have been around on the net as many of us have for so long you just realize that normally if something is too good to be true it probably is."

Anyway, don't mean to debate you on any of this. You had asked where you were wrong, so I just thought I'd share how your comments came across to this outside observer.


I was commenting on the "I produced these great results. Want to buy some of the same?" which is archetypical in sales.

That has nothing to do with the integrity or honesty of the person, which I have repeatedly praised.

It's the difference between saying "you are X" and saying "what you said is X".


If the average time to cancel is 61 days, and they waited only 2 months since the changes to calculate the new cancellation rates, it stands to reason that the cancellation rate will rise over the next few months, right?

Assuming a normal distribution (probably not that accurate, but it's just a guess), the final cancellation rate would be about double that measured until now.

Am I missing something here?


The key here is that it is a cohort analysis.

So while your analysis is correct and the cancellation rate will likely rise over time, the key thing to check when comparing a pre-change cohort to a post-change cohort is whether the cancellation rates at, say, two weeks post-signup are significantly different. It's obviously ideal if you can run the tests in parallel to reduce the risk of selection bias. However, that is not always feasible, and with a result like an 85%+ improvement it is not really necessary in order to conclude that the control was beaten.
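Something like this, for concreteness - the table name, column names, and cutoff date below are made-up assumptions for illustration (Postgres-flavoured SQL), not anything from the article:

    -- Two-week cancellation rate, pre-change vs post-change signup cohorts.
    -- "customers", "signed_up_at", "cancelled_at", and the cutoff date are
    -- illustrative assumptions.
    SELECT
      CASE WHEN signed_up_at < DATE '2012-04-15'
           THEN 'pre-change' ELSE 'post-change' END AS cohort,
      COUNT(*) AS gyms,
      COUNT(CASE WHEN cancelled_at <= signed_up_at + INTERVAL '14 days'
                 THEN 1 END) AS cancelled_within_14_days
    FROM customers
    -- only count gyms that have had a full 14 days in which to cancel
    WHERE signed_up_at <= CURRENT_DATE - INTERVAL '14 days'
    GROUP BY 1;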


I don't really understand. Isn't the "cohort analysis" part of it how the data about the gyms that cancelled was extracted? That shouldn't affect how you compare the results once you've made changes. I'm also not sure I understand what you mean by "comparing a pre-change cohort to a post-change cohort" because we're not comparing the cohorts, we're comparing the cancellation rate among the whole list of customers.


Ah, the article is actually extremely unclear. What I took his conclusion to mean was that some cohort of gyms that had signed up post-change had lower cancellation rates than pre-change customers. If the analysis just took into account the overall rate across all customers, it is prone to all sorts of errors (including the one you mention).


My impression was that the pre-change cancellation rate was based on all customers over the lifetime of the product (they did calculate the 61 day average from this), whereas the post-change was only for the new customers in the last two months (the max it could be, and the only thing it makes sense to measure).


I think I see what you're saying (given an average cancellation time, only half the customers will have cancelled by that time), but I think it depends on how they calculated it.

For instance, they could know the average number of cancellations in a week, and see how that number has dropped.

I think this would work out, provided it doesn't matter too much whether the two actions they've taken (the two emails) occur right when the gym signs up or later on.

You're right that an average time to cancel of 61 days implies some may cancel at, say, 80 days. But some gym signed up 20 days before this cohort analysis began, and could have been the one to cancel had it not been for these changes.


I'm interested in why you found Mixpanel hard to use. It satisfies every requirement you have described out of the box and only takes minutes to set up. Unless you wanted to process a ton of past data it probably would have been easier for you to skip all the manual data entry excel requires.


I haven't used Mixpanel, only Kissmetrics. I wanted to run the analysis on all of our data and while that's doable in Kissmetrics, Excel is faster (desktop vs latency of web app requests), more flexible (I can do what I want vs being limited to what KM lets me do), and likely easier to get data in (import CSV vs ...?)


I had the exact same problem as you, so I'm writing http://www.instahero.com. It will offer everything kissmetrics etc offer, but will give you the ability to easily program your own reports, without having to download your data or anything like that.

Think of it like kissmetrics having an "edit the code that generates this report" option. You can leave your email or drop me a line if you want to try out the beta.


I would strongly suggest you drop the "websites not using us yet" line from your landing page. Showing their logos isn't a clever trick to increase conversions; it's borderline dodgy, and people will notice.


Ah, sorry, that was just a joke, the logos came with the theme. I'll remove them asap, thanks.


Cool, makes sense given you were working with a bunch of past data. Going forward I would recommend you try out Mixpanel though; it's much more flexible than KISSmetrics. Mixpanel actually does a lot of stuff that makes it easier than Excel for things like cohort analyses; it's quite flexible.


I'll take a harder look at MP - I'm all about ease of use. One of the other reasons Excel is nice is that your present self doesn't have to anticipate future data analysis needs. i.e. there was data I pulled into the analysis that I never would have set up MP or KM to track beforehand. At query-writing time I thought it might be interesting to pull it in and see if there was any kind of pattern.


You can import data into Mixpanel any time you want: https://mixpanel.com/docs/api-documentation/importing-events...

=)


Kareem,

What were you trying to measure in KISSmetrics that you were unable to? Did you try using the Data Export feature? http://support.kissmetrics.com/apis/data/data-export-setup

Btw- You can import data into KISSmetrics as well: http://support.kissmetrics.com/advanced/importing_data

I'm the Product Manager at KISSmetrics, so I'm genuinely interested in your issues to understand what you're struggling with. We try to cover key use cases like cohort reporting, but perhaps missed something for you.

Thanks, Jason


The advantages of using Excel over a 3rd party service:

- Since all the relevant data's in our db, I don't need to import data into a 3rd party service if I haven't been tracking it. I just modify my SQL query. No need to futz with data importing, syncing, yada yada.

- Using Excel is a matter of writing SQL, running the query, then importing a CSV. Simple.

- Data manipulation is easier - Excel is designed for it.

I wrote the SQL and ran the analysis in my post over a couple of hours one afternoon. I can't imagine it being easier using any 3rd party service.
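For what it's worth, the query behind this kind of analysis can be as simple as something like the sketch below (Postgres-flavoured, with made-up table and column names rather than our actual schema), with the result exported to CSV and pivoted in Excel:

    -- Monthly signup cohorts with cancellation counts and average days to cancel.
    -- "customers", "signed_up_at", and "cancelled_at" are assumptions for
    -- illustration, not the actual schema.
    SELECT
      date_trunc('month', signed_up_at) AS signup_month,
      COUNT(*) AS gyms,
      COUNT(cancelled_at) AS cancelled,
      AVG(cancelled_at::date - signed_up_at::date) AS avg_days_to_cancel
    FROM customers
    GROUP BY 1
    ORDER BY 1;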

If you want to email more, drop me a line - email's in my profile.


Perhaps the OP didn't feel like learning new software when he had a mental model of how to do what was needed using familiar tools?


These are the sort of posts that keep me coming back to HN. Great analysis, and as a former Crossfitter - awesome product idea!


thanks!


Great post on the usefulness of cohort analysis (or of experimentation more broadly, which, if you think about it, is exactly what cohort analysis is - it just uses the past as a control).

One thought on ways to analyze the follow-on problem of customers canceling after 61 days (a problem similar to what I've seen at every web company I've ever worked at).

First, perform the same cohort analysis you’ve already done, but look at the cancelling customers vs retained customers at day 1, day 15, day 30 and day 45, then use this analysis to figure out your triggers (things like # of Facebook posts needed by day 15, % of profiles claimed by day 30, etc).

Once you have your triggers, you can make proactively calling / emailing problematic customers a key part of your daily routine. While discounts might still be the way to go, this trigger-based approach is one I've seen work well. Additionally, because you are in touch with problematic customers, it often gives you insight into what to do next.
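A rough sketch of what that trigger analysis could look like in SQL, assuming a per-customer engagement summary table (every table and column name here is made up for illustration):

    -- Compare early engagement between gyms that later cancelled and gyms that
    -- stayed, to pick trigger thresholds. Table and column names are assumptions.
    SELECT
      CASE WHEN cancelled_at IS NOT NULL THEN 'cancelled' ELSE 'retained' END AS outcome,
      COUNT(*) AS gyms,
      AVG(fb_posts_by_day_15) AS avg_fb_posts_by_day_15,
      AVG(pct_profiles_claimed_by_day_30) AS avg_pct_profiles_claimed_by_day_30
    FROM customer_engagement_summary
    GROUP BY 1;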


Great related post written by the Shopify guys a while ago: http://www.shopify.com/technology/4018382-defining-churn-rat...


Congratulations on the impressive result. It seems you have a good product but the real change appears to be encouraging those who wouldn't use your system properly to improve their habits.

I wish you could work out more concretely why the situation improved, but with three substantial improvements (that likely impact different customers in different ways) that's difficult. I could imagine "drop[ping] prices by 15-60%" would help those not using your product fully, for example: even if they don't use all the features, they don't feel like they're overpaying.


Our hypothesis is that the biggest reason for the change was price. If it's a good use of my time, I'll run another analysis in a couple months to see if I can isolate the reasons for change.


Interesting article, two issues.

First, you changed two variables simultaneously, so it's tricky to tell how much either contributed. The cynic in me suggests that the article could actually be summarised as "we cut our prices".

Second, you write:

> So we improved our onboarding to help a gym owner export a CSV of their members’ email addresses to send to us.

Certainly in Europe, that could fall foul of data protection legislation; you'll need to make sure that the customer has given permission for their data to be passed to 3rd parties.


Great article. I went to check out your site socialwod.com and was unable to scroll on my iPhone. You may want to check your analytics to see if this is affecting a large percentage of your visitors. Keep up the great work!


Thanks for the heads up - will take a look.


Which items from your analysis made the most difference?


Not sure - will run another analysis in a couple months if it's important enough to figure out.


Cool. I've read about JBara, which provides a CRM plugin that aims to predict when people are going to cancel and give you a chance to retain them.


I think the real takeaway here is: "Email is a better marketing channel than Facebook."


This is a good article. I have a few issues with price-reduction for existing customers, though, that I want to highlight. I think it's fantastic that OP got the desired results on customer churn, which was his goal - but I'd categorize pricing changes as "gray hat" retention improvement with regard to the overall health of his business and future revenue. Here's why:

On price specifically: In a subscription business like this one, you have to meet a minimum utility requirement in each month that a customer is able to cancel if you want to retain customers. Each customer's minimum utility is different and could even be comprised of different factors/features depending on the breadth of the product offering. But there is one factor that cuts across all of them: price. A significant element of churn is price because the initial purchase thrill may decrease over time and result in customer cancellation requests at a certain point in their lifecycle. So, cutting price is kind of an easy way to reduce churn in a subscription model...particularly because people bought in at X and are now paying fractional X. Boom - happy customers. Also, price-cutting is habit-forming, and the customers who received a reduced price will come back asking for more reductions in time.

On features: Multiple times, I've seen "get more people using our product" work as a good way to reduce churn. I won't comment on permission-based customer marketing and whether or not what OP did was legal, but the results of a feature like this are great, and it seems like he added more than just this one. He added improved functionality and invested in his product at reduced prices - great deal!

On onboarding & cohort analysis: OP was right to focus on onboarding features and adoption to improve stickiness among new cohorts. He would have also been smart to raise the price for new customers if he materially improved the product (which it sounds like he did). Over the same time period, he could have had newer cohorts of higher-paying customers, making the older ones less important to the financials of the business. By "hiring" higher-priced customers into increasingly recent cohorts and continually "firing" older, lower-priced customers, the balance of his revenue would have shifted to these newer, more valuable customers over time, making the older, less-happy customers less important to his business. That's how you really turn the crank on a subscription business, and if your onboarding is good enough to continually improve retention in new cohorts, you've really nailed it.

Overall, I don't mean to be overly critical of OP's choices. I aim to highlight where optimizing for customer churn alone can harm the financials of the business, particularly around price-cutting for existing customers. He's doing lots and lots right with his cohort analyses, onboarding improvements, and assumptions about churn impact of new features. However, we're in business to make money, so these have to be balanced with the health of the business itself.



