Busting the small startups can't A/B test myth (kylerush.net)
27 points by kylerush on April 15, 2014 | 14 comments



Unless a startup is doing something absolutely terrible, a 45% increase is just not likely. Your advice would be better phrased as "if you are seeing numbers low enough that it affects the viability of your company, redesign X and then A/B test the change."

Startups have a hundred better things they could be spending time on. A/B testing preys on a founder's sense that they are leaving money on the table and not optimizing for maximum profit, sometimes to the detriment of focusing on things like good hires or critical product changes.

Edit: It wasn't obvious until I clicked around, but the author is a little biased and should disclose that he is the "Head of Optimization" at Optimizely.


It's pretty easy for absolutely terrible decisions to be made at a smaller startup, especially when you don't have the tools handy to prove that they're terrible decisions.

I happen to work at a smallish eCommerce startup with lower traffic volumes and a 5% (checkout, not engagement) conversion rate, and while A/B testing small (or even large) changes to our homepage hasn't proven fruitful, two small changes to our product pages have made a meaningful difference to our bottom line. (And I mean small as in I spent maybe an hour total setting up those tests while on an Optimizely free plan.)

So yeah, I'm an Optimizely and A/B test fan.


Ahh... That last part makes sense. This is what I was trying to get at. If you only have a handful of designers and developers, you shouldn't be focusing on min/maxing your landing page. It's not that this is a "myth"; it just doesn't make sense for a new team to worry about it vs. product development.

I'm not claiming this to be truth or anything, just my experience. Bringing up a/b testing with my clients or teams just results in a "we have better things to do."

(I suppose being the head at a company that does this would introduce at least some bias.)


This is a completely fair point. Startups naturally will have to decide where the highest ROI is. It may not be a/b testing. However, they should know that a/b testing isn't necessarily off the table.


These types of articles should quote error ranges.

"Optimizely's homepage redesign that yielded a 46% '+/- x%' increase in new accounts"

They never seem to do so.

If you have small traffic, forget about performing A/B tests. Instead learn from the A/B tests performed by companies with high traffic, and implement their findings.
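
As a rough illustration of why those error ranges (and traffic levels) matter, here is a minimal Python sketch of the standard two-proportion sample-size calculation; the 5% baseline conversion rate and 10% target lift are purely hypothetical numbers, not anything from the article.

    from statistics import NormalDist

    def sample_size_per_variant(baseline, lift, alpha=0.05, power=0.8):
        """Visitors needed per variant to detect a relative lift in
        conversion rate with a two-sided two-proportion z-test."""
        p1 = baseline
        p2 = baseline * (1 + lift)
        z_alpha = NormalDist().inv_cdf(1 - alpha / 2)
        z_beta = NormalDist().inv_cdf(power)
        p_bar = (p1 + p2) / 2
        numerator = (z_alpha * (2 * p_bar * (1 - p_bar)) ** 0.5 +
                     z_beta * (p1 * (1 - p1) + p2 * (1 - p2)) ** 0.5) ** 2
        return numerator / (p2 - p1) ** 2

    # Illustrative numbers only: detecting a 10% relative lift on a 5%
    # baseline conversion rate needs roughly 31,000 visitors per variant,
    # which is why low-traffic sites struggle to reach significance on
    # small changes.
    print(round(sample_size_per_variant(0.05, 0.10)))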


You can click through on the Optimizely article to see the raw data: http://2nwl40151nra3yah86118nxtto0.wpengine.netdna-cdn.com/w...


Huge % increase wins that translate into real incremental performance jumps are extremely rare.

I can easily jack up click-throughs on a new banner or get more users to fill out forms by changing text copy and/or CTA size/positioning/color, but that doesn't mean those increases translate into an incremental improvement in overall prospect-to-lead conversion rates. You might very well just be shooting yourself in the foot by killing good conversion rates later in the funnel with your upstream changes.

A good example was moving the lead gen form on a paid search destination landing page from below the fold to above the fold. What did we see? More form conversions, a 30%+ increase actually. Yay! We dug into it further, and what we actually found was that customers who were looking for customer service were clicking the company's paid search ads and filling out the form to try to reach customer service, rather than reading the lead gen CTAs on the page that clearly spoke to new customers.

Unless you have a very controlled linear sales funnel, websites are spider-webs with numerous paths visitors could take to arrive at the content they're looking for. So you always have to take into account what downstream changes you could be introducing with a "big win increase test" on your site's homepage.
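
A minimal sketch of the check being described, with invented per-variant counts standing in for the paid-search example above: track the same two variants at both the form-submit stage and the end of the funnel, and judge the test on the downstream number.

    # Hypothetical counts per variant: visitors, form submits, qualified leads.
    # The numbers are made up to mirror the scenario above: variant B lifts form
    # submissions but attracts the wrong visitors, so qualified leads don't move.
    funnel = {
        "A_below_fold": {"visitors": 10_000, "form_submits": 300, "qualified_leads": 150},
        "B_above_fold": {"visitors": 10_000, "form_submits": 400, "qualified_leads": 148},
    }

    for name, f in funnel.items():
        form_rate = f["form_submits"] / f["visitors"]    # the "big win" metric
        lead_rate = f["qualified_leads"] / f["visitors"]  # what actually pays the bills
        print(f"{name}: form {form_rate:.1%}, qualified lead {lead_rate:.1%}")

    # B shows a ~33% relative lift in form conversions, yet its qualified-lead
    # rate is flat: the extra submissions were customer-service requests, not
    # prospects. Judge the test on the downstream metric.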


From what I've seen, many companies see some “big” wins early on, which start to taper off into fewer, often smaller wins the more they test. Typically people are testing things such as moving the form fields above the fold, changing CTA colors/text, updating headlines, etc. – smaller tweaks to an existing design, or rather, optimizing for the local maximum. As a result, fewer and smaller wins over time intuitively makes sense, as there is a certain point (a theoretical “perfect” page) beyond which you can no longer improve a design. While it is highly unlikely you will ever reach that “perfect” design, the more you test, the closer you will get.

If you are only optimizing for the local maximum, you could be leaving conversions on the table by not trialing “bigger” changes, or rather, optimizing for the global maximum (changes to the process flow / conversion funnel, introducing a totally new page template, etc.). In doing so, it is more likely you will see bigger effects.


It's funny, but I've noticed that all of my clients in this space are just outright opposed to a/b testing. Most don't even bother to leverage analytics at all. I think more than anything they see the effort-to-ROI ratio as too high.

For example, say a startup isn't testing small changes, right? Well, unfortunately, an entire landing page makeover, while not overly difficult, can wind up taking a great deal of time. The redesign alone is just such an "opinionated" process (even for people who know nothing about design...) that throwing that much investment at it just to run a test is hard to justify. So even though most startups shouldn't waste time on stuff like button colors... that's often the only thing they'd have time for.

Of course I'm talking about the smaller scrappy startups. Not the ones backed by prolific amounts of capital.

Don't get me wrong, this is a GREAT article; I'd just like to know the right level of effort for A/B testing for the smaller guys.


Here are two super easy test ideas that can produce big wins.

1. Change text, specifically around your value proposition. Emphasize different things and see what happens (a rough sketch of this follows the list).

2. Remove content. Make the page more focused on what's important. Here's an example that increased account creation by 43% at Optimizely: http://blog.optimizely.com/2012/10/09/optimizelys-100000th-e...
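
As a sketch of what idea #1 could look like without any tooling (the headline copy, visitor IDs, and hash-based 50/50 bucketing here are all assumptions for illustration, not anything from the article): deterministically bucket each visitor into a copy variant, then count signups per bucket.

    import hashlib

    # Two hypothetical value-proposition headlines to test against each other.
    VARIANTS = {
        "control": "The easiest way to manage your projects",
        "speed":   "Ship projects twice as fast",
    }

    def variant_for(visitor_id: str) -> str:
        """Deterministic 50/50 split: the same visitor always sees the same copy."""
        bucket = int(hashlib.sha256(visitor_id.encode()).hexdigest(), 16) % 2
        return "control" if bucket == 0 else "speed"

    # Record (variant, signed_up) pairs as visitors arrive, then compare rates.
    results = {name: {"visitors": 0, "signups": 0} for name in VARIANTS}

    def record(visitor_id: str, signed_up: bool) -> None:
        name = variant_for(visitor_id)
        results[name]["visitors"] += 1
        results[name]["signups"] += int(signed_up)

    # After enough traffic, compare signup rates per variant (and check the
    # error ranges before declaring a winner).
    for name, r in results.items():
        rate = r["signups"] / r["visitors"] if r["visitors"] else 0.0
        print(f"{name}: {rate:.1%} signup rate over {r['visitors']} visitors")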


A/b testing doesn't have to mean extra work. If the small startup is going to redesign their homepage anyway (which a lot of them do), they might as well a/b test it.


I'm not really sure what you mean by this. In my experience, redesigns occur when new features are added, during major updates, or after changes to the team (who chime in that something should change). Yes, everyone will likely redesign at some point, but to do so just for a test?

Based on this article, smaller startups should only test larger changes. But smaller startups have fewer resources to throw at larger changes. I mean, what if we're not planning on redesigning our web page?


You don't do things for tests. You do the tests because you want to change something.

That said, the kind of results startups could get from tests are usually obvious on the balance sheet (or in some other big metric) anyway, without any kind of formalization. If you have one of the rare situations where it isn't, go ahead and run a formal test, but I wouldn't recommend anybody lose sleep over it.


The point of the article is to help small startups understand that a/b testing is not off the table. Of course startups will have to decide for themselves where the highest ROI is; at a given point in time, that may not be a/b testing.




