Hacker News

If I were to play pop psychologist, I think we have some fairly simple fixes that we want to work because they're cool/sexy/etc., and some simple fixes that we don't want to work because they're "stupid" or sound marketing-y.

For example, in the original madlibs test, where seven things changed, we want to attribute the change to the madlib element (which was sexy and creative) rather than to, e.g., the text on the conversion button changing. I've gotten 20% lifts out of "stupid little things" like that before, and seen good writeups of the same from other people. But designers, for example, do not want to hear that the thing they spend 98% of their time on is essentially meaningless, and that the elements of the page they scrutinize least carefully -- like microcopy, or button text, or (in one memorable test) which way the stock photo model is facing -- dominate the measurable results. This is like how doctors do not want to hear that the difference in clinical outcomes due to professional experience is totally dwarfed by the difference in clinical outcomes due to following checklists. (On the plus side, designers don't kill anybody for their professional pride.)

In general, I think the point of A/B testing is that you should ruthlessly crush what you want to be true with what the data actually tells you. I respect HN folks enormously, but I'm not hearing "I tested it and it worked" -- I'm hearing "I've got a great theory on why your data does not conform to reality. Reality is, of course, what I predict it to be."
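The "let the data decide" stance above can be sketched as a simple significance check on one isolated change (say, button text only), so a lift is attributed to that change rather than to six other things that moved at once. This is a minimal sketch; the counts below are made-up illustration numbers, not data from the comment:

```python
# Two-proportion z-test for a single isolated change (control A vs
# variant B). All counts here are hypothetical, chosen to show
# roughly a 20% lift of the kind mentioned above.
from math import erf, sqrt

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Return (relative lift, two-sided p-value) for B vs A."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    lift = (p_b - p_a) / p_a
    return lift, p_value

# 4000 visitors per arm; control converts 200, variant 246.
lift, p = two_proportion_z(200, 4000, 246, 4000)
print(f"lift: {lift:.1%}, p-value: {p:.3f}")
```

The point is methodological, not statistical sophistication: if seven elements change at once, no test of this form can say which one earned the lift, which is exactly why the attribution argument above is a theory rather than a result.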
