Optimization at the Obama campaign: a/b testing (kylerush.net)
83 points by kylerush on Dec 12, 2012 | 38 comments



> About halfway through the campaign we figured out that of all variables that affect user behavior (design, usability, imagery, etc.), copy has the highest ROI.

I thought I'd highlight this because it is true, apparently not obvious since everyone discovers it for themselves, and wildly actionable by many people on HN.


As the other comments to your very helpful comment point out, the real art of writing copy on a website is to write it so that it fits the hurried reading style most people bring to websites. The webmaster who writes scannable, actionable text that actually answers the questions of site visitors will dominate competing websites.


For anyone confused about the meaning of the word 'copy' as used here: http://en.wikipedia.org/wiki/Copy_%28written%29


But it takes someone with a good understanding of language to really test copy. If you put a designer in charge of A/B efforts and all they have is a design hammer, all they will see is design nails.


Interesting as well, because most people I know assume users wouldn't read copy if their lives depended on it.


Maybe most people you know aren't writing copy the way tokenadult recommends above. He's absolutely correct imho. Send them a link to his comment, and try it and see.


Any idea or theory why that is? E.g., after reading sites like http://www.useit.com/ you'd think the only thing that people do is scan (not read).


Scanning and reading are not mutually exclusive. In my opinion (and as someone who leads testing at Adobe), scanning is a style of reading that has become more commonplace with the Web -- a medium that is extremely copy heavy (a good thing!). People who understand how to write copy know how to get their copy read. Generalizations like "nobody reads" serve to keep everyone down EXCEPT for the people who know -- based on data -- that such statements are erroneous.


In my experience: This is true, but it's true for everything. They're still predominantly scanning your copy because they're trying to figure out if they're in the right place and what they should do. Would they be able to pass a test on your copy afterward? Maybe, maybe not. But they wouldn't be able to pass a test on any other aspect of your site either.


I imagine users going directly to a site like barackobama.com or xcandidate.com would be more inclined to read, not skim. It might be as simple as "this guy is running for President, maybe I should read this stuff."


bloggergirl: You are hellbanned. Your comments are marked dead on submission and can only be seen by people who are logged in and have "showdead" on in their profile. Open an incognito window or log out, and you won't be able to see your comments.

PS: I don't know what the etiquette is about letting hellbanned users know, but from bloggergirl's previous comments, I don't see any reason for her to be hellbanned.


People do scan, but certain words will pop out at them, especially if they're styled in bold or italics.


> Overall we executed about 500 a/b tests on our web pages in a 20 month period which increased donation conversions by 49% and sign up conversions by 161%.

How did they measure this? Did they isolate the effects of the a/b testing? Donation and sign-up conversion rates will increase anyway as the day of the election approaches.


You're always testing against a control and have something to compare to.

If they really wanted to, they could've kept a small amount of traffic going to the original control from before they started testing.
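Concretely, "testing against a control" usually comes down to a significance test on the two conversion rates before declaring a winner. Here's a minimal sketch of a two-proportion z-test in Python (the traffic and conversion numbers are made up for illustration):

    from math import sqrt, erf

    # Two-proportion z-test: is variant B's conversion rate
    # significantly different from control A's?
    def z_test(conv_a, n_a, conv_b, n_b):
        p_a, p_b = conv_a / n_a, conv_b / n_b
        p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled rate under the null
        se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
        z = (p_b - p_a) / se
        phi = 0.5 * (1 + erf(abs(z) / sqrt(2)))   # standard normal CDF
        return z, 2 * (1 - phi)                   # two-tailed p-value

    # Hypothetical numbers: 480/12000 conversions on the control,
    # 560/12000 on the variant.
    z, p = z_test(480, 12000, 560, 12000)
    print("z = %.2f, p = %.4f" % (z, p))  # p < 0.05, so the lift is significant

Because both arms run at the same time, any election-driven rise in donations affects control and variant equally, which is what isolates the effect of the change itself.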


HN-ers, I'm new at this whole A/B Testing, Copywriting, "testing your hypothesis", and the whole "trying to sell something" + "MVP" (or the Lean Startup).

I recall that yesterday I saw a link about Intuit's methodology, which also uses similar A/B testing to decide what to build: http://www.forbes.com/sites/bruceupbin/2012/09/04/intuit-the...

Could someone connect the dots for me, or point me to somewhere I can make sense of the whole thing?

How can one come up with so many pages to test so quickly? And how does one decide what the content of the test page should be?


There are a few JavaScript-based solutions for testing:

* Optimizely (A YC company with a WYSIWYG editor for specifying variants)

* Xander.io (targeted at developers, with HTML5 markup to specify variants -- full disclosure: I help with this project)

* Visual Website Optimizer

There are also server-side solutions depending on what framework you are working within.

On getting started:

You need a goal for your pages (or collection of pages). It could be you want users to sign up for a mailing list, or donate to a cause. As long as the goal is clear, you will have a direction to work your hypotheses towards.

Once you have the goal, coming up with pages and content is the fun and difficult part.

I like to think of pages as a combination of sections, with each section having a bunch of variants. We wrote a simple example with xander.io (http://www.xander.io/tutorials.html), and there are a few on each of the services I linked above.
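If it helps to see the mechanics without any tool at all, here's a rough server-side sketch in Python (a generic illustration, not how any of the services above actually work; the section names and copy are invented):

    import hashlib

    # Hypothetical page layout: each section has a list of variants to test.
    SECTIONS = {
        "headline": ["Donate now", "Support the campaign"],
        "button":   ["Give $5", "Contribute"],
    }

    # Deterministically bucket a user into one variant per section.
    # Hashing (user_id + section) keeps each visitor's assignment stable
    # across visits and independent across sections.
    def assign_variants(user_id):
        picks = {}
        for section, variants in SECTIONS.items():
            digest = hashlib.md5(("%s:%s" % (user_id, section)).encode()).hexdigest()
            picks[section] = variants[int(digest, 16) % len(variants)]
        return picks

    print(assign_variants("visitor-42"))
    # e.g. {'headline': 'Support the campaign', 'button': 'Give $5'}

From there it's just a matter of logging which combination each user saw alongside whether they converted.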

It's definitely worth figuring out, let me know if I can help!


patio11 has a lot of good stuff on this topic:

http://www.kalzumeus.com/greatest-hits/ (scroll down to 'Conversion Optimization')


Great write-up. For those interested, I've found similar results in five years of A/B testing on a non-profit fundraising site (www.globalgiving.org).

Pro tip: Don't just copy what Obama did. Your mileage will vary based on your user base and context. Take their learnings and A/B test them on your own site to see if they work for you or not. Not all optimizations are universal.


> Not all optimizations are universal.

One example off the top of my head would be Wikipedia's "personal appeal from Jimmy Wales" campaign. That photo made me uncomfortable, and I would have liked a more panoramic shot with the subject in focus, like the one used for "Dinner with Obama". But Jimmy Wales responded to someone (I can't find the thread; it was on reddit, I think) that it converted better than the other alternatives.

That said, starting with some known best practice (e.g., breaking long forms into multiple steps) and then measuring the alternatives goes a long way.


Wikipedia published all the numbers from that campaign. The "personal appeal from Jimmy" creative absolutely ROFLstomped everything else they came up with. 10x improvements, literally.

Here's the raw data:

http://meta.wikimedia.org/wiki/Fundraising_2010/Banner_testi...


Why is it that there are no Romney campaign write-ups? Is their tech community that much less open? Yes, I know they lost, but they did beat out many other competitors in the Republican primaries, so they must have exercised some kind of technical prowess.

(and yes, I'm aware of their fail-whale Orca system... that would also make for a fun write-up)


From personal experience with campaigns, I can say that it is much easier to write about things like this when you win. I worked a campaign in 2006 where we lost by 800 votes out of 160k. It took me until early 2008 to be able to comfortably put down on paper where we went wrong.


There actually was a write-up yesterday: http://www.targetedvictory.com/2012/12/success-of-the-romney...

Funny thing is, they were using Optimizely as well.

Disclaimer: I work at Optimizely.


I knew someone was going to point out my confirmation bias... thanks!


Well, they outsourced their IT to Best Buy and Staples subsidiaries, so I can't imagine they have much useful advice to contribute.


The biggest surprise was that breaking the form into steps gave only a 5% increase, which seems small compared to the copy and imagery changes (a massive 19% and 21%).

That's kinda insane! Love the report, thanks.


A lot of conversion optimization is finding the balance between the number of steps and the complexity of each step.


I have mixed feelings about "Now save your payment information". "Save your payment info for next time" makes it clear that it is optional, whereas the former makes it look like part of the process. It might increase conversion (and it did, per the post), but I get the feeling it somehow cons the user into signing up.


That's one issue with a/b testing: it can lead to behavior that's entirely data-driven, which in turn can lead to "tricky" designs being implemented. Sometimes, someone should just be able to make a call.


Well, a call certainly had to be made about what they were going to test. The trick there is that it's easier to get people to agree to "try a technique out" even if it is unscrupulous, and that means a tricksy design has a better chance of getting its foot in the door.


I wish somebody would do A/B testing with internal company applications.


The big difference is most employees are forced to use internal company applications, so getting user adoption/conversion is not as big of a concern.



It is interesting to see how the Romney campaign copied from the Obama campaign.

On the flip side, what was copied from the Romney campaign?


Awesome. Every business could learn from this. I know Facebook does A/B testing.


Clearly, Kyle had no experience with eCommerce before this project, as is evident in his write-up.

However, props to him for all the hard work.


> Clearly, Kyle had no experience with eCommerce before this project, as is evident in his write-up. However, props to him for all the hard work.

I don't know if it was your intention, but you sound incredibly vain and arrogant. "Clearly, Kyle had no experience" implies this is so trivial that it's not even worth talking about. And to top it off, "However, props to him..." is even more insulting.


How did I know someone was going to be shitting on the author in the comments? The Obama team ran the most successful presidential campaign in history in terms of donations, and this guy was a part of it. Be appreciative that he let us take a look behind the curtain.



