Some of the best content on the web is the stuff that gets you worked up. If you can get someone to think about a deeply held belief and talk about it in an interesting and possibly convincing way, then you've added value for that person. You don't even have to change their mind; you just have to make them question some part of the belief. That's what happened to me yesterday.
What Was Said
I'm no fan of clickbait, but when I read the headline "Why most A/B tests give you bullshit results" I immediately got worked up and I definitely clicked that link. Well done, VentureBeat.
The author gave several examples of why A/B testing is so often poorly done and how it often disappoints. The promise of A/B testing is that you'll be able to inch your way to success, but the reality is that 80-90% of all A/B tests are statistically insignificant. His entire premise is that A/B testing requires more work than most people are ready to put into it. That's the tricky thing about A/B testing: it's not magic. It really does take hard work to not only design good tests but also interpret the results.
Don't Throw the Baby Out With the Bath Water
I don't disagree with much of what the VB article said, but I do wish they had provided more guidance on how to get the most out of A/B testing. I'm going to supplement that article with a few tips, but first we should talk about why A/B testing is important.
- The point of A/B testing is to help you make a data-driven decision about how a page should be designed, what color the buy button should be, what wording converts best, etc. For too long, we as marketers and e-commerce professionals have depended on our gut and personal preferences to make these decisions. Everyone thinks they are Don Draper. Sometimes that works out great, but generally it leads to only middling results. There is no reason to be selling online today without A/B testing your site.
- As businesses continue to be digitally transformed, the buyer's journey is becoming more complex. Because there are so many more touch points today, campaign effectiveness is incredibly hard to measure. Which parts of my marketing efforts actually contribute to conversion? There are plenty of operational ways to increase conversion, but the content and design aspects often go unmeasured.
- To a marketer, the two most important metrics are probably conversion and customer retention. Conversion is indicative of content and design effectiveness, while customer retention is a measurable indication of the overall brand experience. Sure, you'll look at click-through rates, monthly unique visitors, time on page, and others, but I can pretty much run a marketing organization on conversion and customer retention alone. Both can benefit immensely from A/B testing.
How to Get the Most Out of A/B Testing
There is no way to avoid the hard work mentioned in the VentureBeat article. You'll probably still see 70% of your tests yield statistically insignificant results. Do yourself a favor: don't despise that 70%. You have to go through bad tests to find good ones, and the good ones can make all the difference in the world.
Here are a few tips to maximize the effectiveness of your A/B tests:
- Drive traffic - It sounds silly to say, but I've seen plenty of A/B tests with an n value of 50. That's just not enough traffic, because a few hits in one direction or the other can create the illusion of statistical significance.
- Make sure you have optimized your content distribution channels
- Use partners and thought leaders to amplify your message
- SEO - seriously, this is too important to ignore. Relevance trumps everything. Make sure your message is getting to the right people
- Test drastic before subtle - Honestly, when you start A/B testing you should test big changes before small ones. The wording on the "buy" button may not even get read by a customer familiar with your site. If they want to buy, they know which button to push, and swapping out "Buy Now" for "Add to Cart" won't be nearly as helpful as testing the position of that button.
- Test both content and design elements - The majority of A/B testing in the e-commerce world is around promotional effectiveness or content stickiness [read: bounce rate]. You can also test the design elements on the page. Changing an element's location, size, or color really does affect the shopping experience. By the way, if you want to see design element A/B testing in action, you can get a free demo of Oracle Commerce Cloud here.
- Multivariate - Sometimes a single change doesn't accomplish much, but making multiple changes to a page at once can be the goose that lays the golden egg. Multivariate testing is a form of A/B testing that tells you which combination of changes across multiple page elements will have the most impact. This is an advanced tactic, but it's worth its weight in gold. Maxymiser is a great tool to help with this.
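The "illusion of statistical significance" at small n is easy to demonstrate. Here's a minimal sketch of a two-sided, two-proportion z-test using only Python's standard library; the visitor and conversion counts are invented for illustration:

```python
from math import sqrt, erfc

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-sided two-proportion z-test; returns (z, p_value)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)          # pooled conversion rate
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = erfc(abs(z) / sqrt(2))                   # two-sided p-value
    return z, p_value

# 50 visitors per variant: 5 vs 8 conversions looks like a big lift...
z, p = two_proportion_z(5, 50, 8, 50)
# ...but the p-value comes out far above 0.05, so the "lift" is just noise.
```

With traffic like that, a handful of extra conversions in either arm swings the observed rate wildly, which is exactly why tiny tests so often look like winners.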
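To see why multivariate tests demand so much traffic, note that a full-factorial multivariate test crosses every variant of every element. A quick sketch of that combinatorial growth (the element names and variants below are hypothetical, not from any particular tool):

```python
from itertools import product

# Hypothetical page elements and their variants -- purely illustrative.
elements = {
    "headline": ["Free shipping", "20% off today"],
    "button_text": ["Buy Now", "Add to Cart"],
    "button_color": ["green", "orange"],
}

# Cross every variant of every element to enumerate the page versions.
combinations = [
    dict(zip(elements, combo)) for combo in product(*elements.values())
]
print(len(combinations))  # 2 * 2 * 2 = 8 page versions, each needing traffic
```

Just three elements with two variants each already means eight page versions, and each version needs enough visitors to reach significance on its own, which is why tools like Maxymiser earn their keep here.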