We have discussed before how A/B testing helps you go from a set of “expert” opinions and fuzzy, partial data points to trustworthy (i.e., statistically significant) knowledge you can use to improve your site’s conversions, following a kind of scientific method to optimize your website.
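To make “statistically significant” concrete, here is a minimal sketch of the two-proportion z-test that A/B testing tools typically run under the hood. The function name and the conversion numbers are hypothetical, and a real tool would handle sample-size planning and multiple looks at the data; this only shows the core calculation.

```python
import math

def ab_test_significance(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: is variant B's conversion rate
    significantly different from variant A's?"""
    p_a = conv_a / n_a
    p_b = conv_b / n_b
    # Pooled conversion rate under the null hypothesis (no real difference).
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF (via math.erf).
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical numbers: 200/5000 conversions for A, 260/5000 for B.
z, p = ab_test_significance(200, 5000, 260, 5000)
print(f"z = {z:.2f}, p = {p:.4f}")  # significant at the 5% level if p < 0.05
```

With these made-up numbers the lift from 4.0% to 5.2% turns out significant; with smaller samples the same lift would not be, which is exactly why the test, not your gut, gets the final word.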
Does this mean that A/B testing turns web design into a boring activity, with every site ending up with the same look and feel? Is there no place for creativity in companies that adopt an A/B testing strategy? Far from it. Creativity and A/B testing are friends, not foes. Let me give you two big reasons why:
- A/B testing tells you which alternative works better, but it doesn’t help you come up with new alternatives to try. Yes, you can find many examples of A/B tests online (WhichTestWon is one of my favorite sites for that) and use them as inspiration, but remember that for every winning A/B test you’ll find somebody claiming the opposite result. Just search for tests on the best-converting button color and enjoy the variety of “best” converting colors (red, orange, yellow, blue, …). It’s not that they lie; it’s that every website targets a different audience, with a different background, age, gender, and personality, and that audience therefore responds differently to the same visual stimuli.
- Systematic A/B testing tends to lead you to a local maximum: the best conversion rate you can reach without radically changing the design of the site. When A/B testing, we tend to go one step at a time, making sure every small change has a positive effect. This is a sound testing philosophy, but it doesn’t produce the big conversion leaps that come from more radical changes.
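The local-maximum trap is easy to see with a toy model. Here the conversion “landscape” and its two peaks are entirely made up for illustration: a small peak reachable by minor tweaks, and a taller one that only a radical redesign would find. Greedy one-step-at-a-time testing climbs the nearest hill and stops.

```python
import math

def conversion_rate(x):
    """Toy conversion landscape with two peaks: a small local one
    near x = 2 (minor tweaks) and a taller global one near x = 8
    (a radical redesign). Purely illustrative numbers."""
    return 0.05 * math.exp(-(x - 2) ** 2) + 0.09 * math.exp(-(x - 8) ** 2)

def hill_climb(x, step=0.1):
    """Greedy one-step-at-a-time testing: keep a change only if it
    improves conversion; stop when no small change helps."""
    while True:
        best = max(x - step, x, x + step, key=conversion_rate)
        if best == x:
            return x
        x = best

# Starting near the current design (x = 1), small steps converge on the
# local peak near x = 2 and never discover the taller peak near x = 8.
print(round(hill_climb(1.0), 1))
```

The climb from x = 1 settles on the 5% peak and never sees the 9% one; only a big jump in the starting point (a redesign) lets the same greedy process find the global maximum.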
Clearly, a series of small A/B tests based solely on examples found online will only get you so far. Throwing creativity into the mix solves this problem. You know your visitors: be creative in designing alternatives that cater to them. And every once in a while, make a leap of faith and test a major redesign, to make sure that your local maximum is also the global one and that you’re not just trapped in a small valley, missing plenty of big conversion opportunities. Remember, this is a controlled risk, since your favourite A/B testing tool will be monitoring the results of the experiment for you and will let you know whether you struck gold or not.
This quote from Jon Bentley summarizes it beautifully:
> Don’t experiment when you should think; don’t think when you should experiment.
I would love to know more about your A/B testing strategy and how you come up with new tests! Care to share it with us?