The Success of Failure in A/B Testing



In this blog we’ve widely discussed all the advantages A/B testing brings when it comes to improving the conversion rates of your website and understanding your visitors better. And, to be honest, this shouldn’t surprise you, because, you know, that’s what we do for a living. I’m also pretty sure that you’ve read plenty of articles all over the web presenting amazing uplifts in people’s conversion rates just because they ran a successful split test.

What you probably don’t know is that split tests fail too. And they tend to fail a lot. Does this mean that split tests are a fraud? Absolutely not! Keep reading and you’ll learn everything you need to know to improve the effectiveness of your CRO efforts.

Success Stories Are Great, But…

…they set high expectations. It’s like winning the lottery. Or being a successful entrepreneur. Everybody talks about that guy. The one who, out of nothing, suddenly got a billion dollars. Wow! Who doesn’t want to be that guy? I sure as hell want to! 😉 But what about all those people who tried and failed? Those who didn’t make it? No one likes to admit they’ve failed and, yet, here they are.

Having faith in what you’re doing and using a split testing tool appropriately can definitely contribute to outstanding results. But mark my words when I say some of your tests are going to fail. If you’re not ready for it, you or your client might get discouraged and give up on a certain test (or even on split testing in general) too soon, ignore the results of any test that didn’t match expectations (even though you can learn a lot from them), or evaluate your conversion rate based solely on the exceptional, yet rare, wins. So let’s get rid of these false expectations and make sure that, whatever we do, we lose no time, money, or opportunities, shall we?

A/B Testing: the Last Step in CRO

After two years of Nelio A/B Testing, there’s one thing I can tell you: those customers who come to our tool expecting “magic” results hardly ever see the success they were hoping for. Why does this happen?

A/B testing is only one step in the conversion optimization journey, and, by the way, it’s the last one. So what’s this “conversion optimization journey” all about? The ultimate goal you pursue when subscribing to a split testing service like ours is to increase the effectiveness of your website. Your real goal, thus, is to sell more products, or get your visitors to share more posts, or get more subscribers to your newsletter. How do you do that? How can you be more effective at communicating your message and engaging your visitors? Well, nobody knows!

The first step of the conversion optimization journey is understanding your visitors. You need to know who they are, what they’re looking for, and why they are on your website in the first place. What they like, what they don’t. What’s bothering them or what’s missing on your website. All these insights should be acquired before designing a split test, so that you can hypothesize which changes will lead to better conversion rates.

Screenshot of Heatmaps with Nelio A/B Testing
Heatmaps are one of the most powerful tools available to understand what users do on your website.

Use tools like heatmaps or a usability test suite to highlight the weak spots of your website, and then simply imagine how to strengthen them. It’s that easy! Look. Think. Test. That’s the key to successful tests!

Now that you know that unsuccessful tests aren’t uncommon and how split testing should be addressed, let me wrap up this section with just a few notes on what you should really expect from your split tests:

  • Split Testing works in the long run. Think about split testing as if it were a marathon, not a sprint. A/B testing is a powerful mechanism for improving your conversion rate… but you’ll rarely see huge uplifts all at once. Each test counts. Focus on the long run and be ready to work little by little for weeks, following a disciplined process.
  • Huge wins rarely occur. I’ve already mentioned this, but it’s really worth emphasizing. If you’re lucky, someday you’ll run a test that boosts your conversion rate. Just remember this is uncommon, and even though small improvements (+5%) might seem, well, “small”, they can have a huge impact on your annual revenue.
  • Neutral and negative tests matter. When you test a change in your website, you expect it to be better than your control version. Unfortunately, this is not always true. Sometimes the results will be inconclusive (that is, both versions are equally effective at converting your visitors) or even worse than those you already had. Don’t despair! We’ll shortly discuss how to get insights from those tests too 😉
  • What works for me might not work for you. Looking at your competitors, reading conversion optimization blogs, and looking for new ideas on what to test on your blog is always recommended. But don’t blindly copy those tests and expect the same results; you need to understand why those tests work and think about whether that reason applies to your website too.
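To see why a “small” +5% uplift is worth chasing, here’s a quick back-of-the-envelope sketch. All figures (traffic, conversion rate, order value) are hypothetical, purely for illustration:

```python
# Back-of-the-envelope sketch: what a "small" +5% relative uplift is
# worth. All figures (visits, conversion rate, order value) are made up.
monthly_visits = 50_000
conversion_rate = 0.02      # 2% of visitors convert
average_order = 40.0        # dollars per conversion

baseline = monthly_visits * conversion_rate * average_order
uplifted = baseline * 1.05  # a +5% relative improvement

print(f"Extra revenue per year: ${(uplifted - baseline) * 12:,.0f}")
# Several small wins compound multiplicatively over a year:
print(f"Four +5% wins compound to: {1.05 ** 4 - 1:.1%}")
```

With these made-up numbers, a single +5% win adds $24,000 a year, and four such wins in a row add over 21%, which is precisely why the marathon mindset pays off.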

Understanding Failure

I’ve been talking about “failure” and “failed tests” for a while now, and I haven’t even defined what those terms mean. But I’m pretty sure you got the idea. When we talk about “failed tests”, we mean a split test whose outcome doesn’t match our expectations. So we created this test, expecting the alternative to be far better than the control version, and… oops! Its conversion rate is worse. Or it isn’t worse, but it’s not better either.

As you can see, there are two types of failure:

  1. Neutral tests. The test resulted in neither a winner nor a loser. Results are “inconclusive”, for none of the versions is better than the rest. It’s like we wasted our time! Here you can see a list of 6 A/B tests that did absolutely nothing for the guys at Groove.
  2. Negative tests. The variation we created resulted in fewer conversions. Better remove this test before anyone sees it, right?
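These outcomes (plus a win) can be told apart with a standard two-proportion z-test, which is the classic way to decide whether a difference in conversion rates is real or just noise. Here’s a minimal sketch; the visit and conversion counts are invented, and a real testing tool would also check that the sample is large enough before declaring anything:

```python
# Sketch: classify a finished A/B test with a two-proportion z-test.
# Counts below are made up; alpha = 0.05 is the usual significance level.
from statistics import NormalDist

def classify_test(ctrl_conv, ctrl_visits, var_conv, var_visits, alpha=0.05):
    """Return 'winner', 'negative', or 'neutral' for the variation."""
    p1 = ctrl_conv / ctrl_visits
    p2 = var_conv / var_visits
    # Pooled conversion rate under the null hypothesis of no difference.
    pooled = (ctrl_conv + var_conv) / (ctrl_visits + var_visits)
    se = (pooled * (1 - pooled) * (1 / ctrl_visits + 1 / var_visits)) ** 0.5
    z = (p2 - p1) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))  # two-sided test
    if p_value >= alpha:
        return "neutral"  # inconclusive: the difference may be noise
    return "winner" if p2 > p1 else "negative"

print(classify_test(100, 2000, 140, 2000))  # clear uplift
print(classify_test(100, 2000, 104, 2000))  # too close to call
```

Note how the same absolute numbers can land in different buckets depending on traffic: a +4-conversion difference on 2,000 visits is indistinguishable from noise.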

If a test didn’t end up as expected, never discard it without looking into it: ask yourself why it didn’t match your expectations, try to understand where things went “wrong”, and get ready for the next split test!

Embrace “Failed” Tests

Failed tests are not failures. They simply tell us that our original assumptions were wrong. So, for example, if you ended up with a neutral test, try to answer the following questions:

  • Are the variations different enough? Sometimes we test small details of our pages. The color of a button here, the order of a set of icons there… These changes may have an impact on your conversion rates, but they may as well be so subtle that the test results in a draw. You’ll need to be more creative and broaden the scope of the experiment.
  • What hypothesis does this result invalidate? You created this “failed” test because you thought it’d modify your visitors’ behavior. But it didn’t. For instance, you decided to add a “30-day money-back guarantee” on your pricing page, to reassure your visitors that, should they not like your service, they can unsubscribe and get their money back, risk-free. But this didn’t improve your conversion rate! So maybe you tried to answer the wrong question. Maybe your visitors are not worried about getting their $10 bill back. Something else is bugging them, and that’s what you need to find out.

What about negative tests? Well, similar questions apply:

  • Was the hypothesis wrong? Again, the results are exactly the opposite of your expectations. Why? What were you expecting exactly, and what happened in the end?
  • Why did the original version perform better? Look at the original version, compare it to your alternatives, and try to understand what makes the former better.
  • What does this test teach us about our visitors? You just learned your assumptions were wrong, and that’s great. But can you get further insights?

Key Takeaways

In the end, the most important question is: how do you turn a failed A/B test into a win?

  1. You must have a clear goal. It’s the first step of A/B testing. Propose a hypothesis (e.g. “if I [change this], then [the number of visitors that do that] will increase/decrease [because they…]”).
  2. Test changes that are relevant.聽Be bold in your tests; don’t change things that will go unnoticed or have nothing to do with your goal.
  3. Wait for statistical significance. Never jump to conclusions too soon.
  4. Rinse and repeat. If your test failed, don’t dismiss it. Try to understand why it failed, go back to your original hypothesis, rethink it, and prepare a new experiment to test it again.
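“Wait for statistical significance” also means waiting long enough. A rough sample-size sketch, using the standard two-proportion approximation, shows how much traffic a test needs before it can even detect a modest uplift; the baseline rate and target below are hypothetical:

```python
# Sketch: visitors needed per variation before a test can reliably
# detect an uplift. Standard two-proportion sample-size approximation;
# the baseline rate and target uplift below are hypothetical.
from statistics import NormalDist

def visitors_per_variation(p1, p2, alpha=0.05, power=0.80):
    """Visitors needed in EACH variation to detect p1 -> p2."""
    nd = NormalDist()
    z_alpha = nd.inv_cdf(1 - alpha / 2)  # two-sided significance level
    z_beta = nd.inv_cdf(power)           # desired statistical power
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return int((z_alpha + z_beta) ** 2 * variance / (p2 - p1) ** 2) + 1

# Detecting a +5% relative uplift on a 2% baseline takes serious traffic:
print(visitors_per_variation(0.02, 0.021))
```

On a site with a few thousand visits a month, a test like this one would have to run for months, which is one more reason to treat split testing as a marathon.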

Finally, let me suggest reading You Can Learn More From Failure Than Success by Max Nissen.

Featured image by Loco Steve.



David obtained his PhD in Computer Science at UPC. He leads the analysis and design of our services and the user support area. He's interested in a variety of areas, including conceptual modeling, virtual reality, and 3D digital printing. He contributes to the WordPress community by participating in meetups, seminars, and the WCEU.
