Last month I explained the details of an A/B test that we ran on our main landing page. We created a variant with a different, simpler content layout to see whether it would yield better results.
However, when the A/B test finished, we realized that although we preferred the variant, our audience did not agree with us: the variant's results were worse than the original page's. Fortunately, when you rely on data instead of opinions, you avoid many problems.
While that A/B test was running on our main landing page, we also had a similar test running on the landing page of one of our products. That's the test I'm going to talk about today.
When defining an A/B test, the first thing we need is a hypothesis that we can test to determine whether it is actually true.
The hypothesis we wanted to verify with this A/B test is that a landing page with less text and a more visual layout should achieve better results. This is easy to verify: we only have to create an alternative version of the landing page with simplified, more visual content and test it against the original.
With this in mind, in the following comparison you can see our original landing page on the left. On the right is the alternative landing page, with reduced text and more images:
This is the summary of changes we made:
- We removed the final section with the logos of the most popular page builders and the text indicating that we are compatible with them and with Gutenberg. We believe enough time has passed since Gutenberg's release that this section may no longer be worth including.
- Regarding customer testimonials, we shortened each one for simplicity. We also reduced the size of the video and paired it with a key message, which allowed us to remove the features section that appeared after the first block of the page.
- Finally, we added images of people to some sections, which we believe give the content a more pleasant touch.
With these changes in place, all that remains is to test how the new version of our landing page performs.
Definition of the A/B test
To define the A/B test in WordPress we will use our Nelio A/B Testing plugin, which you can find in the WordPress plugin directory.
Once the plugin is installed and activated, we go to the menu Nelio A/B Testing » Tests and create a new page A/B test. We will find a user interface like the following:
The first thing is to select the original page on which we want to create the A/B test. In our case, we select the landing page you've seen above. Then we give the variant a name and click the Edit button to open the WordPress editor, where we apply the changes shown a couple of screenshots above.
We save the variant and return to the A/B test's edit screen to define the actions we want to measure. In this case, we count as a conversion each visit to the pricing page coming from the page we are testing. We define this in the conversion goals and actions section.
With everything ready, we only need to start the A/B test. Once the test is active, half of the visitors will automatically see the original landing page and the other half will see the version with the changes. Likewise, every time a visitor who has seen either version of the tested page visits the pricing page, the plugin will record a new conversion and add it to the results.
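Conceptually, a split test like this boils down to two pieces: a sticky 50/50 assignment of visitors to variants, and a conversion counter per variant. The following is a minimal sketch of that idea, not Nelio A/B Testing's actual implementation; all names (`assign_variant`, `record_conversion`, the in-memory dictionaries) are illustrative assumptions.

```python
import random

# The two versions under test. (Illustrative; a real plugin persists
# assignments in a cookie or database, not in memory.)
VARIANTS = ["original", "variant"]

def assign_variant(visitor_id: str, assignments: dict) -> str:
    """Assign each visitor to a variant once and remember the choice,
    so the same visitor always sees the same version."""
    if visitor_id not in assignments:
        assignments[visitor_id] = random.choice(VARIANTS)
    return assignments[visitor_id]

def record_conversion(visitor_id: str, assignments: dict, conversions: dict) -> None:
    """Count a conversion (e.g. a visit to the pricing page) for the
    variant this visitor was assigned to."""
    variant = assignments.get(visitor_id)
    if variant is not None:
        conversions[variant] = conversions.get(variant, 0) + 1

# Usage: four visitors arrive, one of them reaches the pricing page.
assignments: dict = {}
conversions: dict = {"original": 0, "variant": 0}
for visitor in ["a", "b", "c", "d"]:
    assign_variant(visitor, assignments)
record_conversion("a", assignments, conversions)
```

The key design point is the sticky assignment: if a returning visitor could flip between versions, their behavior could not be attributed to either variant.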
We just have to wait for the results to accumulate and then look at the final trend to decide whether one of the two versions is better than the other.
Analysis of the A/B testing results
After running the test on our website for more than a month, we stopped it because the results obtained were significant. You can see them in the following screenshot:
The first thing to note is that in this test, the results are statistically valid and conclusive, with a statistical confidence greater than 98%. This tells us that the results are very unlikely to be due to chance. In other words, one of the two variants is clearly better than the other.
In this test, we see that the original version works better than the version with the proposed changes. Specifically, the alternative version performs almost 12% worse than the version we already had. Thanks to these results, we avoided shipping a version that we thought was better, but that the data proved worse. And that is what really counts.
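If you are curious how a figure like "98% statistical confidence" can be derived from raw counts, one common approach is a two-proportion z-test on the conversion rates. The sketch below uses only Python's standard library; the visit and conversion numbers are made up for illustration (they are not our actual test data), chosen so that the variant converts roughly 12% worse in relative terms.

```python
import math

def z_test_confidence(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """Return the two-sided statistical confidence (in %) that two
    conversion rates differ, via a pooled two-proportion z-test."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = abs(p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(z / math.sqrt(2))))
    return (1 - p_value) * 100

# Hypothetical numbers: 1000 conversions out of 10,000 visits (10%)
# versus 880 out of 10,000 (8.8%), a 12% relative drop.
confidence = z_test_confidence(1000, 10_000, 880, 10_000)
# For these sample sizes, the confidence comes out above 98%.
```

With smaller samples the same 12% relative difference would not reach that confidence level, which is why a test like ours needs to run long enough to accumulate data before drawing conclusions.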
A/B tests can help us both to find a better version and to avoid shipping a version that we think is better but in reality is not. The cost of using this technique to verify that the changes we make to our website actually outperform the current version is relatively low, compared to what we stand to lose by relying on opinions alone.
I encourage you to try A/B tests on your own website to gain confidence when applying real improvements based on data from your visitors. Surely, once you get started, you won't stop testing changes.