Today I bring you another A/B test that we have been running for a few months. As always within this series of monthly articles, we are going to see in detail the improvement hypothesis we have made, its application by creating variations of a page on our website, and the results we obtained.
Some time ago we studied the color of the action buttons on the main page of Nelio Content. Thanks to an A/B test of CSS styles, we saw that green worked best for the English-speaking visitors of that page.
We cannot assume that this will be the case for every page. That's why today we are going to test the color and size of the checkout buttons in the pricing table that appears on the Nelio A/B Testing purchase page.
Definition of the A/B test
Our improvement hypothesis is that changing the color and size of the checkout buttons in the Nelio A/B Testing pricing table will get more clicks on those buttons and, as a consequence, more sales.
The original version of the Nelio A/B Testing pricing table shows the three plans that we sell, each with its price, details, and a button to subscribe to the service. You can see it in the following screenshot:
The buttons are a gray that barely stands out, except for the professional plan, which has a dark blue button to give it more emphasis than the other two.
The first thing we want to check is whether making these buttons more striking by changing their color works better or not. To do so, the alternative version we propose changes the CSS styles with new colors, as you can see below:
We have kept the professional plan highlighted, but now the other two plans have dark blue buttons and the professional plan gets a more prominent orange one. According to studies in color psychology, orange is a good choice for action buttons, which is why we chose the orange you see in the previous image for the professional plan.
As our hypothesis stated that the size of the buttons should also affect their performance, we kept the color changes from the previous variation and created a new one in which the button sizes are increased. You can see this change in the following screenshot:
Therefore, we have three different versions of the CSS styles of the Nelio A/B Testing pricing table:
- The original, without changes in styles.
- A first variant with more prominent colors on the buttons.
- A second variant with prominent colors and a larger button size.
With all this work done, we can translate it into an A/B test of CSS styles. Nelio A/B Testing lets you create a test of different CSS styles with which you can easily try style changes in WordPress.
We create the new CSS A/B test and its three alternatives. In fact, the screenshots of the variations you saw above come from the CSS style editor that Nelio A/B Testing includes for creating alternatives.
As you can see in the screenshot above, we limited the scope of the test to the Nelio A/B Testing pricing page and defined five goals to measure within it:
- Clicks on the buttons of any plan.
- Clicks on the buttons of the basic plan.
- Clicks on the buttons of the professional plan.
- Clicks on the buttons of the enterprise plan.
- Number of purchases.
With all this created, something that will not take you more than 10 minutes, we start the test and wait for the results to come in. Now it's Nelio A/B Testing's turn to do its job. Our plugin is in charge of splitting the incoming traffic between the different variations and counting conversions for each goal defined in the A/B test.
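Nelio's internal implementation is not public, but the traffic-splitting step it performs can be sketched with a common technique: hashing a visitor identifier so that each visitor is deterministically and evenly bucketed into one variant. The function and variant names below are illustrative assumptions, not Nelio's actual code:

```python
import hashlib

# Hypothetical variant names matching the three versions of this test.
VARIANTS = ["original", "new-colors", "new-colors-and-sizes"]

def assign_variant(visitor_id: str, test_id: str = "pricing-css") -> str:
    """Deterministically bucket a visitor into one of the variants.

    Hashing the test and visitor IDs together yields a stable,
    roughly uniform split without storing any per-visitor state:
    the same visitor always sees the same variant.
    """
    digest = hashlib.sha256(f"{test_id}:{visitor_id}".encode()).hexdigest()
    bucket = int(digest, 16) % len(VARIANTS)
    return VARIANTS[bucket]
```

A stateless split like this also survives server restarts, since the assignment is recomputed identically from the IDs each time.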
Analysis of the results of the A/B test
This A/B test ran for three months on our website. The results can be seen below. We are going to analyze each goal in detail to understand the conclusions we have drawn from this test.
The first goal measured clicks on any of the plan buttons in the pricing table. In this case, the results show that the variant with the changed colors achieves 17.2% more clicks. On the other hand, the variant with changed colors and sizes performs 15% worse than the original version of the page.
However, neither of these numbers reached a sufficient level of statistical confidence to clearly identify a winning version for this goal.
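To see why a 17.2% uplift can still fail to be conclusive, here is a minimal sketch of a standard significance check, a two-proportion z-test, on made-up numbers. This is textbook statistics for comparing two conversion rates; it is not necessarily the exact test Nelio A/B Testing runs internally:

```python
from math import erf, sqrt

def two_proportion_z_test(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """Two-sided z-test comparing two conversion rates.

    Returns the p-value; below 0.05 is a common (and arbitrary)
    threshold for declaring a winner with confidence.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the normal CDF.
    return 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))

# Hypothetical numbers: a ~17% relative uplift (5.0% vs 5.9% click rate)
# that is not significant at 0.05 because the samples are small.
p = two_proportion_z_test(conv_a=50, n_a=1000, conv_b=58, n_b=990)
```

With small samples, a sizable relative difference can easily be noise, which is exactly why the results above cannot name a winner despite the 17.2% gap.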
The second goal counted clicks only on the buttons of the basic plan. Here the results are similar to the previous goal: the version with changed colors is the best of the three and the version with changed colors and sizes is the worst.
Similarly, the statistics cannot identify the version with the new colors as a clear winner with enough confidence.
The case of the third goal, where we measure clicks on the professional plan, is a little different. As before, the version with the new colors is better and the version combining colors and sizes is worse. However, this time the statistics do identify the winning version with enough confidence.
We can say that by using more prominent colors we get a greater number of clicks: the orange button works better than the blue button we had before.
In the case of clicks on the enterprise plan, we return to the same situation as the first two goals, although here the differences between the original variant and the variant with changed colors are negligible. The version that does seem much worse is the one that includes changes in both color and size.
All this clicking stuff is fine. But if we only look at the previous results, we get a partial view of reality.
For this reason, I added the last goal, which measures the number of purchases each variant achieves. A variant may get more clicks, but if it is not able to get more sales as well, you would be choosing a false winner.
In the following results we see that this is exactly what happens. The version with the new colors achieves 24.5% fewer sales, while the version that combines colors and sizes sells 16.6% less than the current version of our page (the one without changes).
These results remind us that the conversion funnel is complex and that, although we often divide it into phases and count the micro-conversions, we must not lose sight of the whole picture.
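The trap described above can be made concrete with a tiny funnel calculation. The numbers below are made up for illustration, chosen in the spirit of this test's results (more clicks, fewer sales); they are not the actual test data:

```python
def funnel_rates(visitors: int, clicks: int, purchases: int) -> dict:
    """Per-step and end-to-end conversion rates for one variant."""
    return {
        "click_rate": clicks / visitors,
        "purchases_per_click": purchases / clicks,
        "overall_conversion": purchases / visitors,
    }

# Hypothetical data: the variant wins the micro-conversion (clicks)
# but loses the macro-conversion (purchases).
original = funnel_rates(visitors=1000, clicks=100, purchases=20)
variant = funnel_rates(visitors=1000, clicks=117, purchases=15)
```

Judged on `click_rate` alone the variant looks like a winner, yet its `overall_conversion` is lower: the metric closest to the business outcome is the one that should decide the test.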
The advantage of creating multi-goal A/B tests is that we can have all the perspectives we need to get the complete picture of how well (or badly) our website is performing. Only by doing this will we be able to choose a winner with complete peace of mind, backed by the data from the A/B test results.