
In the highly competitive landscape of online sales, a clear and compelling pricing page is crucial to converting visitors into customers. Did you know that many potential customers abandon their buying decisions due to confusion or a lack of information about what they are getting for their money?

Today we present a new A/B test we’ve been running on our website to address these problems head-on.

As always in this series of posts, we will look in detail at the improvement hypothesis we formulated, how we applied it by creating variations of a page on our website, and the results we obtained.

Testing Hypothesis

Our A/B test introduces a detailed feature table on our pricing page that can provide a clear comparison of what each pricing tier offers, helping our customers make informed decisions quickly and confidently.

Our hypothesis is that enhancing transparency, by giving potential customers a clear understanding of what they get and by highlighting the unique value propositions of each plan, will underscore the value of our products, significantly reduce hesitation, and increase conversion rates.

So let’s put this hypothesis to the test and see whether we were right.

Definition of the A/B test

To test the changes properly, let’s create an A/B test that pits the current page against a new version with the changes we just described. To do so, let’s go to the Nelio A/B Testing menu on our WordPress dashboard and create a new A/B test of pages:

Editing the A/B test on the Nelio Content pricing page
Creating the A/B test on the Nelio Content pricing page.

First, we select the page we want to test and create a variant, which we can then edit to match our proposed layout. One thing that’s great about Nelio A/B Testing is that you can edit alternative content using your own page builder (in our case, Gutenberg). So we edit the pricing page and insert the detailed comparison table just as we would edit any other page.

Adding the feature table on the pricing page
Adding the feature table on the pricing page.

Note, as you can see in the screenshot above, that the main difference between editing A/B test pages and editing your original site pages is that the former includes Nelio A/B Testing’s sidebar, which lets you go back to editing the A/B test.

After editing the variant, we proceed to create our conversion goals. That is, what we want our visitors to do on this page. As you may have already guessed, we want to increase sales. For that, customers must first click on a subscription button for one of the plans shown and then complete the payment.

To better understand the behavior of our users, we decided to create five goals:

  • Click on any plan: we want to know the total number of clicks we get on any of the plans offered.

As you can see in the A/B test creation screenshot, for this goal we have defined as the conversion action that the visitor clicks on an element matching the CSS selector a[data-fastspring-product], which corresponds to the “Get Started” button of any plan.

  • Three goals to know the exact clicks on each plan: Click Basic, Click Standard and Click Plus.

In these three cases, we have defined as conversion actions the clicks on elements matching the CSS selectors a[data-fastspring-product="nc-yearly"] and a[data-fastspring-product="nc-monthly"] for the Basic plan, a[data-fastspring-product*="standard"] for the Standard plan, and a[data-fastspring-product*="plus"] for the Plus plan, respectively.

  • Purchase: we want to know how many purchases have been made.
Purchase conversion action
Purchase conversion action.

To do this, we have defined the conversion action as a visit to the thank-you page, on which the visitor always lands after completing the purchase.
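For intuition, the selectors above use two kinds of attribute matching: exact matches for the Basic plan’s product IDs and `*=` substring matches for the Standard and Plus plans. A minimal sketch of how a clicked button’s data-fastspring-product value maps to the five goals (the Standard and Plus product IDs below are hypothetical examples, not our real ones):

```python
def goals_for_click(product_id):
    """Return the conversion goals a click on a plan button would trigger."""
    # a[data-fastspring-product] matches any button that has the attribute.
    goals = ["Click on any plan"]
    # Exact-match selectors for the Basic plan.
    if product_id in ("nc-yearly", "nc-monthly"):
        goals.append("Click Basic")
    # a[data-fastspring-product*="standard"] is a substring match.
    if "standard" in product_id:
        goals.append("Click Standard")
    # a[data-fastspring-product*="plus"] is also a substring match.
    if "plus" in product_id:
        goals.append("Click Plus")
    return goals

print(goals_for_click("nc-yearly"))
print(goals_for_click("nc-standard-yearly"))  # hypothetical ID
```

Note that every plan click also counts toward the “Click on any plan” goal, since that selector matches any element carrying the attribute.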

With all this, we are now ready to start the test and let the A/B testing tool (in our case, Nelio A/B Testing) collect the data.

Analysis of the A/B testing results

We started this test on April 9th and ran it for almost two months (1 month and 24 days).

First of all, we will focus on the results from a performance point of view. Which variant achieved the most subscriptions?

Purchase conversion action results
Purchase conversion action results.

As you can see in the screenshot above, the graph shows a 215.4% improvement in the conversion rate of Variant B over the original version. You might think: WOW, that’s awesome!

But don’t be too hasty: this result is worthless unless it is statistically significant. That is, unless there is enough statistical evidence that the new variant likely has a higher conversion rate than the original. Unfortunately, the results show that “no variant seemed better than the rest.”

How is this possible? In this case, the sample is probably too small, or there are not enough conversions in either version, for the difference between the two to be considered strong enough evidence of a real impact in the future. So although it may seem that Variant B is clearly better, from a statistical point of view the difference may be purely coincidental.
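To see why a big relative lift can still be inconclusive, here is a quick two-proportion z-test, one standard way to assess this kind of difference. The visitor and conversion counts below are made-up illustrations, not our actual numbers:

```python
import math

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled rate under the null hypothesis that both variants convert equally.
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the normal distribution.
    p_value = math.erfc(abs(z) / math.sqrt(2))
    return z, p_value

# Illustrative only: a 200% relative lift (6 vs. 2 conversions)
# on 500 visitors per variant.
z, p = two_proportion_z_test(conv_a=2, n_a=500, conv_b=6, n_b=500)
print(round(p, 3))  # well above the usual 0.05 threshold
```

With so few conversions, the p-value stays far above 0.05, so even a dramatic-looking lift cannot be distinguished from chance.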

If we look at the results in terms of user clicks on the different plans, the results are also inconclusive because the analyzed visitor samples are, again, too small.

It might seem that, indeed, including the comparison table makes users more likely to click on a plan. However, as I said, we cannot be sure whether this result is pure coincidence or whether applying Variant B to this page will actually improve our conversion rate.
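A related question is how much traffic a test like this would need before it could be conclusive. As a rough sketch, here is the standard two-proportion power calculation (95% confidence, 80% power); the baseline conversion rate and lift below are hypothetical, not our actual traffic figures:

```python
import math

def sample_size_per_variant(p_base, lift, alpha_z=1.96, power_z=0.84):
    """Approximate visitors needed per variant to detect a relative lift
    (two-sided alpha = 0.05 and power = 0.80 by default)."""
    p_new = p_base * (1 + lift)
    p_bar = (p_base + p_new) / 2
    numerator = (alpha_z * math.sqrt(2 * p_bar * (1 - p_bar))
                 + power_z * math.sqrt(p_base * (1 - p_base)
                                       + p_new * (1 - p_new))) ** 2
    return math.ceil(numerator / (p_new - p_base) ** 2)

# Hypothetical baseline: with a 0.5% conversion rate, reliably detecting
# even a 100% relative lift needs thousands of visitors per variant.
print(sample_size_per_variant(p_base=0.005, lift=1.0))
```

The takeaway: the lower the baseline conversion rate and the smaller the lift you want to detect, the more visitors the test needs, which is why low-traffic purchase goals so often end up inconclusive.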

Now, let’s take a look at the heatmaps to see if we observe any difference between the behavior of users landing at these pages.

As you can see in the two images above, there are no notable differences between the two variants. The hottest areas in both cases are the subscription buttons and the toggle for changing the subscription type (monthly vs. annual).

What about the scrollmaps?

Scrollmaps are useful to detect scroll depth. That is, how far users scroll down the page and the point beyond which they no longer navigate.

In variant A, 33% of visitors saw the section describing the most important features included in all plans. In variant B, 50% of visitors reached that same point. Although the comparison table appears after this section, it seems to make users curious enough to dig a little deeper into the page.

Finally, let’s take a look at clickmaps. In these maps, in addition to seeing where users click, you can also filter this information by certain characteristics, such as the browser used, the country of access, the day of the week the click was made, the device type, the operating system, the time of day, the time elapsed from page load until the click, and the window width.

On the clickmap filtered by time to click, we observe that, on average, users took longer to click on Variant B than on Variant A, which seems consistent with the results of the scrollmaps. My assumption is that adding the table detailing the features of each plan leads some users to spend more time examining this information before clicking.

Conclusion

The A/B test conducted this month, unfortunately, did not allow us to be certain that the proposed change in our Nelio Content pricing page will lead to an improvement in conversion. The main problem is that the number of conversions obtained is too small for the results to be conclusive.

However, with the scrollmap and clickmap we have seen that users who visited the variant with the comparison table stayed longer on it and probably examined the information it displays.

Although we have not been able to prove our hypothesis, that improving transparency and highlighting the unique value propositions of each plan will increase conversion rates, we do see that users show interest in viewing this information.

Therefore, after performing this test, we decided to apply variant B as definitive.


The great advantage of using Nelio A/B Testing is that this is as simple as clicking the Apply button and all users will see this page.

Featured Image by Sven Mieke on Unsplash.
