We started thinking about a native A/B testing solution for WordPress after the summer of 2013. Our first business plan, which came some months later, described a SaaS business. As you may know, creating SaaS solutions for WordPress has never been easy, and it can easily get worse when you have to start from scratch (unless, of course, you “inherit” a name that’s well-known in the community, which was not our case back then). After a great deal of effort and development hours, we finally released the first version of our Nelio A/B Testing plugin and uploaded it to the WordPress repository. Since the release of our MVP, which only supported A/B testing experiments for posts and pages, Nelio A/B Testing for WordPress has evolved a lot, and it’s now a great success with our customers.
We’ve just reached our first year providing conversion rate optimization services for WordPress, and we wanted to analyze the results and statistics we’ve obtained during this period. And, of course, we’d love to share some of our findings with you here.
First, some facts about the development process
Downloads of Nelio A/B Testing from the WordPress repository have surpassed the 6,000 threshold during this first year (as shown in the chart below), and all the positive comments and suggestions from our audience indicate that we are on the right track with our service.
Since the beginning of Nelio A/B Testing, we’ve been committed to providing the best user experience to our customers. Our development roadmap this year has been very aggressive (and our plan is to continue at the same pace). We’ve released new features almost every month, including new types of A/B testing experiments and heatmaps, which we know our customers have enjoyed.
We’ve launched a total of 11 major releases of Nelio A/B Testing, working very hard to fix all the issues and incompatibilities our customers pointed out and, of course, adding new (and sometimes never-before-seen) features—for instance, A/B testing widgets in WordPress! We’ve learned a lot and, thanks to the experience gained during the last year, both the quality of Nelio A/B Testing and the features it includes have really improved. But we’re never satisfied, and we are always trying to increase the quality of our conversion rate optimization service.
The most popular A/B experiments of our clients
In a previous post, I studied which were the most loved A/B testing experiments (early May 2014). The results showed that most of our customers were delighted with A/B experiments of pages and with heatmaps and clickmaps. These two were the most popular experiments among our customer base, and we can confirm that this trend still holds true.
A/B testing a page and exploring its heatmap data is one of the entry-point features of our service. It’s always a good idea to get started with split testing through an experiment of alternative pages. And you know what? You can combine these two experiments. Just create an A/B experiment of pages and run it. Once you get the first results of the experiment, you can explore the heatmaps and clickmaps of any of the original or alternative pages that belong to the experiment. The integration of split testing and heatmaps was a win, and as you can see, it still is. No other WordPress plugin provides this testing combination with this level of simplicity.
Furthermore, the people who use Nelio every day to improve the conversion rate of their WordPress sites have created almost 1,300 experiments during our first year. Keep it up! You are doing a great job increasing your conversion rates and getting a better understanding of your visitors.
These experiments have tracked a great deal of information from real visitors, which is the main tool we have to improve based on data rather than opinions. And don’t worry: the data our plugin gathers (of course) does not contain any personal information. We only track page views, clicks and mouse movements around pages, form submissions (the fact that a form was submitted, not the data filled in the form), and views of titles and headlines.
In fact, our service has tracked more than 4.5 million visitor interactions on WordPress sites, as you can see in the following chart. That’s a lot of information, but the important thing is that you don’t need to keep it in your WordPress database. Everything is stored on our cloud servers, so the impact of tracking data on your WordPress installation is minimal. Most of the A/B testing plugins out there clutter your database, and that’s something we decided to avoid from the beginning.
The last curiosity about the data tracked by our service concerns the different screen resolutions visitors use when accessing a WordPress site. We take the resolution of their devices into account when tracking heatmap data. Specifically, we classify interactions into four groups with respect to resolution:
- Phone (up to 360px width)
- Tablet (up to 768px width)
- Laptop (up to 1024px width)
- Desktop (more than 1024px width)
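To illustrate, the four breakpoints above can be expressed as a simple classification function. This is just a sketch based on the widths listed in this post; the actual grouping logic in our tracking code may differ (for example, in how it reads the viewport width):

```python
def resolution_group(width_px):
    """Classify a viewport width (in pixels) into one of the four
    resolution groups used when aggregating heatmap data."""
    if width_px <= 360:
        return "Phone"
    elif width_px <= 768:
        return "Tablet"
    elif width_px <= 1024:
        return "Laptop"
    else:
        return "Desktop"

# A few example widths and the groups they fall into.
for w in (320, 768, 1024, 1920):
    print(w, resolution_group(w))
```

Note that each group is defined by its upper bound, so a 768px-wide tablet in portrait mode and a 700px-wide browser window on a laptop would land in the same “Tablet” bucket.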
The following chart shows the data distribution along these four resolution groups.
It is easy to observe that most of the tracking information comes from high-resolution devices, which reflects a real trend in the current market. Nowadays, resolutions keep growing (1080p, 4K, etc.). With that in mind, it is easy to understand why putting our efforts and resources into improving the user experience on high-resolution devices must come first. Then, we can focus on improving content for mobile phone resolutions.
What kind of results can I expect?
We examined the A/B testing experiments run during the past year to measure their impact. We wanted to see whether the experiments our customers were performing were really helping them improve their conversion rates.
The analysis revealed that, on average, 47.22% of the A/B testing experiments found a winning version among the set of alternatives tested (including the original version). And among the experiments with a winner, in 58.44% of them the winning version was one of the alternatives created for the experiment (not the original one). But the question is: once an experiment finds an alternative that beats the original version, what improvement do we get from it? The next chart summarizes the answer to this question, and it’s amazing!
On average, the A/B testing experiments we analyzed in which one alternative won produced an improvement in the conversion rate of the element under test of more than 75% (even more in the case of experiments on themes, posts, and titles). But what does that mean? Well, let’s assume our WordPress site has a conversion rate of 1%, that is, one visitor out of 100 converts (they do what we want them to do, e.g. visit a page, click on a link, submit a form, etc.). The following chart shows what the conversion rate would be after performing a successful A/B testing experiment, taking into account the improvement factor mentioned above. As you can see, the new conversion rates dramatically improve the performance of the element under test.
We can go further and study the real benefit in a situation like the one described above. Again, let’s assume that with the original 1% conversion rate the hypothetical website produces a revenue of US$1,000. After applying the winning versions instead of the original content, and assuming we get the conversion rates indicated in the previous chart, we can construct the following chart. Here you can see in green the additional revenue our customers may get thanks to the winning alternatives discovered using our service!
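To make the arithmetic behind this scenario concrete, here is a quick sketch of the computation. The 1% baseline, US$1,000 revenue, and 75% improvement are the illustrative figures used above, and the sketch assumes revenue scales linearly with the conversion rate:

```python
def improved_metrics(base_rate, base_revenue, improvement):
    """Apply a relative conversion-rate improvement and return the new
    conversion rate and the new revenue, assuming revenue grows in
    proportion to conversions."""
    new_rate = base_rate * (1 + improvement)
    new_revenue = base_revenue * (1 + improvement)
    return new_rate, new_revenue

# Illustrative figures: 1% conversion rate, US$1,000 revenue, 75% uplift.
rate, revenue = improved_metrics(0.01, 1000, 0.75)
print(round(rate, 4))  # 0.0175 -> a 1.75% conversion rate
print(revenue)         # 1750.0 -> US$750 of additional revenue
```

In other words, a 75% relative improvement turns a 1% conversion rate into 1.75%, and the US$1,000 of revenue into US$1,750.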
That growth in revenue through split testing means that Nelio A/B Testing clearly pays for itself. Data don’t lie, so it’s up to you to start split testing your site now in search of versions of your content that convert better.
Finally, if you’re considering A/B testing your site, you may be wondering how long experiments have to run before producing relevant results. As you can imagine, this depends on the number of alternatives and the amount of traffic you have. Luckily, I can show you this data for the experiments that ran in Nelio during the past year. The next (and final) chart shows the average number of days experiments were running, broken down by the kind of experiment. In all cases, experiments ran for less than a month, and most of them didn’t surpass two weeks. As you can guess, that’s not long considering the benefits you get in return.
If you reached the end of this post, congratulations! Here’s the summary of a year of Nelio A/B Testing:
- Our plugin was downloaded more than 5,000 times.
- We’ve launched 11 major releases of it, and a bunch of minor ones. We are committed to improving it day after day.
- The most popular experiments are A/B testing experiments for pages and heatmap experiments. And you can combine them!
- Our service has tracked more than 4.5 million visitor interactions on WordPress sites.
- The most common resolution tracked is for desktop devices with more than 1024px width.
- On average, in the A/B testing experiments where one alternative beat the original version, the conversion rate of the element under test improved by more than 75%. That means you can almost double the revenue you get from your site.
Don’t hesitate to join the community of Nelio A/B testers. We’d love to have you on board!