A/B Testing Statistics: How Many Companies Use It in 2023?

Written by: Muninder

Updated: April 18, 2023

A/B testing (also known as split testing) compares one or more variants of a webpage against the original to determine which version performs better.
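To make the mechanics concrete, here is a minimal sketch in Python (the function name and visitor IDs are illustrative, not from any particular tool) of how a testing platform typically buckets visitors: each visitor is deterministically assigned to a version so they always see the same one, and conversion rates are then compared between buckets.

```python
import hashlib

def assign_variant(visitor_id: str, variants=("original", "variant_b")) -> str:
    """Deterministically bucket a visitor so they always see the same version."""
    digest = int(hashlib.md5(visitor_id.encode()).hexdigest(), 16)
    return variants[digest % len(variants)]

# The same visitor always lands in the same bucket across sessions
print(assign_variant("visitor-42"))
```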

It is estimated that more than 70% of companies are running at least two tests a month, helping them make informed decisions on what changes should be implemented to boost user experience and increase conversions. 

Before we get into the details of A/B testing, its benefits, and success stories, here are the most important stats to remember.

  • 71% of companies are running two or more tests a month
  • 60% of companies are running tests on their landing pages
  • For ecommerce websites, A/B testing can increase the average revenue per visitor by 50%
  • At Bing, around 80% of suggested changes are run as controlled tests first
  • The A/B testing software market is set to reach $3.4 billion by 2032
  • 60% of companies consider A/B testing to be “highly valuable” for conversion rate optimization
  • Four out of five SEOs witnessed an increase in organic traffic after an A/B test

Now, check out the latest A/B testing statistics to get the most out of this powerful tool. 

A/B Testing Usage Statistics

1. 71% of companies are running two or more tests a month. 

Although the majority of companies (77%) are using A/B testing to optimize their websites, many also run A/B tests on their:

  • Landing pages: 60% of companies
  • Email campaigns: 59% of companies
  • Paid search: 58% of companies (Invesp)

2. Ecommerce companies are using split testing more than other industries. 

Judging by data collected by VWO, the ecommerce industry is the most active user of A/B testing: as many as 39% of VWO's clients are ecommerce businesses. They are followed by consultancy and digital marketing agencies (25%) and SaaS companies (10%).

When done right, split testing can pay off massively for ecommerce websites, increasing the average revenue per unique visitor (around $3) by up to 50%. (VWO)

3. 75% of the top 500 online retailers use an A/B testing platform.

Leading companies like Microsoft, Facebook, Netflix, and Booking.com continually run thousands of online experiments. Even companies without digital roots, such as Walmart, Singapore Airlines, and Hertz, perform A/B tests to improve user experience, identify weak points in the conversion funnel, and increase their ROI from organic traffic.

In addition to online retail, split testing has found applications across various industries and sectors, from politics to brick-and-mortar locations. (Wired, Harvard Business Review)

4. Around 61% of companies run fewer than five tests a month on their landing pages. 

This means that more than half of organizations are not making full use of split testing to optimize their conversions, even though only 22% are happy with their current conversion rates. (AdPushup)

5. Bing performs around 1,000 tests a month. 

In fact, at Bing, around 80% of suggested changes are run as controlled tests first. 

Changes implemented as a result of monthly A/B tests have helped Bing increase revenue per search by 10 to 25%. In addition to a revenue boost, split testing has also improved user satisfaction. This in turn has ensured that Microsoft’s search engine has remained profitable and increased its share of searches in the US from 8% to 23%. (AltexSoft, Marketing Mag, Harvard Business Review)

6. Google runs over 10,000 A/B tests a year.

Google also uses split testing, having run its first A/B test back in 2000 to check whether the 10-results-per-page layout was optimal for users. Due to a technical glitch, the experiment failed, but it provided invaluable insight into how loading speed affects user satisfaction.

In 2011, the company performed over 7,000 A/B tests on its search algorithm, and today it reportedly carries out more than 10,000 tests a year.

General A/B Testing Statistics 

7. Only 7% of companies believe that split testing is hard to put into practice. 

A/B testing may sound like a very complex tool, but in fact, 63% find that implementing split testing is not difficult. (Invesp)

8. Titles are the most popular element to test on a website.

86% of marketers say they experiment with page titles, making this the most popular on-page element to test. Titles are followed by meta descriptions and headline tags, tested by 71% and 51% of SEOs respectively.

Surprisingly, even though 80% of SEOs identify internal links as essential for on-page optimization, only 42% test them. Schema markup is another on-page element that is considered important (as stated by two-thirds of respondents) though it is only tested by 46% of marketers. (Semrush)

9. A/B tests can run from one hour to two months. 

The actual time frame depends on the element you are testing. At VWO, for example, 40% of marketers create a test within 60 minutes. Power testers, who set up almost eight times more tests than average, are rare, making up only 5% of VWO's active users. (VWO)

10. A/B testing is the most popular CRO method.

According to a recent survey, as many as 60% of companies are already using split testing to optimize conversion rates. An additional 34% plan to adopt this method, while only 6% do not intend to use it. 

Website personalization, used by just 23% of the companies surveyed, is the least popular method to impact conversion rates. (Smart Insights)

Email A/B Testing Statistics 

11. Ecommerce companies that run A/B tests on their email campaigns generate 20% more revenue. 

Not only that, but brands that test each email in their campaigns get 37% higher ROIs than brands that never carry out A/B tests. (Litmus)

12. 39% of brands don’t A/B test their broadcast or segmented emails. 

A report from Litmus suggests that 39% of brands never or rarely test their broadcast or segmented emails. The same is true for automated and transactional emails: over 65% of organizations don't A/B test their automated emails, whereas 76% never or rarely experiment with their transactional emails. (Litmus)

13. 39% of companies test their email’s subject line. 

Most organizations worldwide focus on split testing the key elements, such as subject lines (39%), content (37%), date and time the email was sent (36%), and preheaders (23%). (Litmus)

14. 89% of US companies run A/B tests on their email campaigns. 

The US takes the lead when it comes to A/B testing, with 36% of responding companies in a Mailjet survey saying they run tests on their email campaigns most of the time. In France and Germany, on the other hand, over 20% of the organizations surveyed reported that they never perform split testing. (Marketing Land)

A/B Testing Software Statistics 

15. The A/B testing software market was valued at $1.1 billion in 2022.

The market is set to reach $3.4 billion by 2032, growing at a projected CAGR of 11.6%. The US is expected to hold the largest market share, going up to $1.2 billion by 2032 and expanding at an estimated CAGR of 11.4%.

Web A/B testing software will be the highest revenue-generating category, registering a growth of 11.3%. (Future Market Insights)

16. Just 44% of companies use A/B testing software.

Despite the prevalence of split testing software, fewer than half of companies use it. As a matter of fact, according to BuiltWith, only 1,103,705 websites are currently using A/B testing technologies. Even more surprisingly, only 3,484 of the 10,000 top-ranking sites by traffic use A/B testing tools. (Invesp, BuiltWith)

17. Google Optimize 360 is the most popular A/B testing tool. 

With a market share of 51%, Google Optimize 360 is the most used A/B testing technology worldwide. It is followed by US company Optimizely (22% share) and business analytics service company Mixpanel (10%). (BuiltWith)

Benefits of A/B Testing

18. More than half of companies believe split testing can improve conversion rates. 

60% of companies consider A/B testing to be “highly valuable” for conversion rate optimization. Likewise, split testing is highly rated among conversion rate optimization experts, who give it a score of 4.3 out of 5 and place it just below digital analysis as the most useful CRO testing method. (Invesp, CXL)

19. Companies that use A/B testing get 10% more weekly page views. 

Organizations that adopt A/B testing tools show better results than those that don't, including a 5% higher chance of getting venture capital funding and launching 9 to 18% more products. (Harvard Business Review)

20. Four out of five SEOs witnessed an increase in organic traffic after an A/B test.

On top of that, 74% saw an uplift in organic CTRs while 49% reported an increase in customers. (Semrush)

A/B Testing Fail Statistics 

21. Only one in eight A/B tests drives significant change.

This seems to suggest that a lot of companies are running tests but not seeing results. At Google and Bing, for instance, only around 10% to 20% of experiments actually generate positive results. Overall, at Microsoft, just one-third of tests prove effective, one-third yield neutral results, and one-third have negative results. (Invesp, Harvard Business Review)

22. Around 57% of marketers end A/B tests once they confirm their original hypothesis.

It is estimated that over half of experimenters stop a split test as soon as their original hypothesis appears to be proven. This practice is a form of p-hacking, a misuse of data analysis that leads to unreliable results.

On top of that, Convert estimates that only 20% of tests reach the 95% statistical significance mark, which means that around 80% of experiments are stopped before they yield relevant results. 

A/B testing statistics further indicate that letting a test run its full course of 3 to 14 days is more likely to surface significant changes, in the range of -50% to +200% rather than a marginal 1 to 2%. (Adoric, VWO, Convert)
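To make the 95% significance mark concrete, here is a minimal sketch in Python of a standard two-proportion z-test (the function name and example numbers are hypothetical): it checks whether the difference in conversion rates between two variants is unlikely to be chance at the 95% confidence level.

```python
import math

def is_significant(conv_a, n_a, conv_b, n_b, alpha=0.05):
    """Two-sided two-proportion z-test for an A/B experiment."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled rate under the null hypothesis that both variants convert equally
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value via the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value, p_value < alpha

# Example: 500 vs. 565 conversions out of 10,000 visitors per variant
z, p, significant = is_significant(500, 10_000, 565, 10_000)
print(f"z = {z:.2f}, p = {p:.4f}, significant: {significant}")  # p ≈ 0.04
```

Stopping the moment p dips below 0.05, rather than at a preplanned sample size, is exactly the optional-stopping behavior described above.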

23. An online experiment needs 25,000+ visitors to provide relevant results. 

To get pertinent results, a company needs large customer samples and substantial data on user interactions across its websites and apps. While some marketers suggest that reliable results can be obtained with a minimum of 5,000 unique visitors, a larger sample size will yield more trustworthy findings. (VentureBeat, AB Tasty)
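As a rough illustration of why samples need to be that large, here is a sketch in Python of the standard sample-size formula for a two-sided test at 95% confidence and 80% power (the helper name and the 3% baseline are hypothetical examples):

```python
import math

def visitors_per_variant(baseline, relative_lift, z_alpha=1.96, z_power=0.84):
    """Approximate visitors needed per variant to detect a given lift.
    baseline: current conversion rate (0.03 = 3%)
    relative_lift: minimum detectable effect (0.10 = a 10% relative lift)"""
    p1 = baseline
    p2 = baseline * (1 + relative_lift)
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return math.ceil((z_alpha + z_power) ** 2 * variance / (p1 - p2) ** 2)

# Detecting a 10% lift on a 3% baseline takes roughly 53,000 visitors per variant
print(visitors_per_variant(0.03, 0.10))
```

The lower the baseline conversion rate and the smaller the lift you want to detect, the more visitors the test needs.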

24. 68% of companies say they don’t have access to the right split testing tool.

Based on a 2021 survey, only 65% of responding companies stated that they run A/B tests on their SEO strategies. Most of them (68%) said they do not have the right tool or time to perform relevant tests, while 51% stated that they have no budget allocated to run SEO tests. 

The biggest challenge, though, is arriving at a clear result: 51% of marketers surveyed said they couldn't determine when a test is statistically significant, making it difficult to interpret results and evaluate whether an experiment was a success. (Semrush)

25. Almost 53% of CROs use a prioritization framework for their A/B tests.

Using a prioritization framework is one of the essential steps in running A/B tests. According to the most recent data, a little over half of CROs (52.8%) use a test prioritization framework. (CXL)

A/B Testing Case Studies 

Former US President Barack Obama raised $75 million from donations thanks to A/B testing. 

In 2007, Dan Siroker, digital adviser for the Obama presidential campaign, used split testing to confirm the success of design changes on the campaign website. Thanks to the rigorous controls and data collection involved in the split test, the campaign managed to collect four million emails out of the 13 million addresses on its email list and raise $75 million in donations.

During the 2012 presidential campaign, A/B testing helped the digital team raise $250 million in six months and increase donation conversions by a whopping 49%. (Wired, BrightEdge)

HubSpot's personalized sender test increased open rates by 0.53%.

HubSpot tested the effects of sending cold emails with a personalized sender name rather than a generic company name. The personalized version of the email had a 0.53% higher open rate and a 0.23% higher CTR. This small tweak also helped the company gain 131 leads. (Campaign Monitor)

Bing increased revenue by 12% through A/B testing.

In 2012, a Bing employee ran an online controlled experiment to evaluate the impact of changes made to displayed ad headlines. The new headline variation proved to be a huge success, increasing revenue by 12% and netting more than $100 million for Microsoft’s search engine.

Another controlled experiment run by Microsoft resulted in a 5% increase in clicks per user. The test, which involved opening links in new tabs, originally ran in the UK, where engagement rates increased by 8.9%. (Harvard Business Review)

Yahoo’s display ads experiment. 

A/B testing is not only used to boost revenue. When performed in the right way, it can expose wrongful assumptions and save companies from making mistakes worth thousands of dollars.

Before their experiment, Yahoo estimated that display ads for a brand shown on their sites increased searches by 871% to 1,198%. However, the A/B test indicated the growth to be only 5.4%. If it hadn't been for the test, the company might have attributed the boost in searches to ads alone rather than accounting for other variables. (Harvard Business Review)

Testing Google’s shades of blue added another $200 million in revenue. 

Google famously tested over 40 different shades of blue until it found the right hue for its CTA button. This controlled experiment allowed the company to rake in an additional $200 million in revenue. (Medium)

AWeber’s creative subject line.

In a test that ran across 20 subject lines and involved 45,000 subscribers, AWeber discovered that clear and simple email subject lines got 514% more responses than the creative, catchy versions. (MarketingSherpa)

Dell’s landing page testing 

When Dell tested its landing pages against standard web pages, the company witnessed a 300% increase in conversion rates. (DMI)

Slight wording changes in PPC ads can double conversion rates, A/B testing stats reveal. 

Sometimes, marketers following instinct alone can make the wrong choices; that's where A/B testing comes in. For instance, consider these two options: "A. Get $10 off the first purchase. Book online now!" and "B. Get an additional $10 off. Book online now." Marketers in charge of the project would likely go for option A, but A/B testing actually showed that option B converted better and doubled the CTR. (WordStream)

Bottom Line 

When done right and consistently, A/B testing allows companies to assess several ideas quickly and accurately at a negligible cost and without affecting the key aspects of user experience. 

Keep in mind that split testing can’t fix all issues on a website and might not be the most suitable method when it comes to quick decisions and judgment calls. However, if you have a notion of what works and what doesn’t, as well as the data to support your ideas, you can make easier and more effective marketing decisions through A/B testing.
