8 Top Mistakes Marketers Make with Website A/B Testing

Website A/B testing – the process of comparing two page variations against each other to determine which performs better – can be a valuable tool for optimising your site and improving the user experience.

However, if you’re not getting the results you expected, you could be making one of these top eight mistakes:

Mistake #1: Not testing with enough traffic

In order for A/B testing to be effective, you need enough traffic to reach statistical significance.

If your site receives fewer than 1,000 unique visitors per week, or fewer than 5–10 conversions per week, you may be better served by investing in traffic generation than in A/B testing.
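
For a rough sense of how much traffic a test actually needs, you can estimate the required sample size per variant with the standard two-proportion formula. In the Python sketch below, the 3% baseline conversion rate, the hoped-for lift to 4%, and the 95% confidence / 80% power settings are illustrative assumptions rather than recommendations:

```python
# A minimal sketch of a sample-size estimate for an A/B test, using the
# standard two-proportion formula. The baseline rate, expected uplift,
# significance level and power below are assumed values for illustration.
from math import sqrt
from scipy.stats import norm

def sample_size_per_variant(p1, p2, alpha=0.05, power=0.8):
    """Visitors needed in EACH variant to detect a change from rate p1 to rate p2."""
    z_alpha = norm.ppf(1 - alpha / 2)   # two-sided significance threshold
    z_beta = norm.ppf(power)            # desired statistical power
    p_bar = (p1 + p2) / 2
    numerator = (z_alpha * sqrt(2 * p_bar * (1 - p_bar))
                 + z_beta * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return numerator / (p2 - p1) ** 2

# Example: a 3% baseline conversion rate and a hoped-for lift to 4%
print(round(sample_size_per_variant(0.03, 0.04)))  # roughly 5,300 visitors per variant
```

Even a modest uplift like this can require a few thousand visitors per variant, which is why very low-traffic sites often struggle to reach statistical significance.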

Mistake #2: Not defining a hypothesis

Without a clear hypothesis as a starting point, it will be difficult to track and measure your results effectively. With A/B testing, you always need something to measure against. A few sample hypotheses to get you thinking include:

  • Images and Graphics: Will different designs, pictures, or colours produce more conversions?
  • Headline and Copy: Will short copy perform better than long copy? Will using bullet point lists instead of paragraphs increase engagement? 
  • Call to Action (CTA) Placement: Will changing the location of a CTA button or link on a page result in better performance?
  • Sign-Up or Purchase Form: Will a different form design, number of fields, or number of steps lead to more completions?

Taking a detailed look at your page analytics and understanding where the problems lie can help you define the right hypothesis to test. Differing opinions between team members can also help. Who’s actually right? Let your A/B testing decide!

Mistake #3: Not defining multiple hypotheses

Don’t lock yourself into a single hypothesis. Instead, you may need to explore multiple hypotheses to thoroughly explain and diagnose a problem.

For instance, imagine a lead magnet opt-in page. Creating testing hypotheses about the opt-in form – such as whether it should have fewer fields or be spread out onto multiple screens – is a good starting point. But if you get too caught up on optimising the form, you might miss issues associated with the value proposition of the lead magnet, the copy you use to describe it, or the CTA you use to drive opt-ins.

That’s the value of defining multiple possible hypotheses. Let your data dictate where your focus should be.

Mistake #4: Not testing long enough (or testing too long)

Testing conducted over too short a period will not produce comprehensive or reliable results. Don’t cut a test short, even if you feel you have a clear winner in the first few days.

At the same time, testing too long risks polluting the data you’ve captured. Be sure to end your tests as soon as statistically significant winners are identified.

To determine the correct testing duration, you will need to correctly weigh up factors such as your existing traffic, existing conversion rate and expected improvement. Some sources recommend a testing period of at least two weeks. However, if your site traffic is low, that may not be enough time to secure enough visits. Calculators such as this one from VWO can help you determine whether you’ve reached statistical significance.
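
If you’d like to sanity-check a result yourself rather than rely solely on an online calculator, a two-proportion z-test is one common way to assess significance. The visitor and conversion counts in this sketch are made-up figures, purely for illustration:

```python
# A minimal sketch of a two-proportion z-test for checking whether the
# difference between two variants is statistically significant. The visitor
# and conversion counts below are made-up figures for illustration.
from math import sqrt
from scipy.stats import norm

def ab_test_p_value(conversions_a, visitors_a, conversions_b, visitors_b):
    """Two-sided p-value for the difference in conversion rate between A and B."""
    p_a = conversions_a / visitors_a
    p_b = conversions_b / visitors_b
    p_pool = (conversions_a + conversions_b) / (visitors_a + visitors_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / visitors_a + 1 / visitors_b))
    z = (p_b - p_a) / se
    return 2 * (1 - norm.cdf(abs(z)))

# Example: control converted 120 of 4,000 visitors, the variant 155 of 4,000
print(f"p-value: {ab_test_p_value(120, 4000, 155, 4000):.3f}")  # approx. 0.032
```

A p-value below your chosen threshold (commonly 0.05) suggests the difference is unlikely to be down to chance, provided you’ve also gathered enough traffic over a sensible period.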

Mistake #5: Not leaving a control variant in place

As you start to see improvements in your engagement and conversion rates, it can be tempting to race ahead, implement more changes, and forget about utilising control variants.

However, without a control variant in place, you lose the ability to accurately measure the impact of each change.

Imagine that you want to test two new versions of an interface element. If you test them only against each other, both variants are ‘new’. Without the existing version (i.e. the control) in the mix, how will you know whether either new variant is actually better than the original? It could be that leaving the element unchanged was the best option, but you won’t know unless you include it in the test.
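
As a simple illustration, a test like this could split traffic across the original page and both new variants, rather than just between the two new versions. The variant names and the even three-way split below are assumptions made for the sketch:

```python
# A minimal sketch of bucketing visitors across a control and two new
# variants, rather than testing the two new versions only against each
# other. The variant names and the even three-way split are assumptions.
import random

VARIANTS = {
    "control": 1 / 3,    # the existing version stays in the test
    "variant_a": 1 / 3,  # first new design
    "variant_b": 1 / 3,  # second new design
}

def assign_variant(visitor_id: str) -> str:
    """Deterministically bucket a visitor so they always see the same variant."""
    roll = random.Random(visitor_id).random()   # seeded by visitor ID for consistency
    cumulative = 0.0
    for name, weight in VARIANTS.items():
        cumulative += weight
        if roll < cumulative:
            return name
    return "control"  # fallback for floating-point edge cases

print(assign_variant("visitor-12345"))  # always the same answer for this visitor ID
```

Seeding the assignment with the visitor ID keeps the experience consistent, so a returning visitor always sees the same variant throughout the test.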

Mistake #6: Testing something insignificant

Make sure you’re testing elements that are significant enough to have an impact on your results, given your site’s usual traffic volumes. 

Using a different shade of blue for your button, for example, isn’t likely to have as much of an impact as changing where that button is placed in the overall page layout (unless you’re Google, and you have sufficient traffic for tiny changes to pay off).

If you suspect that what you’re testing isn’t significant, pick your battles more carefully. Review your analytics and identify a more prominent element to test instead.

Mistake #7: Changing too many variables at once

With website A/B testing, it’s important to keep a clear focus on changing one thing at a time.

When you change too many elements at once – even if you keep a control variant in place – it’s impossible to get a clear picture of which change produced which result. Testing several variables simultaneously is the territory of multivariate testing, which shouldn’t be confused with A/B testing. Sweeping changes can also be jarring to users who are already familiar with your interface.

Don’t negatively impact users’ experience whilst you’re trying to improve it. A slow, steady, and measured approach works best, and allows you to fully track your improvements.

Mistake #8: Not accounting for irregular occurrences

Seasonal traffic spikes or other irregular occurrences can skew your results. Take note of anything irregular that might be causing unusual statistics when testing. For instance, if you’re running A/B tests at the same time your website optimisation service is making updates, be aware that changes to the types of traffic you receive could skew the results of your test.

Additionally, testing should take place over comparable periods to produce the most meaningful results. If your organisation experiences seasonal fluctuations, for example, comparing on-site performance measured during a peak period with performance measured during a quiet period will not generate useful insights.

Still not sure how to get more accurate results from your website A/B testing? Get in touch with Sitback’s experienced team for expert guidance.