A/B testing is one of the most effective things you can do to improve user experience, engagement and conversions. It involves testing a single variable with segments of your audience, giving you data directly from your users and customers. It can identify small changes that make a big difference to your results, and help you create the most effective version of your website or email.
It can be tempting to make design decisions based on what looks good or feels right, or to make assumptions about your audience. But what’s best practice for one company might not be right for you. A/B testing gets right down to individual elements, which can be as small as the location of a CTA button, to show you what works best with your audience.
A/B testing is a process consisting of a number of steps. Follow these and you’ll soon be gathering data to make better decisions.
- Analyse your existing data
- Set your goal and parameters (length of test, how significant results have to be)
- Select and create your variable
- Run the test
- Analyse the results
- Implement, if results meet significance threshold
- Test again for a different element
Analyse your existing data
A good grasp of how your website or emails are performing will help you decide which areas need improvement. Use your analytics and other tools, such as heat maps, to see which areas are underperforming and where your users are getting stuck or bouncing away.
Understanding how your pages and campaigns are performing and how users are interacting with your site will give you a solid basis on which to start.
Set goals and parameters
Set your goals on the basis of your analysis of current performance and your company’s most pressing priorities. Is it to increase conversions? Improve dwell time or user engagement? Increase your website traffic? Reduce cart abandonment?
With a clear goal in mind you can move forward to defining the parameters of your test: how long you’ll run the test for, and what level of results you’ll consider significant.
With an email test it’s easy enough to send the two versions to two random, equally sized subsets of your audience. The winning version then goes out to the remaining bulk of your list.
If you’re testing elements on a webpage, you need to decide how long you’ll leave the test version up and what your results threshold will be. If there’s only a slight improvement, you may not consider it significant enough to warrant making a change, as it could be down to random variance.
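A quick way to pin down both parameters up front is to estimate how many users each version needs before a given lift would be detectable; the test length then follows from your traffic. Here’s a minimal sketch in plain Python using the standard two-proportion sample-size formula (the 4% baseline, one-point lift and 0.05/0.8 defaults are illustrative assumptions, not figures from this article):

```python
import math
from statistics import NormalDist

def sample_size_per_variant(baseline_rate, min_detectable_lift,
                            alpha=0.05, power=0.8):
    """Rough sample size per variant for a two-proportion test.

    baseline_rate: your current conversion rate, e.g. 0.04 for 4%
    min_detectable_lift: smallest absolute improvement worth acting on,
        e.g. 0.01 to detect a rise from 4% to 5%
    alpha: significance threshold (0.05 is a common default)
    power: chance of detecting the lift if it really exists
    """
    p1 = baseline_rate
    p2 = baseline_rate + min_detectable_lift
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # two-sided test
    z_beta = NormalDist().inv_cdf(power)
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    n = (z_alpha + z_beta) ** 2 * variance / (p2 - p1) ** 2
    return math.ceil(n)

# A 4% baseline with a one-point minimum lift needs roughly
# 6,700 users per version:
print(sample_size_per_variant(0.04, 0.01))  # -> 6743
```

If, say, the page gets around 1,000 visitors a day split evenly between versions, that’s roughly 13,500 users in total, so the test would need to run for about two weeks before a one-point lift could be called significant.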
Select and create your variable
Make sure you test one element at a time. This is crucial: if you test more than one element, you can’t be sure which one is driving any change, and the results will be muddied.
There are many elements you can test, depending on your goal. These include:
- Headlines
- Email subject lines
- Text
- CTA buttons
- Images, video, audio
- Social proof such as testimonials
- Background colour
Something as simple as the wording or colour of a CTA button might make a big difference to your conversion rate.
Run the test
Once you have created your variable, assign the Control and Challenger versions to two random but equal-sized groups of your users. Run the two versions simultaneously.
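If you’re splitting the audience yourself rather than relying on a tool, one common approach is to hash each user’s id, which gives a random-looking but stable assignment. A minimal sketch (the experiment name and user id are hypothetical):

```python
import hashlib

def assign_variant(user_id, experiment="homepage-cta-test"):
    """Deterministically assign a user to Control or Challenger.

    Hashing the user id together with the experiment name gives a
    stable, roughly 50/50 split: the same user always sees the same
    version, and each experiment gets an independent split.
    """
    key = f"{experiment}:{user_id}".encode()
    bucket = int(hashlib.sha256(key).hexdigest(), 16) % 100
    return "control" if bucket < 50 else "challenger"

print(assign_variant("user-42"))  # same answer on every visit
```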
There are many good A/B testing tools available. This article gives a roundup of the best: https://www.ventureharbour.com/best-a-b-testing-tools/
Many email providers offer built-in testing tools so you can make this part of your regular process. At Wordtracker we often test our subject lines on smaller subgroups before the main send. We use Mailchimp, which offers a number of testing options.
Analyse the results
Focus on the goal you set for the test. Did it improve your goal metric and was the improvement significant enough to implement?
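“Significant enough” has a concrete meaning here: one standard check is a two-proportion z-test on the raw conversion counts. A minimal sketch in plain Python (the conversion numbers are illustrative, not real results):

```python
from statistics import NormalDist

def ab_test_p_value(control_conversions, control_n,
                    challenger_conversions, challenger_n):
    """Two-sided, two-proportion z-test on conversion counts.

    Returns a p-value; below your pre-set threshold (commonly 0.05),
    the difference is unlikely to be down to random variance.
    """
    p1 = control_conversions / control_n
    p2 = challenger_conversions / challenger_n
    pooled = ((control_conversions + challenger_conversions)
              / (control_n + challenger_n))
    se = (pooled * (1 - pooled) * (1 / control_n + 1 / challenger_n)) ** 0.5
    z = (p2 - p1) / se
    return 2 * (1 - NormalDist().cdf(abs(z)))

# Illustrative numbers: 300 conversions from 6,800 control users
# vs 350 from 6,800 challenger users
print(round(ab_test_p_value(300, 6800, 350, 6800), 3))  # -> 0.044
```

In this example the p-value of roughly 0.044 just clears a 0.05 threshold, so the challenger’s lift is probably real rather than noise.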
If your results are inconclusive, it’s not a wasted test. It just means this element isn’t crucial to improving this metric, and that’s useful information in itself.
Implement
Take action informed by your results. Your findings may also apply in the longer term: for example, if you learn that a certain type of CTA button generates better results, you’d use that version in future designs.
Rinse and repeat
Once you’ve completed your A/B test, analysed the results and implemented the necessary steps, it doesn’t stop there. Move on to a different goal and element to test.
A/B testing should be an ongoing process of testing a variable, assessing the results and implementing the winner. Often minor changes can make a surprising difference to your conversion or engagement rates.
If you’re not doing this, you’re effectively leaving money on the table.