Getting started with A/B testing: Part 1 – Preparation

A/B Test Wireframes

Whilst this isn't personally my first foray into A/B testing, it is for my current place of work, so I thought it was an ideal opportunity to document the process. In doing so, I can share best practice, pitfalls and everything in between.

For each part of the process I will write a blog post. First up: preparation.

 

It only takes a minute to set up

This is the claim every A/B testing tool makes, and in theory it could be true: simply add this one line of code to all pages and it's ready to go. The reality is that your web presence will be made up of several sub-domains, some CMS driven, some not. On top of that, you probably don't have access to the code to add that extra line yourself; that will be your IT team, who have their own processes and other priorities.

In short, getting the code onto every page will take time. But that's time you need to utilise…
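For context, that "one line of code" is usually a small script loader dropped into every page. Below is a rough sketch of what it typically does under the hood; the CDN URL and account ID are hypothetical placeholders, not any specific vendor's snippet.

```typescript
// A minimal sketch of what the "one line of code" usually does: load the
// testing tool's script on every page. The URL and account ID below are
// hypothetical placeholders, not a real vendor's snippet.
(function loadTestingSnippet(): void {
  const script = document.createElement("script");
  script.src = "https://cdn.example-testing-tool.com/js/ACCOUNT_ID.js"; // hypothetical
  script.async = true; // fetch without blocking the rest of the page
  document.head.appendChild(script);
})();
```

Note that some vendors recommend loading their snippet synchronously near the top of the head to avoid page flicker when a variation swaps in, so defer to whatever instructions your chosen tool gives you.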

 

Track everything, measure what's important

For an A/B test to be truly measurable you need website objectives/goals to measure against, in addition to the conversion rate of the test in question. Why? Well, your new alternative might beat the original in terms of conversion, yet have a negative effect on the site's key goals (e.g. the A/B test increases downloads, but sales drop). PRWD have written a good post on this – don't be a slave to conversion rate.

NB: Most A/B testing tools can integrate with GA, setting custom variables for the test in question which you can then use to segment goal data or build custom reports. Having this will show you the effect of testing on your key objectives as well as on conversion rate (which is available in the testing tool).
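As a hedged illustration of that GA integration (the exact call depends on whether you are on classic custom variables or Universal Analytics custom dimensions), the sketch below assumes analytics.js is already on the page and that a session-scoped custom dimension has been registered in the GA admin; the test and dimension names are made up.

```typescript
// Hedged sketch: record which variation a visitor saw as a GA custom dimension,
// so goal data can later be segmented by test variation. Assumes Universal
// Analytics (analytics.js) is loaded and "dimension1" has been created in the
// GA admin; all names below are illustrative only.
declare function ga(...args: unknown[]): void;

function reportVariationToGA(
  testName: string,
  variation: "original" | "alternative"
): void {
  ga("set", "dimension1", `${testName}:${variation}`); // e.g. "homepage-cta:alternative"
  // Also fire a non-interaction event so the exposure itself shows up in reports
  // without affecting bounce rate.
  ga("send", "event", "ab-test", testName, variation, { nonInteraction: true });
}

// Typically called from whatever hook your testing tool runs when it buckets a visitor.
reportVariationToGA("homepage-cta", "alternative");
```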

First up, though, is implementing a sound measurement model containing your goals/objectives. So while the developers add your testing code, get your models in place (give Avinash's DMMM post a read to see what to include). From personal experience, getting this agreed takes longer than adding the code!

 

Remember A/B tests are not a toy – you need a reason to test

Sounds obvious, but just because you can test doesn't mean you should test. You need a reason to test – a hypothesis – which you can derive from the insights gleaned from measurement models, user feedback, user research and so on. Unbounce, with Michael Aagaard, talk about this on the new CTA Podcast.

Educate those involved in the tests

There are several things you need to explain to people involved in the tests (either those who help put them together, or those that see the results). Make sure they:

  • Understand the design process and where A/B testing fits in
    There are several good posts on this (Google Ventures' product design sprint is a good one), but in essence A/B tests come at the end of the cycle: a page that has had user research gathered on it, has been through idea generation, now has alternate solutions, has been user tested and is now being “evaluated”.

 

  • Understand that testing (and thus the design process) is a continual process
    I see this so often: the belief that this one A/B test will be a magic wand that shows page X is the best (and solves all our problems). Yes, it might lead you to better conversions, but A/B testing isn't a one-trick pony. A/B tests are about gleaning insights; they form part of the cyclical design process and so are not done as a one-off, but as iterations. Each A/B test will provide more insights (good and bad) which, put together, will slowly but surely give you incremental improvements.

 

  • Understand tests are mainly limited to pages with high footfall
    Pretty logical, but it does stop you from testing just any page. For a test to reach a valid conclusion it needs to be seen by a relatively large group of people. For pages with low traffic there are a number of alternative research methods to use.

 

  • Understand that tests wont be 100% conclusive
    Again a common misconception, normally with senior management, I’ve seen people expect an A/B test to have an outright winner. Maybe this is simply based on the name “A/B” so they think its “A” or its “B”, but the reality is this doesn’t happen. Tests will show the increases/ decreases in page conversions and other goal conversions for each alternative. Both can be positive and both can be negative, what you are after is one better than the other and then learn from the test, make changes and repeat to see if you can make the next design iteration that little bit better. Rinse and repeat…

 

  • Understand duration of tests should be matched to the length of business cycles
    A/B tests, and testing tools, report a metric called statistical significance. In theory this calculation tells you how many users the test needs to reach before the result can be trusted. For some of you that number might be reached in hours or days, so is that the time to stop the test? Nooooooooooooo. Peep Laja has written an excellent post on this issue, including why test duration should be linked to business cycles. There's a rough sketch of the numbers involved after this list.

 

  • Understand that even the best of us (creating the test) can’t predict what the user will prefer
    Whilst “we” have a good knowledge base and best practice at hand, we are not the user. Users can do the most unexpected things because of the situation they operate in: they could be stressed, they might be under time pressure whilst on the webpage, they might need to make an important decision. “We” are not in their situation and so will come to different outcomes, hence the need to test, even if it's just to validate our instinct.
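To make the point about test duration concrete, here is a rough back-of-the-envelope sketch (the traffic and conversion figures are made up, not benchmarks): estimate the visitors needed per variation for the smallest lift you care about, then round the run time up to whole weeks so the test spans complete business cycles.

```typescript
// Back-of-the-envelope sketch: visitors needed per variation for a two-proportion
// z-test, then the run time rounded up to whole weeks so the test covers complete
// business cycles (weekdays and weekends). All numbers below are made-up examples.

function sampleSizePerVariation(
  baselineRate: number,   // current conversion rate, e.g. 0.04 for 4%
  relativeLift: number,   // smallest lift worth detecting, e.g. 0.10 for +10%
  zAlpha = 1.96,          // two-sided 95% confidence
  zBeta = 0.84            // 80% power
): number {
  const p1 = baselineRate;
  const p2 = baselineRate * (1 + relativeLift);
  const variance = p1 * (1 - p1) + p2 * (1 - p2);
  return Math.ceil(((zAlpha + zBeta) ** 2 * variance) / (p1 - p2) ** 2);
}

function testDurationInWeeks(visitorsNeeded: number, dailyVisitorsPerVariation: number): number {
  const days = visitorsNeeded / dailyVisitorsPerVariation;
  return Math.ceil(days / 7); // always run whole weeks, never stop mid-cycle
}

const needed = sampleSizePerVariation(0.04, 0.10); // roughly 39,000+ visitors per variation
const weeks = testDurationInWeeks(needed, 1500);   // assuming 1,500 visitors/day per variation
console.log(`~${needed} visitors per variation, so run for at least ${weeks} full week(s).`);
```

The point isn't the exact formula – it's that a modest baseline and a realistic lift can easily demand tens of thousands of visitors per variation, which is why a tool declaring "significance" after a quiet Tuesday shouldn't end the test.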

 

 

If you haven't already noticed, the other thing to do with your time is to read about other people's experiences. Your tests will be unique, as will the results, but the lessons learnt will be common.

 

