We hope you’re finding the data Formisimo provides useful. Are trends emerging yet? When you know what’s wrong with your form it’s a lot easier to come up with solutions to make it better.
This is lesson 5 and it’s time to put your data and optimisation ideas to use. We’ve got a challenge for you: design and test a new form that increases conversions.
In lesson 5 you’ll learn:
- Why A/B testing is necessary
- How to use your data to guide your optimisation strategy
- The A/B testing toolkit
Why A/B test?
Just because you can see where your users are dropping off doesn’t mean you know the best way to combat this. Indeed, one of the greatest lessons learnt from the practice of A/B testing is that it’s hard to accurately predict the winning variation.
The website Which Test Won revolves around that premise.
Paul Boyce, of Popcorn Metrics, says removing the guesswork is exactly why he loves A/B testing:
“What I love about A/B testing is that you get solid facts so you know what works and what doesn’t and more, it’s liberating – because you don’t need to have all the answers!” – Paul Boyce, Co-founder of Popcorn Metrics, @paulmboyce
How to use your data to guide your optimisation strategy
With the help of lessons 3 and 4 you should be getting used to reading your reports and understanding what problems your form is suffering from. These are the areas you need to improve first because they have a direct link to lost conversions.
For example, a form that receives many visitors on mobile devices but has a 100% failure rate on that device type should alert you to a serious problem with your form on mobile.
Once you know everything that needs to be improved you can get to work tackling your form’s issues. Start with the areas experiencing the most drop-offs, as they offer the biggest potential gains. Kathryn Aragon calls this the ‘low hanging fruit’.
In some cases the fix is straightforward, such as resolving a technical error like a browser or device compatibility issue.
For everything else, you need to come up with hypotheses and test them, e.g. “I think if I introduced a concertinaed design to the form it would convert better.”
The A/B testing toolkit
Tools to gather data
Formisimo is your number one piece of kit for gathering in-depth quantitative data about user behaviour in your forms.
User testing or session replay
Formisimo shows you what is wrong and points you in the direction of why. Actually looking at the issue in the context of a user session can offer greater insight. Watching a user’s behaviour up to the point of encountering an issue, and then observing how they react to it, is extremely insightful.
Tools to plan the test
The data from A/B tests is only useful if it’s reliable. This means sending enough traffic to each variation and allowing enough time for trends to stabilise.
Sample size calculators
Evan’s Awesome A/B Tools
Evan’s Awesome A/B Tools is a set of statistical calculators, developed by Evan Miller.
The most useful for you at this stage is the sample size calculator. This is designed to tell you how many visitors you need in each version of your test, which allows you to predict how long the test will take to run. Input your goals and specify the parameters of the test.
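To give a feel for what these calculators are doing under the hood, here’s a minimal sketch of the standard two-proportion sample-size formula (the helper name and the example rates are our own, for illustration; the calculators may use slightly different assumptions):

```python
from math import ceil, sqrt
from statistics import NormalDist

def sample_size_per_variant(p1, p2, alpha=0.05, power=0.8):
    """Visitors needed in each variant to detect a change in
    conversion rate from p1 to p2 (two-sided test)."""
    z = NormalDist()
    z_alpha = z.inv_cdf(1 - alpha / 2)  # 1.96 at 95% confidence
    z_beta = z.inv_cdf(power)           # 0.84 at 80% power
    p_bar = (p1 + p2) / 2
    n = ((z_alpha * sqrt(2 * p_bar * (1 - p_bar))
          + z_beta * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
         / (p1 - p2) ** 2)
    return ceil(n)

# A 5% baseline conversion rate and a hoped-for lift to 6.5%
# needs roughly 3,800 visitors per variant.
print(sample_size_per_variant(0.05, 0.065))
```

Notice how the required sample size grows rapidly as the expected lift shrinks, which is why subtle changes take so long to test on low-traffic forms.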
The rest of the tools in Evan’s suite let you query your experiment, either hypothetically, to aid planning, or with real data, to help interpret the results.
Optimizely’s Sample Size Calculator
A/B test champions Optimizely also have an easy-to-use sample size calculator.
What is statistical significance?
Statistical significance, or ‘statistical confidence’, is the likelihood that the results of your test aren’t down to random chance. It has nothing to do with whether a variant will achieve higher conversions, only how likely it is that the results are real and not a fluke.
Best practice advice is that you can’t trust a test result with under 95% statistical significance. At that level there’s only a 5% chance that the observed difference is a fluke.
Throughout your test the significance level will fluctuate. A common mistake is stopping the test early, believing a winner has already emerged. In fact, you need to wait until you’ve reached your target number of visitors. 95% significance on only a small amount of data is just as useless as low significance.
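As a rough illustration of what the 95% threshold means in practice, here’s a sketch of the standard pooled two-proportion z-test that significance calculators are based on (the function name and the conversion counts below are made up for the example):

```python
from math import sqrt
from statistics import NormalDist

def p_value(conv_a, visitors_a, conv_b, visitors_b):
    """Two-sided p-value for the difference between two conversion
    rates; statistical confidence is 1 minus this value."""
    rate_a = conv_a / visitors_a
    rate_b = conv_b / visitors_b
    pooled = (conv_a + conv_b) / (visitors_a + visitors_b)
    se = sqrt(pooled * (1 - pooled) * (1 / visitors_a + 1 / visitors_b))
    z = (rate_b - rate_a) / se
    return 2 * (1 - NormalDist().cdf(abs(z)))

# 50/1000 vs 70/1000 conversions looks like a clear win, but the
# p-value is about 0.06 -- just short of the 95% threshold.
p = p_value(50, 1000, 70, 1000)
print(f"confidence: {(1 - p):.1%}")
```

This is why a promising-looking early lead isn’t enough: the same difference in rates only becomes significant once the visitor counts are large enough.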
Both Optimizely and Evan Miller’s Sample Size Calculator are set to a default level of statistical confidence that can be changed. If you choose to lower statistical significance your test will finish more quickly but there will be greater doubt over the winner.
Tools to run your test
Google Analytics Content Experiments
Added a few years ago to replace Google Website Optimizer, Content Experiments is an integrated feature of Google Analytics. It’s useful because you can set up experiments within your Google Analytics account and monitor them there.
This tool doesn’t help you create your variant pages. You must have prepared these before you can set up a new Content Experiment. Instead it’s a tracking tool that gives you live results throughout the duration of your experiment.
Although not compulsory, the option to add goals makes Content Experiments stand out from other A/B testing tools. There are 4 types of goals you can set for your experiment:
- URL destination goals
- Event goals
- Session duration goals
- Pages per session goals
Goals make the success or failure of your experiment more tangible.
The reports generated are in the same style as other reports in Google Analytics (GA), so they’re easy to use for anyone already familiar with the platform.
Optimizely on-page editor
If you’re looking for greater insight than Google can offer then it’s time to look at paid options. Optimizely is a comprehensive service that helps to create multiple variations of your web pages and provides a platform for tracking.
“It used to take days or weeks to set up an online experiment, now this can be done with fewer resources and less time.” Fabian Liebig, Marketing Manager, Optimizely, @FabianLiebig
Optimizely acts like an on-page editor. You see your web page as it is, but can change text and imagery and move elements around.
Click an element to add to, remove, change or rearrange it in the page structure. Editing right on the page means you can see your change in the context of the rest of the design. It’s also much faster than coding changes.
Create as many variations of your web page as you like and begin the experiment at the click of a button.
Unbounce landing page creator
Unbounce is an easy-to-use landing page builder with built-in A/B testing capability. Marketers can produce landing pages without design or development time.
Landing pages are a great place to play around with different designs and copy. Unbounce has lots of templates to choose from. The most successful variant in your A/B test will tell you what works best to convert your customers.
Formisimo and Unbounce integrate with each other too. You need both a Formisimo and an Unbounce account. Follow our instructions on Setting up Formisimo in Unbounce (PDF 800KB).
Test your test – is it working properly?
40% of A/B tests could be marred by coding errors, according to data collected by Craig Sullivan.
Close results between your variants might be a sign that your test isn’t working properly. If there are no errors in the test setup, then it’s likely that:
- Traffic to your form is too low, resulting in too few conversions or
- The difference between variants A and B isn’t influential enough to affect conversions.
You need to achieve close to your recommended sample size for significant results. Read Rich Page’s article How to Test and Improve Your Website If Your Traffic is Too Low for A/B Testing.
Copywriter Jen Havice has written about her experience of A/B testing. Her results were inconclusive. She puts this down to two things: low traffic and subtle changes, concluding that more dramatic changes could have separated the variants.
Know someone who could benefit from this lesson? Feel free to share it.