A/B testing - Why and how?
[caption id="attachment_1920" align="alignright" width="233"]
A/B testing flowchart[/caption]
A/B testing is a way to decide experimentally which of two designs is more effective at driving visitors toward a specified goal. In its simplest form it divides the visitor flow into two groups, sends one group to each design, and measures which design has the higher conversion rate.
Most often it's done against an already existing page (A), which acts as the control, with (B) being the variant design to be tested.
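To make the split-and-measure idea concrete, here is a minimal sketch in TypeScript - simulated visitors and made-up conversion probabilities, not data from any real site - that sends half the traffic to each design and compares the resulting conversion rates:

```typescript
// Minimal sketch, not taken from any real experiment: simulate a 50/50 split
// of visitors between control (A) and variant (B) and compare conversion rates.
type Variant = "A" | "B";

interface Tally { visitors: number; conversions: number }

const tallies: Record<Variant, Tally> = {
  A: { visitors: 0, conversions: 0 },
  B: { visitors: 0, conversions: 0 },
};

// Hypothetical underlying conversion probabilities, purely for the simulation.
const trueRate: Record<Variant, number> = { A: 0.05, B: 0.07 };

for (let i = 0; i < 10_000; i++) {
  const variant: Variant = Math.random() < 0.5 ? "A" : "B"; // the 50/50 split
  tallies[variant].visitors += 1;
  if (Math.random() < trueRate[variant]) tallies[variant].conversions += 1;
}

(["A", "B"] as const).forEach((v) => {
  const { visitors, conversions } = tallies[v];
  console.log(`${v}: ${((100 * conversions) / visitors).toFixed(2)} % conversion`);
});
```

In a real experiment the testing tool handles the assignment and the measurement; the point is only that the winner is whichever design converts a larger share of its group.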
In Google Analytics the process for setting up an A/B test is fairly simple, and it also presents the results in a straightforward way.
Of course the correct way to do this is to document it as a case for the blog, so here goes: a quick guide on how to A/B test in Google Analytics.
[caption id="attachment_1903" align="alignright" width="300"]
"Above the fold" wireframes of the variant pages used for the A/B testing in Google Analytics[/caption]
The first thing is always to figure out just what to test and why.
To keep the experiment simple, I chose to see whether the time on site (TOS) for a minor site I run on the side could be increased by showing more content and a few more links on the front page than before.
This is actually just as simple as logging into Google Analytics, choosing Experiments in the dashboard, and following the instructions.

First and foremost: the preparation
[caption id="attachment_1903" align="alignright" width="300"]

- Goal: increase TOS.
- Means: a redesigned front page.
Setting up the A/B test in Google Analytics

- Choose the address you want to experiment on.
- Choose the alternative pages you want in the experiment - up to nine variants can be added, but for smaller sites I'd recommend at most two or three, since the results depend on how many people take part in the experiment. More visitors means a more definite result.
- Set your goals - define the experiment objective and what share of the visitors will be redirected to the alternate page (I created a goal of a time on site one minute longer than the current one, and sent half the audience to the alternate page).
- Get the experiment code from Google Analytics and insert it into the control page, where Google Analytics will verify it, and you're just about done. A conceptual sketch of what this split does follows right after this list.
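As a rough illustration of what that 50/50 split amounts to - this is emphatically not the snippet Google Analytics generates for you, just a hand-rolled sketch with a hypothetical variant URL - sending half the visitors from the control page to the variant could look like this:

```typescript
// Conceptual sketch only - the real snippet is generated by Google Analytics
// and pasted into the control page unchanged. This just illustrates the idea
// of sending a fixed share of visitors to the variant page and keeping
// returning visitors in the same group via a cookie.
const VARIANT_URL = "https://example.com/variant-b/"; // hypothetical address
const SHARE_TO_VARIANT = 0.5;                          // half the audience

function assignGroup(): "control" | "variant" {
  const match = document.cookie.match(/ab_group=(control|variant)/);
  if (match) return match[1] as "control" | "variant";
  const group = Math.random() < SHARE_TO_VARIANT ? "variant" : "control";
  document.cookie = `ab_group=${group}; path=/; max-age=${60 * 60 * 24 * 30}`;
  return group;
}

if (assignGroup() === "variant") {
  window.location.replace(VARIANT_URL); // send this visitor to design B
}
```

The snippet from Google Analytics handles the assignment and the reporting for you; the sketch above only shows the splitting idea.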
And now… we wait
One of the key elements of testing is the patience to let the data accumulate. The winning design is decided by the statistics, and the usual minimum running time for an A/B test in Google Analytics is two weeks, but depending on site traffic it can show indications of the final result after just a few days. When the experiment has run its course I will do a follow-up with tips on how to interpret the results, and on other data from the testing you can use to gain further insights.
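For a rough sense of how the statistics pick a winner, here is a sketch of a simple two-sample z-test on mean time on site; the summary numbers are invented for the example, not results from this experiment:

```typescript
// Sketch of the kind of question the statistics answer: is the difference in
// mean time on site between control (A) and variant (B) larger than chance
// alone would explain? The summary numbers below are invented for the example.
interface Summary { n: number; mean: number; sd: number } // time on site, seconds

const control: Summary = { n: 900, mean: 95, sd: 60 };
const variant: Summary = { n: 880, mean: 112, sd: 70 };

const diff = variant.mean - control.mean;
const stdError = Math.sqrt(control.sd ** 2 / control.n + variant.sd ** 2 / variant.n);
const z = diff / stdError;

console.log(`difference: ${diff.toFixed(1)} s, z = ${z.toFixed(2)}`);
console.log(
  z > 1.96
    ? "variant looks better at roughly 95 % confidence"
    : "not enough evidence yet - keep the experiment running",
);
```

Google Analytics handles this kind of bookkeeping for you (its actual method is more sophisticated than this sketch); the takeaway is simply that small differences need a lot of visitors before they mean anything, which is why the test has to keep running.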