What is A/B Testing?

A/B testing is essentially an experiment in which two or more variants of a page are shown to users at random.
Burcu Binici
5 min. read

A/B testing (also known as split testing or bucket testing) is a method of comparing two versions of a web page or application against each other to determine which performs better. A/B testing is essentially an experiment in which two or more variants of a page are shown to users at random, and statistical analysis is used to determine which variation performs better for a given conversion goal.

Running an A/B test that directly compares a variation against the current experience lets you ask focused questions about changes to your website or app, and then collect data on the impact of those changes.

A/B testing takes the guesswork out of website optimisation and enables data-informed decisions that shift business conversations from "we think" to "we know". By measuring the impact that changes have on your metrics, you can ensure that every change produces positive results.

How Does A/B Testing Work?

In an A/B test, you take a web page or application screen and modify it to create a second version of the same page. This change can be as simple as a single headline or button, or it can be a complete redesign of the page. Then, half of your traffic is shown the original version of the page (known as the control) and half is shown the modified version of the page (the variation).

As visitors are served either the control or the variation, their engagement with each experience is measured, collected on an analytics dashboard, and analysed with a statistical engine. You can then determine whether changing the experience had a positive, negative, or neutral effect on visitor behaviour.
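
To make the traffic split concrete, here is a minimal sketch (in Python, not any specific vendor's implementation) of how a testing tool might assign visitors deterministically; the experiment name and visitor IDs are made up for illustration. Hashing a stable visitor ID means a returning visitor always sees the same version, which keeps the measurements consistent.

```python
import hashlib

def assign_variant(visitor_id: str, experiment: str = "homepage-headline") -> str:
    """Deterministically assign a visitor to 'control' or 'variation' (50/50)."""
    # Hash the experiment name together with the visitor ID so that
    # different experiments produce independent assignments.
    digest = hashlib.md5(f"{experiment}:{visitor_id}".encode()).hexdigest()
    bucket = int(digest, 16) % 100  # map the hash onto buckets 0-99
    return "control" if bucket < 50 else "variation"

if __name__ == "__main__":
    for vid in ["user-1", "user-2", "user-3"]:
        # The same visitor ID always lands in the same bucket.
        print(vid, "->", assign_variant(vid))
```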

Why Should You Do A/B Testing?

A/B testing allows individuals, teams, and companies to make careful changes to their user experience while collecting data on the results. This lets them construct hypotheses and learn why certain elements of their experience influence user behaviour. In other words, their opinions about the best experience for a given goal can be proven wrong through A/B testing.

More than answering a one-off question or settling a disagreement, A/B testing can be used to continuously improve a given experience over time, optimising it against a single goal such as conversion rate.

For example, a B2B technology company may want to improve the quality and volume of sales leads generated by campaign landing pages. To achieve that goal, the team would use A/B testing to experiment with changes to the headline, imagery, form fields, call to action, and overall layout of the page.

Testing one change at a time helps pinpoint which changes affect visitor behaviour and which do not. Over time, the team can combine the effect of multiple winning changes from its experiments to demonstrate the measurable improvement of the new experience over the old one.

This method of introducing changes to a user experience ensures that the experience is optimised for the desired outcome and can make key steps in a marketing campaign more effective.

By testing the ad copy, marketers can learn which version gets more clicks. By testing the subsequent landing page, they can learn which layout best converts visitors into customers. If the elements of each step work as efficiently as possible to win new customers, the overall expenditure on a marketing campaign can be reduced.

A/B testing can also be used by product developers and designers to demonstrate the impact of new features or changes to the user experience. As long as goals are clearly defined and you have a clear hypothesis, user engagement, modals, and in-product experiences can all be optimised with A/B testing.

The A/B Testing Process

Below is an A/B testing framework you can use to start running tests:

  • Data Collection: Your analytics will often provide insight into where you can begin optimising. It helps to start with high-traffic areas of your site or app, as this allows you to collect data faster. Look for pages with low conversion rates or high drop-off rates that can be improved.
  • Setting Goals: Your conversion goals are the metrics you use to determine whether the variation is more successful than the original version. Goals can be anything from clicking a button or link to product purchases and email signups.
  • Formulate a Hypothesis: Once you have set goals, you can begin generating A/B testing ideas and hypotheses for why you think they will perform better than the current version. Once you have a list of ideas, prioritise them by expected impact and difficulty of implementation.
  • Create Variations: Using your A/B testing software (such as Optimizely), make the desired changes to an element of your website or mobile app experience. This could be changing the colour of a button, swapping the order of elements on the page, hiding navigation elements, or something entirely custom. Many leading A/B testing tools have a visual editor that makes these changes easy. Be sure to QA your experiment to make sure it works as expected.
  • Run the Experiment: Launch your experiment and wait for visitors to take part. At this point, visitors to your site or app are randomly assigned to either the control or the variation of your experience. Their interaction with each experience is measured, counted, and compared to determine how each performs.
  • Analyse the Results: Once your experiment is complete, it is time to analyse the results. Your A/B testing software will present the data from the experiment and show you the difference between how the two versions of your page performed, and whether that difference is statistically significant (a minimal sketch of such a check follows this list).
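
For illustration, here is a minimal sketch of the kind of significance check described in the last step, using a standard two-proportion z-test. The visitor and conversion counts below are made-up numbers, and real testing tools typically apply more sophisticated statistical engines (for example, sequential testing).

```python
from math import erf, sqrt

def two_proportion_z_test(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Compare conversion rates of control (a) and variation (b)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)            # pooled conversion rate
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se                                 # standardised difference
    # Two-tailed p-value from the normal CDF: Phi(x) = 0.5 * (1 + erf(x / sqrt(2)))
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical experiment: 4% vs 5% conversion over 5,000 visitors each.
z, p = two_proportion_z_test(conv_a=200, n_a=5000, conv_b=250, n_b=5000)
print(f"z = {z:.2f}, p = {p:.4f}")  # p < 0.05 suggests a significant difference
```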

If your variation is a winner, congratulations! See if you can apply the learnings from the experiment to other pages of your site and keep trying to improve your results. Don't worry if your experiment produces a negative result or no results. Use the experiment as a learning experience and create new hypotheses that you can test.

Whatever the outcome of your experiment, use what you learned to inform future tests and continually iterate on optimising the experience of your app or site.

A/B Testing and SEO

Google permits and encourages A/B testing, and has stated that performing an A/B or multivariate test poses no inherent risk to your website's search ranking. However, it is possible to jeopardise your search ranking by abusing an A/B testing tool for purposes such as cloaking. Google has provided some recommendations to ensure that this does not happen:

  • No Cloaking - Cloaking is the practice of showing search engines different content than a typical visitor would see. Cloaking can cause your site to be demoted or even removed from search results. To prevent cloaking, never use visitor segmentation to display different content to Googlebot based on user agent or IP address.
  • Use rel="canonical" - If you run a split test with multiple URLs, use the rel="canonical" link attribute to point the variations back to the original version of the page. Doing so helps prevent Googlebot from getting confused by multiple versions of the same page.
  • Use 302 Redirects Instead of 301s - If you run a test that redirects the original URL to a variation URL, use a 302 (temporary) redirect rather than a 301 (permanent) redirect. This tells search engines such as Google that the redirect is temporary and that the original URL, not the test URL, should remain indexed (see the sketch after this list).
  • Run Tests Only as Long as Necessary - Running tests longer than necessary can be seen as an attempt to deceive search engines, especially if you serve one variation of your page to a large percentage of users. Google recommends updating your site as soon as a test is complete, removing all test variations, and avoiding unnecessarily long tests.
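
As an illustration of the canonical and redirect recommendations above, here is a minimal sketch assuming a Python Flask app and made-up URLs; it is one way to express these two ideas, not a Google-endorsed or required implementation.

```python
from flask import Flask, redirect

app = Flask(__name__)

@app.route("/landing")
def landing():
    # 302 (temporary), not 301 (permanent): this tells crawlers the
    # original URL should remain indexed while the test runs.
    return redirect("/landing-variation", code=302)

@app.route("/landing-variation")
def landing_variation():
    # rel="canonical" points the variation back to the original page,
    # so Googlebot is not confused by multiple versions of the same page.
    return """<html><head>
      <link rel="canonical" href="https://example.com/landing">
    </head><body>Variation content</body></html>"""
```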
