A/B testing

A/B testing compares two versions of a webpage, email campaign, or scenario step to determine which performs best. By showing different variants to your customers, you can identify the most effective version with data-backed evidence. Exponea offers A/B testing in scenarios, web layers, and experiments. A customer is assigned a variant instantly upon reaching an A/B split node in a scenario, or when matching the display conditions of a web layer, experiment, or email campaign.

There are two types of A/B tests: automatic winner distribution and custom distribution. The former lets Exponea determine the more effective variant and automatically run it for most of the audience; the latter lets you manually specify the share of the audience that each variant is shown to.

NOTE: Automatic winner distribution will only be available after the release in late February. Until then, custom distribution is the only available option.

Automatic winner distribution

Automatic winner distribution tests your variants on a small share of your target audience, determines which variant is more effective, and then sends only the winning variant to the rest of the audience.

If the success rates of the variants are equivalent, the remainder of the campaign is sent in the original ratio.

The winning variant is chosen based on the metric and timeframe (usually hours) selected in Winner determination. The metric can be conversion rate, click rate, open rate, or any custom metric (event).
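Conceptually, the winner-selection step can be sketched as follows. This is an illustrative Python sketch, not Exponea's actual implementation; the `min_lift` equivalence threshold is a hypothetical stand-in for however the platform decides that two rates are "equivalent":

```python
def choose_winner(results, min_lift=0.0):
    """Pick the variant with the highest conversion rate.

    results: {variant_name: (recipients, conversions)}
    Returns the winning variant name, or None when the top two rates
    are too close to call -- in which case the remainder of the
    campaign keeps the original split.
    """
    # Compute each variant's conversion rate and sort best-first.
    rates = sorted(
        ((conversions / recipients, name)
         for name, (recipients, conversions) in results.items()),
        reverse=True,
    )
    (best_rate, best_name), (second_rate, _) = rates[0], rates[1]
    if best_rate - second_rate <= min_lift:
        return None  # rates equivalent: no single winner
    return best_name
```

For example, with variant A converting 50/1000 and variant B converting 80/1000 in the test sample, the sketch above would send B to the rest of the audience.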

Automatic winner distribution is available both in scenarios and in email campaigns.

Custom distribution

Custom distribution allows you to manually specify what percentage of your total audience each variant is shown to. To evaluate the success of your A/B test, you can either use the auto-evaluation dashboard or do it manually following the A/B test basic evaluation guide.
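The custom split itself amounts to weighted random assignment. A minimal Python sketch of the idea (illustrative only; the function and parameter names are assumptions, not Exponea's API):

```python
import random

def assign_variant(weights, rng=random):
    """Assign a variant according to custom percentage weights.

    weights: {variant_name: percentage}, with percentages summing to 100.
    rng: any object with a `choices` method (e.g. a seeded random.Random).
    """
    names = list(weights)
    # random.choices draws one name with probability proportional to its weight.
    return rng.choices(names, weights=[weights[n] for n in names], k=1)[0]
```

With an 80/20 split, roughly 80% of assignments land on the first variant over a large audience.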

Calculating statistical significance

You can calculate the performance and statistical significance of your AB test with our AB Test Significance Calculator.
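If you prefer to compute it yourself, conversion-rate significance is commonly assessed with a two-proportion z-test. A minimal Python sketch of that standard method (not the calculator's actual internals):

```python
from math import sqrt, erf

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test for conversion rates.

    conv_a, conv_b: conversions in variants A and B
    n_a, n_b: audience sizes of variants A and B
    Returns (z, two-sided p-value). p < 0.05 is the usual
    threshold for calling the difference significant.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled rate under the null hypothesis that both rates are equal.
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value
```

For example, 50/1000 conversions versus 80/1000 yields z ≈ 2.7, which is significant at the 5% level.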

Control group in web layers

By default, the control group does not show any variant. If any custom conditions are specified in the JS code, it is important to create a custom control group with the same conditions, even though it does not show an actual web layer. Otherwise, the compared groups would not be homogeneous and the evaluation would be inaccurate.

Auto-evaluation dashboard

A/B tests also include an auto-evaluation dashboard. The report covers events that occur within the evaluation window, and the uplifts compare each variant to the average performance of all variants in the sample.

Technical notes about AB tests

You will not be able to edit any variant that has already been visited by a customer.

  • When going through the same A/B split in a scenario, a customer is assigned the same variant as before.
  • In web layers, each customer's assigned variant is recorded in the “ab test” event, and the customer keeps the same variant on subsequent visits.
  • A customer can be assigned to more than one variant if they visit the website from different browsers or devices. In that case, the system does not know it is the same customer until they are identified (through a login, for example), and may therefore assign a different variant per visit.
  • For correct A/B test evaluation, all groups should be identical in conditions except for the variant.
  • Web layers offer automatic A/B testing, which preferentially serves the best-performing variant based on the chosen goal.
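The sticky, per-customer assignment described in the notes above is commonly implemented by hashing a stable customer identifier together with the test identifier. This also explains why an unidentified customer on a second device can land in a different variant: a different cookie ID hashes to a different bucket. An illustrative sketch of the technique, not Exponea's implementation:

```python
import hashlib

def sticky_variant(customer_id, test_id, variants):
    """Deterministically map a customer to a variant.

    The same (customer_id, test_id) pair always yields the same
    variant, so repeat visits on the same identified profile see
    a consistent experience.
    """
    # Hash the identifiers into a large integer, then bucket by variant count.
    digest = hashlib.sha256(f"{test_id}:{customer_id}".encode()).hexdigest()
    return variants[int(digest, 16) % len(variants)]
```

Note that an anonymous cookie ID and the logged-in customer ID are different inputs here, so they may fall into different buckets until the profiles are merged.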

A/B test event attributes

You can read about the “ab test” event attributes in the System events article.
