A/B/n Testing app

Magnolia A/B/n Testing is currently available as a Beta only.

This page explains how to set up and run a single A/B/n test from your Magnolia instance.

To start and run an A/B/n test in Magnolia, you must have both the A/B/n Testing module and the Analytics module installed.

Creating a new test

When you create a new test, you are guided through the process in the app. Each step has a dedicated tab where you:

  • Set up the test

  • Define goals

  • Create variants

  • Select audience and allocate traffic

  • View test results

Once you have created and started your test, the app shows where you are in the process and how much time remains before the results become significant. The run time estimate takes the estimated traffic on the site into account.

At any point during the creation of the test, you can save it and carry on later. You can always see the test status in the A/B/n Testing app.

Basic setup

Before you start the test or create a variant, make sure the original page is published.

To set up a new test:

  1. In the A/B/n Testing app, click the Create new test action.

  2. In the Setup tab, enter a Name and Description for your test.

  3. Select the Original page on which the test is based.

  4. Use the sliders to choose the:

    • Base conversion

    • Uplift

    • Confidence level


While an original page is being used in a test, the status of the test is displayed in the Pages app. If necessary, you can continue working on the original page while the test is running. When the test completes and you must decide which variant to use, Magnolia informs you if the original has changed since the test began.

What is base conversion, uplift, and confidence level?

Base conversion is the existing conversion rate of the control variant (variant A).

Uplift is the target improvement (%) relative to the base conversion.

Confidence level is the likelihood (%) that the uplift reported by the test will be correct. 95% is a typical choice.

Use your left and right arrow keys to move the sliders.
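Together, these three settings determine how many visitors the test needs before a result can be called significant. The relationship can be sketched with the standard two-proportion sample-size approximation (illustrative only, not necessarily Magnolia's exact formula; the 80 % power default is an assumption):

```python
import math
from statistics import NormalDist

def sample_size_per_variant(base_conversion, uplift, confidence, power=0.8):
    """Approximate visitors needed per variant (two-proportion z-test sketch).

    base_conversion: existing conversion rate of variant A, e.g. 0.05
    uplift: relative improvement to detect, e.g. 0.20 for +20 %
    confidence: e.g. 0.95; power is an assumed default, not a Magnolia setting
    """
    p1 = base_conversion
    p2 = p1 * (1 + uplift)                                    # target rate for variant B
    z_alpha = NormalDist().inv_cdf(1 - (1 - confidence) / 2)  # two-sided critical value
    z_beta = NormalDist().inv_cdf(power)
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return math.ceil((z_alpha + z_beta) ** 2 * variance / (p2 - p1) ** 2)
```

Note how the inputs interact: a smaller uplift or a higher confidence level both increase the required sample size, which is why the sliders directly affect the estimated run time.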

Once you have completed the basic setup, go to the next tab to define a goal for the test.


Defining goals

To define a goal for your test:

  1. In the Goals tab, enter a name for your goal.

  2. Select the Click rates goal from the Event Type dropdown menu.

    The only currently available event type is Click rates.

Once you have defined a goal for your test, you can move on to the next tab to create the variants to be tested against the original page (variant A) and select the click targets.


Creating variants

To create the variants for your test and define their click targets:

  1. Click the Variants tab.

  2. In the Variants tab, click Add variant.

  3. Provide a brief description and Save your variant.

When you create your first variant, it is named variant B. The original page is variant A. Both are listed in the Variants tab.

  4. Select your newly created variant and click Edit variant.

  5. Edit your variant as required.

  6. Select your variant again and click Add click target.

    Add click target

  7. Use the Open page button to select a click target on a page.

  8. Choose your click target and confirm your selection.

In this example, an additional sentence has been added to variant B compared to the original variant A:


Once you have created your variants and selected the click targets, move on to the Audience & Allocation tab.

Audience and allocation

In the Audience & Allocation tab, you select your targeted segments (such as demographics, region, and so on). You also allocate how much site traffic is dedicated to the variants by moving the slider to your desired traffic percentage.

Audience and allocation tab
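Conceptually, traffic allocation means each incoming visitor is deterministically assigned either to the test (and then to a variant) or left on the unmodified original. A minimal sketch of hash-based bucketing (the function and scheme are illustrative assumptions, not Magnolia's actual mechanism):

```python
import hashlib

def assign_variant(visitor_id, test_id, traffic_pct, variants=("A", "B")):
    """Hypothetical deterministic bucketing sketch.

    traffic_pct: share of site traffic dedicated to the test, 0-100.
    Returns a variant name, or None if the visitor is outside the test.
    """
    # Hash visitor + test so the same visitor always lands in the same bucket.
    h = int(hashlib.sha256(f"{test_id}:{visitor_id}".encode()).hexdigest(), 16)
    bucket = h % 10000  # 10,000 buckets give 0.01 % allocation granularity
    if bucket >= traffic_pct * 100:
        return None  # visitor sees the original page, outside the test
    return variants[bucket % len(variants)]  # even split among variants
```

Deterministic assignment matters: a returning visitor must keep seeing the same variant, or the measured click rates would be meaningless.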

Calculating run time

You can calculate the estimated run time (based on estimated site visits) in the A/B/n Testing app. The more traffic your site gets, the faster tests complete.

  1. Click Calculate run time at the bottom of the testing page.

  2. Enter the estimated site visitors per day.

    Typically, you can estimate this based on your site analytics.

  3. Click Calculate.

    The estimated run time is displayed.


If the estimated run time is shown as unknown, the estimate falls outside the configured threshold. You can configure this threshold in the config.yaml file under resources. See the Configuring section on the A/B/n Testing module page for more details.
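The estimate itself is simple arithmetic: the total number of visitors the test needs, divided by the visitors per day that the allocation sends into the test. A rough sketch (the function and formula are illustrative assumptions, not Magnolia's internal calculation):

```python
import math

def estimated_run_time_days(sample_size_per_variant, num_variants,
                            visitors_per_day, traffic_pct):
    """Rough run-time estimate.

    traffic_pct: share of daily site traffic allocated to the test, 0-100.
    """
    total_needed = sample_size_per_variant * num_variants
    daily_in_test = visitors_per_day * traffic_pct / 100
    return math.ceil(total_needed / daily_in_test)

# e.g. 8,000 visitors per variant, 2 variants, 2,000 visitors/day, 100 % allocation
estimated_run_time_days(8000, 2, 2000, 100)  # 8 days
```

Halving the traffic allocation doubles the estimated run time, which is the trade-off to weigh when deciding how much traffic to dedicate to the test.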

Starting a test

After you have selected your audience and allocation, start the test with the Start test button.

You can start a test immediately or schedule it to start later.

After it starts, a test runs until you complete it manually or until it reaches significant results.

Start test button

Test status

Once you have finished setting up your test and started running it, you can see its status in the A/B/n Testing app Status column. The test status is also displayed in the Pages app A/B/n test status column next to the original page selected.

Status Description

Not started
Your test has been created but not started. It may still require some setup.

Running
Your test is currently running. An estimate of the remaining time is displayed.

Paused
You have paused your test.

Aborted
You have cancelled your test. Test data is deleted when the test is aborted.

Completed
You completed your test, either manually or because you reached a significant result.

Viewing results and completing tests

You receive results when a significant result has been reached, or when you complete the test manually.

Significant results

A result is significant when at least one of your variants has met the uplift target, relative to the base conversion, at the preconfigured confidence level.


When significant results are reached, a green icon appears next to the Status of the test as well as a message letting you know that significant results have been reached.

To see the results, click the results icon for your test.
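Under the hood, "significant" typically means the observed uplift is unlikely to be chance at the chosen confidence level. A one-sided two-proportion z-test is a common way to check this (a sketch under that assumption; Magnolia's actual statistical method is not documented here):

```python
import math
from statistics import NormalDist

def is_significant(clicks_a, visitors_a, clicks_b, visitors_b, confidence=0.95):
    """One-sided two-proportion z-test sketch: is variant B's click rate
    significantly higher than variant A's at the given confidence level?"""
    p_a = clicks_a / visitors_a
    p_b = clicks_b / visitors_b
    p_pool = (clicks_a + clicks_b) / (visitors_a + visitors_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / visitors_a + 1 / visitors_b))
    z = (p_b - p_a) / se
    return NormalDist().cdf(z) >= confidence
```

With few visitors even a large observed uplift fails this check, which is why a test keeps running until enough traffic has accumulated.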


Completing the test

  1. Select your test.

  2. Click Complete test on the bottom ribbon.

  3. Select a variant if you want to replace the original.

  4. Click Complete.

  5. View your results.

If significant results have not been reached based on the configured base conversion, uplift, and confidence level settings, no results will be available.

Complete test dialog

If the original page selected in the test has been edited while the test was running, a message is displayed in the Complete test dialog.


You can access your results in either:

  • The Results tab in your test.

Results tab

  • The Results icon on the All tests page.

Results icon

The charts provide an overview of the metrics related to your test.

Test results
