Create A/B native experiments

Maximize your native ad performance by A/B testing different native styles. Test visual elements and other updates across two styles to see which performs better before committing to a change.

How it works

The original style that you want to test against a new design is the "control." The new design is the "experiment" style. You update the experiment’s settings in an attempt to improve performance compared to the control style. You can then analyze the two styles’ performance and determine which settings you want to keep.

Native experiments can only compare two native styles on existing native placements. You can’t compare banner and native ads in the same ad placement.

If an experiment targets a control native style that mixes both programmatic and traditional traffic, your reservation traffic will be affected.

Run an experiment

  1. Sign in to Google Ad Manager.
  2. Click Delivery, then Native.
  3. From the table, click a native style that meets both of these requirements:
    • Has a value of "Native content ad," "Native app install ad," "Native video content ad," or "Native video app install ad" in the "Format" column.
    • Has a value of "Programmatic & traditional" in the "Deal eligibility" column.
  4. From the "Style your native ad" page, click Create A/B experiment.
    The "Run A/B experiment" settings appear in the right panel.
  5. From the date dropdown, select when the experiment will run.
  6. Under "Traffic allocation," enter the percentage of impressions to allocate to the experiment style during the experiment. The rest will go to the control style.
    • For example, if you allocate 60% of impressions to the experiment style, the control style will get the remaining 40%.
    • Enter 50% for an equal allocation of impressions between the experiment and control styles.
  7. Click Experiment and update the experiment style.
    Apply the changes you think might make the resulting native ads perform better.
  8. Click Continue, make any needed targeting changes, and click Save and finish.
You can also run a manual native ad style experiment from the Experiments page.
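
To make the traffic allocation from step 6 concrete, here is a purely illustrative Python sketch of a per-impression split. Ad Manager's actual serving logic is not public, so the function name and parameter below are assumptions, not the real implementation.

    import random

    def pick_style(experiment_allocation_pct: float) -> str:
        # Route one impression to the experiment or control style.
        # `experiment_allocation_pct` stands in for the "Traffic
        # allocation" value entered in step 6 (hypothetical name).
        if random.random() * 100 < experiment_allocation_pct:
            return "experiment"
        return "control"

    # With a 60% allocation, roughly 60% of impressions go to the
    # experiment style and the remaining 40% to the control style.
    styles = [pick_style(60.0) for _ in range(100_000)]
    print(styles.count("experiment") / len(styles))  # ~0.60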

Analyze your experiment and take action

After the experiment has run for two days, the system should have enough data. You can then review the results and decide whether to switch to the experiment settings.

  1. Click Delivery, then Native.
  2. From the table, click the native style that is running an experiment.
    All such native styles have an "Experiment running" label in their row.
  3. On the "Style your native ad" page that appears, click View experiment on the right side of the page.
  4. (Optional) If the experiment is still running, pause it by expanding the "Running" dropdown and clicking Pause.
    When you pause an experiment, 100% of traffic will go to the control (original) native style.
  5. (Optional) Click Preview styles to see what a resulting ad would look like from each native style.
  6. Review the data to see how the experiment style is performing compared to the control (original) style.
    Keep the traffic allocation in mind when analyzing the results; the allocation appears in the lower-left corner. A sketch of an allocation-aware comparison follows these steps.
  7. After the experiment ends, all traffic is allocated to the control (original) native style. At any time, you can choose to apply the experiment settings or keep the control (original) style’s settings.
    • Apply variation: The control native style is updated to match the experiment style.
    • Decline variation: The control (original) native style retains its settings.
    After you click either of these, the control (original) style receives 100% of traffic allocation, and the experiment native style is deleted.
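
Because an unequal allocation skews raw totals, per-impression rates are the fairer basis for comparison. The Python sketch below uses entirely hypothetical numbers to show how a "+/-% of control" comparison works out on eCPM under a 60/40 split.

    def pct_of_control(experiment_value: float, control_value: float) -> float:
        # The "+/-% of control" comparison applied to any experiment metric.
        return (experiment_value - control_value) / control_value * 100

    # Hypothetical results from a 60/40 experiment/control allocation.
    # The experiment's raw revenue is higher partly because it served
    # more impressions, so compare per-impression rates (eCPM) instead.
    experiment = {"revenue": 6600.0, "impressions": 600_000}
    control = {"revenue": 4000.0, "impressions": 400_000}

    exp_ecpm = experiment["revenue"] / experiment["impressions"] * 1000  # 11.0
    ctl_ecpm = control["revenue"] / control["impressions"] * 1000        # 10.0
    print(f"{pct_of_control(exp_ecpm, ctl_ecpm):+.1f}% of control")      # +10.0%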

Understand experiment results

Experiments display the following metrics along with a "+/-% of control" value, which helps you compare performance between the experiment and control native styles. The formulas are transcribed in the sketch after this list.

Example
An "Experiment revenue" of "$10,000 / +10.0% of control" means the experiment style is estimated to receive $10,000 in revenue, which is 10% higher than the estimated revenue for the control (original) style.
  • Experiment revenue
    Net revenue generated from Ad impressions served (with adjustments for Ad Spam and other factors). This amount is an estimate and subject to change when your earnings are verified for accuracy at the end of every month.
  • Experiment eCPM
    Ad revenue per thousand Ad impressions.
    Experiment eCPM = Revenue / Ad impressions * 1000
  • Experiment CTR
    For standard ads, your ad clickthrough rate (CTR) is the number of ad clicks divided by the number of individual ad impressions expressed as a percentage.
    Experiment CTR = Clicks / Ad impressions * 100
  • Experiment coverage
    The percentage of ads returned compared to the number of ads requested.
Experiment coverage = Matched requests / Ad requests * 100
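
As a quick reference, here is a minimal Python transcription of the three formulas above. The input counts are hypothetical; Revenue, Clicks, Matched requests, and Ad requests come from your experiment report.

    def ecpm(revenue: float, ad_impressions: int) -> float:
        # Experiment eCPM: revenue per thousand ad impressions.
        return revenue / ad_impressions * 1000

    def ctr(clicks: int, ad_impressions: int) -> float:
        # Experiment CTR: ad clicks per ad impression, as a percentage.
        return clicks / ad_impressions * 100

    def coverage(matched_requests: int, ad_requests: int) -> float:
        # Experiment coverage: ads returned per ads requested, as a percentage.
        return matched_requests / ad_requests * 100

    # Hypothetical counts for an experiment style.
    print(ecpm(10_000.0, 2_000_000))       # 5.0 (dollars per thousand impressions)
    print(ctr(15_000, 2_000_000))          # 0.75 (%)
    print(coverage(1_900_000, 2_000_000))  # 95.0 (%)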
