Single ad group ad testing

Modified on Fri, 27 Dec, 2024 at 9:16 AM



What is single ad group testing?


A single ad group test runs only on one ad group, comparing the ads within that ad group to each other. This is the most popular ad testing approach and is applicable to all PPC accounts. For other types of ad testing, please see multi ad group testing.


In Adalysis, you can test your existing ads based on their historical performance. You don't need to start any tests or create new ads. Instead, Adalysis tests existing ads automatically, wherever two or more ads compete. New ads will be included in tests automatically, for both text and image ads.


A single ad group test needs at least two active ads (and a maximum of 15) to run. For image ads, only ads with the same size are tested. By default, Adalysis compares six metrics for every test, though you can customize which metrics are used.
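The eligibility rules above can be sketched in code. This is a hypothetical illustration, not Adalysis's actual implementation; the ad dictionaries and field names are made-up example data:

```python
# Hypothetical sketch of the eligibility rules: a test needs 2-15 active
# ads, and image ads only compete against image ads of the same size.
from collections import defaultdict

def eligible_tests(ads):
    """Group an ad group's ads into testable sets (illustrative only)."""
    groups = defaultdict(list)
    for ad in ads:
        if ad["status"] != "active":
            continue  # paused/removed ads are excluded by default
        # All text ads compete with each other; image ads split by size.
        key = ad.get("size") if ad["type"] == "image" else "text"
        groups[key].append(ad["id"])
    # Only sets with between 2 and 15 ads can actually be tested.
    return {k: v for k, v in groups.items() if 2 <= len(v) <= 15}

ads = [
    {"id": 1, "type": "text", "status": "active"},
    {"id": 2, "type": "text", "status": "active"},
    {"id": 3, "type": "image", "size": "300x250", "status": "active"},
    {"id": 4, "type": "image", "size": "300x250", "status": "paused"},
]
print(eligible_tests(ads))  # {'text': [1, 2]}
```

Note how the lone active 300x250 image ad is dropped: with its same-size partner paused, there is nothing for it to compete against.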


Please note: By default, ad tests don't run for paused campaigns, ad groups, or ads. If you want to override this setting, please click here.

 

How do I run single ad group tests?


Adalysis will automatically run single ad group tests wherever it finds competing ads. This saves significant time compared to manual approaches. However, you can also create manual tests.


Automatic testing


Your Adalysis account is synchronized with your PPC account every day at around 4am. Adalysis will then:
  • Scan your ad groups and calculate date ranges during which the active ads within an ad group were running simultaneously. This is based on the Enabled Date of each ad. 
  • Run an ad test using the performance data of the date range.  
  • You can see statistically significant results (compatible with your thresholds) under Ad testing > Single ad group. Test results from the previous day are removed.
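The steps above can be sketched in code. This is a simplified illustration under assumptions of my own, not Adalysis's actual algorithm: the common date range starts at the latest Enabled Date among the competing ads, and statistical confidence for a CTR comparison is approximated with a standard two-proportion z-test. All figures are made-up example data:

```python
# Illustrative sketch: common date range + two-proportion z-test for CTR.
# The enabled dates and click/impression counts are invented examples.
from datetime import date
from math import sqrt, erf

ads = [
    {"name": "Ad A", "enabled": date(2024, 11, 1),
     "impressions": 12000, "clicks": 540},
    {"name": "Ad B", "enabled": date(2024, 11, 10),
     "impressions": 11500, "clicks": 460},
]

# 1. Both ads must have been running simultaneously, so the test's
#    date range starts at the latest Enabled Date.
test_start = max(ad["enabled"] for ad in ads)

# 2. Two-proportion z-test on CTR over that common range.
a, b = ads
p_a = a["clicks"] / a["impressions"]
p_b = b["clicks"] / b["impressions"]
p_pool = (a["clicks"] + b["clicks"]) / (a["impressions"] + b["impressions"])
se = sqrt(p_pool * (1 - p_pool)
          * (1 / a["impressions"] + 1 / b["impressions"]))
z = (p_a - p_b) / se

# Two-sided confidence that the CTR difference is real.
confidence = erf(abs(z) / sqrt(2))

print(test_start)   # 2024-11-10
print(confidence)   # above 0.90 here, so the result would be shown
```

With these example numbers the confidence clears a 90% threshold, so the result would surface in the test list; below the threshold it would be suppressed.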



Manual testing


You can also run unlimited manual tests:

  • Specify a date range. Using the "common date range" option mimics how the automated tests run.
  • Choose to see all test results irrespective of confidence level.
  • Override the current threshold values for this test run. (Your global threshold values will stay unchanged.)



Here are some common reasons Adalysis users choose additional manual tests:
  • Compare test results with different thresholds.
  • Test different date ranges to compare with the automated tests.
  • Test a subset of ads within the ad group, e.g. only mobile-preferred ads.
  • See results that aren't statistically significant.


Understanding the test results


The single ad group test results show:

  1. The number of active ads found and tested for each ad group.
  2. The date range used for the test.
  3. The algorithm's confidence for each test metric. Confidence less than 90% is displayed as --.
  4. The aggregate performance of all ads tested.



Click on any ad group name to see:


1. The winner and loser ads for each metric:

  • One winner ad for each metric with enough data, highlighted in green.
  • One loser ad for each metric with enough data, highlighted in red.
  • Neutral ads (neither winners nor losers), highlighted in yellow.
  • Metrics without enough data won't show a winner or loser.

2. The confidence the algorithm has in the result
3. The projected performance boost if you pause the loser ad.
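The winner/loser labelling described above can be sketched as follows. This is an illustrative example, not Adalysis's actual code; the metric values, the 90% threshold, and the function name are assumptions for demonstration:

```python
# Hypothetical sketch of per-metric winner/loser labelling.

def label_ads(metric_values, confidence, threshold=0.90, higher_is_better=True):
    """Return {ad_name: 'winner'|'loser'|'neutral'}, or {} if the metric
    lacks enough data to reach the confidence threshold."""
    if confidence < threshold:
        return {}  # no winner/loser is shown for this metric
    ranked = sorted(metric_values, key=metric_values.get,
                    reverse=higher_is_better)
    labels = {name: "neutral" for name in metric_values}  # yellow
    labels[ranked[0]] = "winner"    # highlighted in green
    labels[ranked[-1]] = "loser"    # highlighted in red
    return labels

# Made-up CTR values for three competing ads:
ctr = {"Ad A": 0.045, "Ad B": 0.040, "Ad C": 0.042}
print(label_ads(ctr, confidence=0.94))
# {'Ad A': 'winner', 'Ad B': 'loser', 'Ad C': 'neutral'}
```

For metrics where lower is better (such as CPA), the same logic applies with the ranking reversed via `higher_is_better=False`.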



Please note: You can navigate through the test results using the left/right arrows, or click Back to return to all test results.



Acting on your test results


Actions per ad group


  1. Pause the losing ad. All changes in Adalysis are pushed immediately to Google Ads or Microsoft Ads.
  2. Create a new ad to replace the losing one.
  3. Delete a test result or mark a test result as 'analyzed'. This changes the test result's color in the main list, so you can keep track of the results you've looked into.



Please note: When you pause or edit an ad, a copy of the test result is archived for future reference. Click here for your history of archived test results. You can also archive any test result yourself.


Bulk actions for all test results


Click Pause all CTR loser ads to pause all losing ads with one click. A similar bulk action is available for each of the other test metrics.



