The Creative Comparison tool lets you select multiple creatives and evaluate them side-by-side across key performance metrics. It is designed to help you quickly identify which creatives are driving results and which need to be replaced — without manually cross-referencing individual creative reports.
[Screenshot: Creative Comparison page with performance metrics and the Compare Creatives link]

Selecting Creatives to Compare

You can initiate a comparison from two places:
  1. From the Creative Library — select the checkboxes on two or more creative cards, then click Compare Selected.
  2. From the Comparison page directly — use the creative picker to search and add creatives by name or ID.
There is no hard limit on how many creatives you can compare at once, but for clarity, comparing two to five creatives at a time gives the most readable side-by-side view.

Side-by-Side Performance Metrics

Once creatives are selected, their metrics are displayed in aligned columns so you can compare values directly.
  • Impressions: Total number of times the creative was served.
  • Clicks: Total clicks generated by the creative.
  • CTR (Click-Through Rate): Clicks as a percentage of impressions.
  • Conversions: Number of leads or conversions attributed to the creative.
  • Conversion Rate (CVR): Conversions as a percentage of clicks.
  • Revenue: Total revenue attributed to the creative.
  • Cost per Conversion: Average cost to generate one conversion through this creative.
Metrics that are significantly higher or lower than the group average are highlighted automatically, so top and bottom performers are immediately visible.
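To make the derived metrics concrete, here is a minimal Python sketch of how CTR, CVR, and cost per conversion follow from the raw counts, and one simple way values far from the group average could be flagged. This is illustrative only, not Pingtree's actual implementation; the field names and the 25% highlight threshold are assumptions.

```python
# Illustrative sketch: derived metrics from raw counts, plus a simple
# "far from the group average" flag. Field names and the 25% threshold
# are assumptions, not Pingtree internals.

def derived_metrics(creative):
    """Compute CTR, CVR, and cost per conversion from raw counts."""
    impressions = creative["impressions"]
    clicks = creative["clicks"]
    conversions = creative["conversions"]
    cost = creative["cost"]  # assumed spend field
    return {
        "ctr": clicks / impressions * 100 if impressions else 0.0,
        "cvr": conversions / clicks * 100 if clicks else 0.0,
        "cost_per_conversion": cost / conversions if conversions else None,
    }

def flag_outliers(values, threshold=0.25):
    """Flag values more than `threshold` (25%) away from the group mean."""
    mean = sum(values) / len(values)
    return [abs(v - mean) / mean > threshold if mean else False for v in values]

creatives = [
    {"impressions": 10000, "clicks": 200, "conversions": 10, "cost": 50.0},
    {"impressions": 8000, "clicks": 400, "conversions": 40, "cost": 50.0},
]
metrics = [derived_metrics(c) for c in creatives]
print([m["ctr"] for m in metrics])                 # → [2.0, 5.0]
print(flag_outliers([m["ctr"] for m in metrics]))  # both far from the 3.5 mean
```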

Comparing Across Campaigns and Sources

By default, comparison metrics are aggregated across all campaigns and sources the creative has been used in. You can narrow the scope using the filters at the top of the comparison view:
  • Campaign: Restrict metrics to a specific campaign.
  • Source: Restrict metrics to traffic from a specific source.
  • Date Range: Compare performance over a specific time window.
This is useful when a creative has been used across multiple campaigns and you want to evaluate its performance in one specific context rather than in aggregate.
Tip: If a creative performs well in one campaign but poorly in another, use the Campaign filter to isolate performance by context. The issue may be the audience or placement rather than the creative itself.
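Conceptually, the filters narrow which records are summed before the metrics are computed. The sketch below shows that scoping logic in Python; the event records and field names are hypothetical, not Pingtree's data model.

```python
# Hypothetical sketch of filter scoping: aggregate totals across all
# campaigns/sources, or restrict to one. Record structure is assumed.

def aggregate(events, campaign=None, source=None):
    """Sum raw counts, optionally scoped to one campaign and/or source."""
    totals = {"impressions": 0, "clicks": 0, "conversions": 0}
    for e in events:
        if campaign and e["campaign"] != campaign:
            continue  # skip records outside the selected campaign
        if source and e["source"] != source:
            continue  # skip records outside the selected source
        for key in totals:
            totals[key] += e[key]
    return totals

events = [
    {"campaign": "Spring", "source": "email",
     "impressions": 5000, "clicks": 250, "conversions": 25},
    {"campaign": "Fall", "source": "search",
     "impressions": 5000, "clicks": 50, "conversions": 1},
]
print(aggregate(events))                     # all campaigns: 10000 impressions
print(aggregate(events, campaign="Spring"))  # scoped: 5000 impressions
```

The same creative can look strong in aggregate and weak in one scope (or vice versa), which is why the Campaign filter matters when diagnosing performance.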

Selecting Creatives for A/B Testing

Once you have identified creatives you want to test against each other, you can designate them as an A/B test directly from the comparison view:
  1. Select the creatives you want to test.
  2. Click Set as A/B Test.
  3. Assign the test to a campaign.
  4. Configure the traffic split (e.g., 50/50 or weighted).
  5. Save the test configuration.
Pingtree will distribute traffic between the selected creatives according to the split and track results independently, so you can return to the comparison view later to evaluate the outcome with live data.
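A weighted traffic split can be pictured as a cumulative-weight lottery per request. The sketch below shows one common way to implement this; it is a simplified illustration under assumed names, not Pingtree's routing logic.

```python
# Minimal sketch of a weighted traffic split (e.g. 50/50 or 70/30).
# Creative IDs and weights are hypothetical examples.
import random

def pick_creative(split):
    """Pick a creative ID according to weighted percentages."""
    r = random.uniform(0, 100)
    cumulative = 0
    for creative_id, weight in split:
        cumulative += weight
        if r < cumulative:
            return creative_id
    return split[-1][0]  # guard against floating-point edge cases

split = [("banner_a", 50), ("banner_b", 50)]
counts = {"banner_a": 0, "banner_b": 0}
for _ in range(10000):
    counts[pick_creative(split)] += 1
# Each creative receives roughly half of the 10,000 simulated requests.
print(counts)
```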

Saving Comparison Selections

Comparison sets can be saved so you do not have to rebuild them from scratch each time you want to revisit a group of creatives.
  • Click Save Comparison and give the set a name (e.g., Q2 Banner Variants).
  • Saved comparisons appear in the Saved Comparisons sidebar on the left of the page.
  • Open a saved comparison at any time to reload the same creatives with fresh metrics for the current date range.
Tip: Save a comparison set after a creative review meeting so you can quickly pull it up in the next session and see how metrics have evolved since then.

Identifying Top Performers

The comparison view is particularly useful for evaluating similar creative variations — for example, the same ad copy with different imagery, or the same design in multiple sizes. Common use cases:
  • Choosing the best banner from a set of variants: Compare all variants side-by-side and rank by CTR or CVR.
  • Evaluating a creative refresh: Compare the old creative against the new one over the same date range.
  • Reviewing creatives before pausing a campaign: Compare active creatives to determine which ones to keep live.
  • Post-A/B test analysis: Load both test creatives into the comparison view and review final results.
Tip: Sort the comparison table by Conversion Rate rather than clicks alone — a creative with fewer clicks but a higher CVR is often more valuable than a high-click creative that fails to convert.
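The tip above can be sketched in a few lines of Python: sorting by conversion rate instead of raw clicks surfaces the creative that actually converts. The creative names and numbers below are made up for illustration.

```python
# Sketch of ranking by CVR rather than click volume (hypothetical data).
creatives = [
    {"name": "high_click", "clicks": 1000, "conversions": 10},  # CVR 1%
    {"name": "high_cvr", "clicks": 200, "conversions": 20},     # CVR 10%
]

def cvr(creative):
    """Conversions as a fraction of clicks (0 if no clicks)."""
    return creative["conversions"] / creative["clicks"] if creative["clicks"] else 0.0

ranked = sorted(creatives, key=cvr, reverse=True)
print([c["name"] for c in ranked])  # → ['high_cvr', 'high_click']
```

Sorted by clicks alone, `high_click` would come first; sorted by CVR, the lower-traffic but better-converting creative rises to the top.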