Comprehensive Analysis of Brand Awareness Advertising

July 28, 2021
Analyzing display advertising is more complex than it may seem.

Oleksii Liakh

Many companies still evaluate display advertising based on the number of clicks and direct conversions. If clicks are low, the campaign is considered ineffective. As a result, display advertising is often seen as an “expensive branding exercise” that is difficult to measure — and even harder to justify to management.

The second challenge is data fragmentation. Display campaigns are typically launched simultaneously across Google, Meta, YouTube, programmatic platforms, and direct placements. Each system reports its own metrics, and consolidating them into a single, coherent view becomes a task in itself.

As a result, companies either focus solely on clicks or limit themselves to basic media metrics — failing to see the full impact of advertising on demand and sales.

In this article, the newage. team shares its comprehensive approach to display advertising analysis:
how to verify data quality, which metrics truly matter, and how to connect media campaigns to real business outcomes.

What is a Comprehensive Analysis of Display Campaigns?

Comprehensive analysis of display advertising is a methodology for evaluating campaign effectiveness that provides a holistic view of its real impact.

Within this approach, we analyze:

  • placement quality,
  • key media metrics,
  • measurable user response to advertising across digital tools,
  • and apply an agile approach for ongoing campaign optimization.

This framework allows us to move beyond isolated metrics and evaluate display advertising systematically — from the initial user contact to actual business outcomes.

How Comprehensive Analysis Works

Comprehensive Analysis is the core approach newage. uses in managing display advertising campaigns. It follows an iterative cycle through which we analyze and continuously optimize campaign performance.

Each iteration consists of four key components.

  • Placement Quality Control

Before analyzing any data, we ensure it is accurate and reflects the actual campaign performance. We verify viewability levels, confirm whether the ad contact was genuinely visible to users, and assess whether impressions can be considered valid and meaningful.

  • Media Metrics Analysis

Display advertising has its own core performance indicators: impressions, target audience reach, frequency, video views, brand lift, and more. At this stage, we evaluate how the campaign performs in terms of reach and audience engagement.

  • Ad Response Analysis

By analyzing post-click, post-view, and cross-device interactions, we determine how exposure to advertising influenced user behavior. Essentially, this is a performance-driven approach to brand campaigns — often referred to as brandformance.

  • Agile Approach

Our goal is to generate insights as quickly as possible and apply them to campaign optimization. An iterative process allows us to reallocate budgets, adjust settings, and improve performance while the campaign is still running.

Next, let’s take a closer look at each component.

Placement Quality Control

Inaccurate data inevitably leads to flawed conclusions and poor decision-making. That’s why the first and most critical step in Comprehensive Analysis is verifying data quality.

Display campaigns are typically launched across multiple channels simultaneously: Google, Meta, other global and local platforms, as well as direct media buys. Each platform has its own reporting system, and without additional verification, discrepancies are almost inevitable.

To maintain control over all placements, we integrate an independent tracking system (auditor) that measures key metrics using a unified methodology. Data from ad platforms, auditors, and publishers is aggregated into a single dashboard, where we verify that the numbers align and remain consistent.

First, we assess the “cleanliness” of the data: whether impressions, spend, and other metrics match across ad platforms, web analytics systems, and the independent auditor. A situation where the ad platform reports one million impressions, while the independent tracking system records only a fraction of them, is a clear red flag and cannot be ignored.
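As a minimal sketch of this “cleanliness” check, the comparison between platform-reported and auditor-measured impressions can be automated. The placement names and the 10% tolerance below are illustrative assumptions, not a fixed newage. threshold:

```python
# Hypothetical consistency check between ad-platform and independent-auditor data.
# Placement names and the 10% tolerance are illustrative assumptions.

def check_consistency(platform_impressions, auditor_impressions, tolerance=0.10):
    """Flag placements where auditor-measured impressions diverge from
    platform-reported impressions by more than the tolerance."""
    flagged = {}
    for placement, reported in platform_impressions.items():
        if reported == 0:
            continue
        measured = auditor_impressions.get(placement, 0)
        deviation = abs(reported - measured) / reported
        if deviation > tolerance:
            flagged[placement] = round(deviation, 2)
    return flagged

platform = {"site_a": 1_000_000, "site_b": 500_000}
auditor = {"site_a": 950_000, "site_b": 120_000}
print(check_consistency(platform, auditor))  # {'site_b': 0.76}
```

A deviation like the 76% on `site_b` above is exactly the kind of red flag described earlier: the placement should be investigated before its data is used in any analysis.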

We also evaluate the quality of the placements themselves:

  • whether the ad exposure was actually viewable to the user;
  • whether the delivered format matches the planned one;
  • which publishers or websites the ads appeared on;
  • and how frequently the audience was exposed to the ads.

One of the most critical metrics here is viewability (Active View). If viewability is, for example, only 10%, it is not accurate to speak about total reach, but rather about the portion of the audience that had a real opportunity to see the ad.
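The gap between nominal and viewable reach can be illustrated with a back-of-the-envelope calculation. This sketch assumes viewability applies uniformly across impressions, which real campaigns rarely satisfy exactly:

```python
# Illustrative estimate of effective (viewable) reach vs. nominal reach,
# assuming viewability is distributed uniformly across impressions.

def viewable_reach(total_impressions, viewability_rate, avg_frequency):
    """Estimate how many users had a real opportunity to see the ad."""
    viewable_impressions = total_impressions * viewability_rate
    return int(viewable_impressions / avg_frequency)

# 1M impressions at 10% viewability and an average frequency of 4:
# nominal reach would be 250,000 users, but only 25,000 had a real contact.
print(viewable_reach(1_000_000, 0.10, 4))  # 25000
```

This is why a 10% viewability figure changes the conversation from “total reach” to “the portion of the audience that actually saw the ad.”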

What should be done with data that fails validation? It should not be used for analysis. If the foundation is flawed, any conclusions drawn from it will also be inaccurate.

That’s why we take a preventive approach. Before the full campaign launch, we run a pre-test with approximately 1,000 impressions and use an internal tracking checklist to verify data accuracy. This helps us identify technical issues early and avoid distortions in reporting.

As a result, tracking inconsistencies have become a rare exception in recent years.

Media Metrics Analysis

At this stage, we analyze the core media performance metrics of the campaign:

  • impressions;
  • target audience reach;
  • reach adjusted for frequency;
  • video completion rates;
  • brand lift;
  • key brand health indicators;
  • trends in branded search queries;
  • growth in direct traffic, and more.

A critical aspect here is evaluating reach based specifically on Active View (viewable) impressions. In other words, we consider only those exposures that were actually visible to users. In practice, this factor is often overlooked, which can significantly distort results.

Some media metrics are based on research methodologies — for example, focus group surveys measuring brand awareness or campaign perception. While these are standard industry tools, they generally involve a higher margin of error compared to technical metrics captured directly by analytics systems.

That said, both research-based and technical metrics should be analyzed together. Within the Comprehensive Analysis framework, we rely on both, as they complement each other and provide a more complete picture of campaign effectiveness.

This approach is especially relevant for brands where direct digital response is difficult to track, but evaluating overall impact on demand and brand awareness remains essential.

Ad Response

In display advertising, users rarely click on an ad and make a purchase immediately. There are several reasons for this.

First, display formats often reach users when they are not actively searching for a product. For example, someone may open YouTube to watch an interview, not to look for a specific brand or make a purchase.

Second, display advertising typically works through cumulative exposure. A single impression rarely builds trust; brand recall requires frequency and repeated contact.

Third, products with longer decision-making cycles require gradual nurturing. It’s unrealistic to expect someone to purchase an apartment or a complex service after seeing just one video.

As a result, the impact of display advertising is often delayed. While this may seem abstract without data, in practice, it can be measured.

To connect media activity with performance outcomes, we analyze user behavior after ad exposure. In this context, we distinguish three types of conversions.

  1. Post-click refers to visits that occur directly from an ad. A user sees the creative, clicks on it, and completes further actions on the website within the same session or attribution window.
  2. Post-view refers to conversions that happen after a user views an ad but does not click on it. For example, a user may see a banner or video today, take no immediate action, and later visit the website through a branded search or direct visit. In this case, the system records the ad impression and the subsequent action within the same browser over a defined attribution period.
  3. Cross-device refers to delayed actions taken on a different device. For instance, a user may see an ad on their smartphone and later visit the website from a laptop or tablet. With the appropriate analytics tools in place, these interactions can also be incorporated into the overall impact assessment of a display campaign.
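The three response types above can be sketched as a simple classification rule. The event fields below are illustrative assumptions, not any specific platform’s schema, and real attribution systems apply far more nuanced logic (attribution windows, identity graphs, and so on):

```python
# Simplified sketch of classifying conversions by response type.
# Field names are illustrative assumptions, not a real platform schema.

def classify_conversion(event):
    """Label a conversion as post-click, post-view, or cross-device."""
    if event["clicked_ad"]:
        return "post-click"          # user clicked the ad and converted
    if event["device_at_exposure"] != event["device_at_conversion"]:
        return "cross-device"        # saw the ad on one device, converted on another
    return "post-view"               # same device, no click, delayed return

events = [
    {"clicked_ad": True,  "device_at_exposure": "mobile", "device_at_conversion": "mobile"},
    {"clicked_ad": False, "device_at_exposure": "mobile", "device_at_conversion": "desktop"},
    {"clicked_ad": False, "device_at_exposure": "mobile", "device_at_conversion": "mobile"},
]
print([classify_conversion(e) for e in events])
# ['post-click', 'cross-device', 'post-view']
```

Counting conversions by these labels is what reveals how small the post-click share actually is.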

Most advertisers evaluate display campaigns solely based on post-click conversions, since this metric is readily available in every ad platform. However, clicks represent only a portion of the true impact of display advertising.

Based on our experience, among users who visit a website after being exposed to an ad, direct clicks typically account for only 20–30% of total visits. Among those who ultimately move through the funnel and complete a final conversion, the share of post-click interactions is usually less than 10%.

This means that by analyzing clicks alone, a company is seeing only a small part of the overall picture.

If post-view and cross-device interactions are not taken into account, campaign effectiveness can be significantly underestimated. In such cases, it is either necessary to implement additional measurement tools or to rely on media metrics, while clearly understanding the limitations of that approach.

By analyzing post-click, post-view, and cross-device data, you can not only measure user response but also generate actionable insights for campaign optimization.

This approach helps answer key questions such as:

  • Which segments or targeting strategies perform better, and which fail to deliver results?
  • What frequency is optimal for a specific audience?
  • Which creatives not only drive clicks but also generate delayed conversions?
  • How long after exposure does a user return to the website?
  • Which channels do users use to return after ad exposure (organic, direct, paid, etc.)?

Answers to these questions allow companies to move beyond general campaign evaluation and make informed, data-driven management decisions.

Optimal Campaign Frequency

By analyzing data on frequency, the number of users reached, and how those users went on to visit the website, view a product, or contact our client, we can calculate the cost of each response. This data answers the question of what frequency is optimal for a given campaign. In the first example below, a frequency of more than 4 impressions per user per week is no longer cost-effective for the client (an employment website), while the second chart (a retail website) shows that “high-profile”, high-frequency campaigns remain important for that brand.
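The frequency analysis described above can be sketched as a cost-per-response calculation by frequency bucket. All numbers below are invented for illustration; real figures would come from the tracking system:

```python
# Hypothetical frequency analysis: cost per responding user by weekly frequency.
# All figures are illustrative, not real campaign data.

def cost_per_response(frequency_buckets, cpm):
    """For each weekly frequency bucket, estimate the cost per user
    who visited the site after exposure."""
    result = {}
    for freq, (users, responders) in frequency_buckets.items():
        spend = users * freq * cpm / 1000  # impressions served at this frequency
        result[freq] = round(spend / responders, 2) if responders else None
    return result

buckets = {  # frequency: (users reached, users who responded)
    1: (100_000, 1_000),
    2: (80_000, 1_400),
    4: (50_000, 1_500),
    8: (20_000, 700),
}
print(cost_per_response(buckets, cpm=2.0))
# {1: 0.2, 2: 0.23, 4: 0.27, 8: 0.46}
```

In this made-up dataset, cost per response climbs sharply above 4 impressions per week, which is the kind of signal used to cap frequency for a segment.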

The Efficiency of Creatives

We often want to know which creative performed better. Click-through rate (CTR) is sometimes used for this, but it is a flawed approach at its core, since clicks capture only the immediate response. Using the data collected through Comprehensive Analysis, we can instead show which banner brought more of the audience to the client’s site and drove more conversions or other desired actions.

In the example below, we see that in some cases, banner ads can be more effective than more expensive options such as video campaigns. In this case, the data allowed us to improve the efficiency of the client’s advertising campaign by more than 30%. Many users had already seen the video ad on TV; the banner served as a reminder, driving the audience to action.

You do not have to guess which banner works better: run a pre-test, collect statistics, and keep the one with the best results.
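A sketch of this creative comparison: rank creatives by total conversions (post-click plus post-view) per 1,000 impressions rather than by CTR. The figures are made up for the example:

```python
# Illustrative comparison of creatives by downstream response rather than CTR.
# All figures are invented for the sake of the example.

def creative_performance(stats):
    """Rank creatives by total conversions (post-click + post-view)
    per 1,000 impressions."""
    ranked = {}
    for name, s in stats.items():
        conversions = s["post_click_conv"] + s["post_view_conv"]
        ranked[name] = round(conversions / s["impressions"] * 1000, 2)
    return dict(sorted(ranked.items(), key=lambda kv: kv[1], reverse=True))

stats = {
    "video":  {"impressions": 500_000, "post_click_conv": 400, "post_view_conv": 600},
    "banner": {"impressions": 500_000, "post_click_conv": 250, "post_view_conv": 1_250},
}
print(creative_performance(stats))
# {'banner': 3.0, 'video': 2.0} — banner wins on conversions despite fewer clicks
```

Here the banner generates fewer post-click conversions but more total conversions, which a CTR-only comparison would miss entirely.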


How Often Should Users See Ads

Frequency-response data of this kind shows how users react to ads over a given period.

With it, you can determine when additional impressions stop being effective and when users need to see the ad again.

Placement and Targeting Optimization

Not all websites and targeting types are effective. Therefore, you need to evaluate which placements, targeting, and audience segments work best. The actual results may surprise you.

Channel Optimization

It’s important to understand that a brand awareness campaign only generates demand; the rest of the funnel must then capture it. To evaluate this and build an attribution map, it is essential to collect data on how users visit the site after seeing brand awareness ads and which chain of ad contacts is the most effective.

Take note of the data and adjust your campaigns to the trends it reveals. Data can answer many of the questions that matter when planning and optimizing campaigns.

Previously, expensive focus groups and field research were needed to answer such questions. Now they can be answered on a live campaign: you can make edits on the spot, change settings, and test hypotheses in real time.

Agile Optimization

Comprehensive Analysis of display advertising is a complex and cyclical approach. Its purpose is not to evaluate a campaign once and document the results. Its real value lies in continuous optimization, regular testing of hypotheses, and systematic budget management.

All previous stages of analysis should be repeated throughout the campaign lifecycle. Data should serve decision-making, not just reporting — informing adjustments to frequency, budget reallocation, creative updates, or targeting refinements.

We have repeatedly modeled scenarios in which campaigns continued running with their initial settings and no further optimization. The comparison showed that each iteration of improvements can increase budget efficiency by an average of 20–30%, while the overall performance relative to the campaign’s starting point can more than double.

Thanks to this approach, display advertising stops being perceived as a “branding expense” and becomes a controllable tool for driving demand.

By the way, delayed demand is a separate mechanism that allows businesses to forecast and maintain sales stability. You can learn more about it in our article, “What Is Delayed Demand and How to Use It Effectively.”

Conclusions

The effectiveness of display advertising can and should be measured. To do so, it is essential to evaluate campaigns holistically — this is exactly how Comprehensive Analysis works.

The methodology is built on four key elements:

  • Placement Quality Control

Verifying data accuracy is the foundation of any analysis. If data is collected incorrectly, the conclusions — and the decisions based on them — will also be flawed.

  • Media Metrics Analysis

Assessing the overall “health” of the campaign: reach, frequency, video completion rates, brand metric dynamics, and other indicators that reflect communication impact.

  • Post-click, Post-view, and Cross-device Conversion Analysis

This stage connects media activity to performance outcomes and makes it possible to measure the delayed impact of advertising on demand and sales.

  • Agile Optimization

An iterative, data-driven process that enables continuous campaign improvement, hypothesis testing, and more efficient budget utilization.

A comprehensive approach allows companies to move beyond the formal analysis of impressions and clicks toward systematic management of display advertising as a true business growth driver.

newage. is a full-service digital agency specializing in display advertising and performance marketing. We help brands build awareness, generate demand, and measure the real business impact of their campaigns. If you want to better understand how your display advertising truly performs, we would be happy to discuss it further.

FAQ: Questions About Display Advertising Analysis

Can display advertising effectiveness be evaluated based only on clicks?

No. Post-click conversions usually account for only a small portion of a display campaign’s overall impact. A significant share of users respond with a delay — through post-view or cross-device interactions. If you analyze clicks alone, you may significantly underestimate the true effectiveness of your advertising.

Why is viewability more important than the number of impressions?

An impression does not necessarily equal a real ad exposure. If an ad was not actually visible to a user, it cannot be considered a meaningful contact. Therefore, accurate campaign evaluation should be based on viewable impressions rather than the total number of ad loads.

How long does the effect of display advertising last?

The effect can last from several days to several weeks, depending on the product, contact frequency, and the length of the decision-making cycle. Analyzing post-view and cross-device conversions helps determine the actual lag between ad exposure and conversion.

What is the optimal frequency of ad exposure?

There is no universal answer. The optimal frequency varies depending on the audience and campaign objectives. That’s why it’s important to analyze data by segments and identify the point at which additional exposures improve performance rather than overwhelm the user.

Is Comprehensive Analysis suitable for small and medium-sized businesses?

Yes. The approach can be adapted to different budget levels. Even with limited resources, it is essential to control placement quality, analyze media metrics, and account for delayed conversions. This helps avoid misleading conclusions and enables more efficient budget allocation.
