It’s Time For A Smarter Approach To Measuring Supply-Side Signals

For years, pervasive uncertainty surrounding cookies has weighed on publishers.

DSPs used cookies to reach defined audiences without meaningful involvement from publishers. This dynamic disintermediated publishers, cutting them off from their ability to drive better results for advertisers through their deep understanding of their sites’ experiences, contexts and audiences.

Now, the future value of cookies as a core targeting and measurement mechanism is in doubt.

In theory, this represents an opportunity for publishers. In the absence of third-party cookies, they have a chance to meaningfully strengthen their buy-side relationships through their own signals (or those of partners). Signals that originate on the supply side can provide a better, more measurable substrate for bidders. Savvy publishers are already investing in addressability strategies built on their own supply-side signals.

In practice, however, it’s rarely that simple. As many publishers know all too well, the value of supply-side signals is hard to pin down: it’s difficult to tell which signal providers are generating real results. That blind spot holds publishers back from achieving maximum ROI on their ad tech investments.

Let’s take a deeper dive into the signal-related challenges publishers face – and what they should look for in a solution.

The status quo isn’t working: The problem with measuring supply-side signals

The best practice is for publishers to compare first- and third-party signals before investing in a first-party signal strategy. But it can be difficult for publishers to know which third-party signal providers are actually driving results.

Publishers’ options for gleaning insights on the impact of supply-side signals are generally expensive, cumbersome and inexact. To measure the lift that different providers generate, publishers need to design and execute experiments that involve manually switching providers on and off for set periods of time, then analyzing the impact on revenue.
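
To make the weakness concrete, here’s a minimal sketch – in Python, with made-up numbers – of what this manual on/off comparison boils down to:

```python
# Illustrative only: the manual on/off approach described above, reduced
# to code. Daily RPM (revenue per thousand impressions) values here are
# invented; in practice they would come from an ad server report.
rpm_provider_on = [2.10, 2.25, 2.05, 2.30]    # window with provider enabled
rpm_provider_off = [1.95, 2.00, 1.90, 2.05]   # later window, provider disabled

mean_on = sum(rpm_provider_on) / len(rpm_provider_on)
mean_off = sum(rpm_provider_off) / len(rpm_provider_off)

lift = (mean_on - mean_off) / mean_off
print(f"Apparent lift from provider: {lift:.1%}")  # ~10.1%

# Caveat: because the two windows are sequential rather than concurrent,
# seasonality and demand shifts are confounded with the provider's true
# effect -- the core weakness of manual on/off testing.
```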

Given the stakes, most publishers want to approach these initiatives with more scientific rigor. But resourcing challenges make comprehensive experimentation a difficult – and very costly – proposition.

Moreover, measuring signal impact is not a point-in-time effort. As cookies decline, the value of other identifiers will likely increase – but not necessarily uniformly across providers. And with all the tinkering that publishers do, the setup behind a manual signal test run in March may no longer hold by May. Without continuous testing, there’s no way to know for sure.

Combined, these issues make it nearly impossible for publishers to capitalize on the opportunity before them: investing in and merchandising the supply-side signals that offer upstream benefits to advertisers, helping them deliver the right message to the right user at the right time.

What publishers should look for in a signal measurement tool


Forward-thinking publishers are already using new measurement tools to combat this addressability challenge. But publishers can’t afford to waste their valuable ad tech budgets on tools that don’t measure up. Here’s a checklist of what publishers should look for when weighing their signal measurement tool options.

*Out-of-the-box, automated A/B testing framework. A rigorous testing framework that produces statistically significant results is key. The tool should come with predesigned options, built by expert data scientists, that allow publishers to begin deriving insights on signal lift simply by turning the tool on.

What’s more, the tool should deliver these insights on a per-bidder basis, allowing publishers to troubleshoot as needed and further strengthening their ability to work productively with advertising partners.

Rigorous and flexible solutions are far more likely to be implemented server-side; client-side scripts or plug-ins cannot manage experiment decisioning at internet scale. (A sketch of this kind of server-side decisioning follows this checklist.)

*Always-on functionality. Confining lift analyses to specific time periods will likely taint publishers’ insights with temporal skews and fail to capture longitudinal shifts in value. In contrast, an always-on tool provides clean, perpetual insights into when and how different providers are impacting the bidstream.

This level of consistent visibility into different providers also makes troubleshooting much simpler and faster. If one provider isn’t working properly, the right tool will alert publishers and empower them to take corrective action right away.

*Real insights without the risk. Manually analyzing signals across multiple vendors often involves disengaging individual providers for extended periods, which can have a volatile impact on revenue. More sophisticated tests can produce statistically sound results with a fraction of the traffic, giving publishers better insights with less risk.
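
As referenced above, here is a minimal, hypothetical sketch of what server-side experiment decisioning with a small holdout can look like: each ad request is deterministically bucketed by hash, and the control arm has its third-party ID signals stripped before bidding. This is not any vendor’s actual implementation – the salt, holdout size and field names are invented for illustration:

```python
# A minimal sketch of server-side experiment decisioning. Not any
# particular product's implementation -- just one standard way to run an
# always-on holdout test for third-party ID signals.
import hashlib

HOLDOUT_PCT = 5  # small control group keeps the revenue at risk low

def assign_arm(request_id: str, salt: str = "signal-test-v1") -> str:
    """Deterministically bucket a request into 'control' or 'treatment'."""
    digest = hashlib.sha256(f"{salt}:{request_id}".encode()).digest()
    bucket = int.from_bytes(digest[:8], "big") % 100
    return "control" if bucket < HOLDOUT_PCT else "treatment"

def build_bid_request(request_id: str, third_party_ids: dict) -> dict:
    """Strip third-party ID signals for the control arm before bidding."""
    arm = assign_arm(request_id)
    ids = {} if arm == "control" else third_party_ids
    return {"request_id": request_id, "arm": arm, "eids": ids}

print(build_bid_request("req-123", {"providerA": "abc", "providerB": "xyz"}))
```

Because the hash is deterministic, the same request identifier always lands in the same arm, keeping the split clean across page views and over time – which is exactly what always-on measurement requires.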

The next generation of signal insights

The good news is that these types of tools already exist. At Amazon Publisher Services, we wanted publishers to better understand their supply-side signals and the impact of third-party signal providers on their ad inventory in the bidstream. So we built Signal IQ. Built on a rigorous server-side A/B testing framework, its reporting illustrates how third-party IDs influence key metrics such as bid rates, bid CPMs and revenue from specific bidders in the APS Transparent Ad Marketplace (TAM).
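
Signal IQ’s internals aren’t detailed here, but purely as an illustration, per-bidder metrics like the bid rates and bid CPMs named above can be aggregated from logged auctions in a split test along these lines (field names are hypothetical):

```python
# Illustrative aggregation of per-bidder, per-arm experiment metrics from
# an auction log. Field names ("bidder", "arm", "bid_cpm") are invented.
from collections import defaultdict

def per_bidder_metrics(auction_log: list[dict]) -> dict:
    """Compute bid rate and average bid CPM for each (bidder, arm) pair."""
    stats = defaultdict(lambda: {"requests": 0, "bids": 0, "cpm_sum": 0.0})
    for row in auction_log:
        s = stats[(row["bidder"], row["arm"])]
        s["requests"] += 1
        if row["bid_cpm"] is not None:  # None means the bidder didn't bid
            s["bids"] += 1
            s["cpm_sum"] += row["bid_cpm"]
    return {
        key: {
            "bid_rate": s["bids"] / s["requests"],
            "avg_bid_cpm": s["cpm_sum"] / s["bids"] if s["bids"] else 0.0,
        }
        for key, s in stats.items()
    }

log = [
    {"bidder": "bidderA", "arm": "treatment", "bid_cpm": 2.40},
    {"bidder": "bidderA", "arm": "control", "bid_cpm": None},  # no bid
    {"bidder": "bidderA", "arm": "control", "bid_cpm": 1.90},
]
print(per_bidder_metrics(log))
```

Comparing treatment against control per bidder is what isolates each provider’s contribution to the metrics above.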

Since launching in beta this past June, Signal IQ has helped publishers confidently ramp third-party ID signal density by 59%. Publishers in this test group are seeing third-party IDs contribute to an average increase of 20% or greater in their TAM earnings.*

With the right tools, publishers can build a breadth of insights designed to drive results across a number of key metrics. Even better, they can use these insights to refine their addressability strategies at low cost, making “adaptability” a core competency even as the digital advertising landscape evolves at an ever-greater pace.

*Source: Amazon internal data, US, August 2024. Results are representative of APS Transparent Ad Marketplace average earnings across n=58 sampled publishers.
