
A Window Is Opening To Give Publishers Control Over Their Audiences


“The Sell Sider” is a column written by the sell side of the digital media community.

Today’s column is written by James Curran, CEO and co-founder at STAQ.

Advertisers have amassed huge first-party audiences in their data management platforms (DMPs) and through third-party data partners integrated with their media-buying technologies.

This has allowed advertisers to accurately cherry-pick their audiences on publisher sites in real time. Publishers are forced to open their sites to advertisers who will have their way with them.

DMPs haven’t been as valuable for publishers, who use them more like a catalog of historical audience information, unattached to the actual buying and selling process. If advertisers want to buy an audience, publishers look up past volume rates, put a deal together and cross their fingers that the makegoods aren’t too extreme.

The overlap between audiences is often nowhere near 100%, as each company uses different methodologies to categorize sports enthusiasts or new moms. Using a third-party data solution doesn’t always help due to match rate issues in a daisy chain of disconnected systems between the buy and sell sides.

There are two opportunities emerging in the market that can help change that with enhanced automation: the formation of a shared first-party ID from DigiTrust and server-side header bidding.

One ID To Rule Them All

An innovative idea that’s becoming a reality is a shared first-party ID. The nonprofit DigiTrust has buy-in from more than 20 demand-side platforms (DSPs), supply-side platforms (SSPs) and other tech middlemen to lead this effort, and is now recruiting publishers to be part of the first tests.

A single ID obviously helps the middlemen by increasing match rates and eliminating the need for syncing, but these same features help publishers just as much. With a shared ID, match rates on publisher audiences would be nearly 100% and the ability to sync audience data with third parties would be much simpler and quicker.
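To make the match-rate point concrete, here is a minimal sketch using hypothetical user sets: with a pairwise cookie sync, only a fraction of the users both sides know can be linked at auction time, while a shared first-party ID links essentially all of them.

```python
# Minimal sketch (hypothetical data): match rates under a pairwise cookie sync
# versus a shared first-party ID.

def match_rate(publisher_people, advertiser_people, cookie_synced=None):
    """Of the people both sides know, what share can actually be matched at auction time?

    cookie_synced is the subset of the publisher's users covered by a pairwise
    cookie sync; if omitted, both sides read the same shared first-party ID
    (a DigiTrust-style ID), so every common user matches.
    """
    common = publisher_people & advertiser_people
    matched = common if cookie_synced is None else common & cookie_synced
    return len(matched) / len(common)


# Hypothetical numbers: 10,000 publisher users, 6,000 of them known to the
# advertiser's DMP, but the cookie sync only reaches one in five users.
publisher_people = {f"person{i}" for i in range(10_000)}
advertiser_people = {f"person{i}" for i in range(4_000, 10_000)}
cookie_synced = {f"person{i}" for i in range(0, 10_000, 5)}

print(f"Pairwise cookie sync: {match_rate(publisher_people, advertiser_people, cookie_synced):.0%}")  # ~20%
print(f"Shared ID:            {match_rate(publisher_people, advertiser_people):.0%}")                  # 100%
```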

With synced audience data available during a programmatic auction, publishers could be much more confident in their ability to book an audience-targeted campaign. In addition to reducing makegood issues, publishers could more accurately forecast volume and price. They could even pre-book impressions on their sites for buyers.

Publishers could start to control pricing rather than be at the mercy of advertisers, who have historically held all the cards by having the de facto DMP of record. Publishers could start to analyze performance more granularly, and even price audiences in different subsegments. More recent or in-market audience groups could be pricier than stale leads, for example.
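As a rough illustration of that pricing idea, the sketch below shows how a publisher might set a higher floor for users who entered an in-market segment recently than for stale leads. The tiers and CPMs are hypothetical, not a description of any product.

```python
from datetime import datetime, timedelta, timezone

# Hypothetical tiers: price recently in-market users above stale leads.
def floor_cpm(segment_entered_at, base_cpm=2.00, now=None):
    age = (now or datetime.now(timezone.utc)) - segment_entered_at
    if age <= timedelta(days=7):
        return round(base_cpm * 2.5, 2)   # fresh, in-market users command a premium
    if age <= timedelta(days=30):
        return round(base_cpm * 1.5, 2)
    return base_cpm                       # stale leads fall back to the base floor

now = datetime.now(timezone.utc)
print(floor_cpm(now - timedelta(days=3), now=now))   # 5.0
print(floor_cpm(now - timedelta(days=60), now=now))  # 2.0
```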

With a connected DMP, publishers would start to learn over time and be able to price attractive audiences higher as scarcity goes up. They could start organizing their DMP by advertiser attribute to understand how someone is valued by different buyers, making the auction harder for advertisers to game, and easily create meaningful people-based media plans at scale.

Server-Side Side Effect

When publishers consider the value of moving programmatic advertising partners from browser-based header bidding to server-side header bidding, they often overlook an exciting opportunity that server-side header bidding brings with it: real-time access to audience data.

With a server-side header bidding solution such as Media.net, Amazon or Sonobi, publishers can create a link between their DMP, sell-side platforms and DSPs that can then connect to the advertiser’s DMP and other data partners.
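As a sketch of that link, a server-side wrapper could attach the publisher's audience segments to each outgoing bid request. The DMP lookup, segment names and endpoint below are hypothetical, and the payload only loosely follows OpenRTB's user.data/segment structure.

```python
import json
import urllib.request

# Hypothetical DMP lookup: in practice this would hit the publisher's DMP
# profile API, keyed on a shared or synced user ID.
def fetch_dmp_segments(user_id):
    return [{"id": "moms_25_34"}, {"id": "in_market_toddler_gear"}]

# Build a bid request that carries the publisher's audience data, loosely
# following OpenRTB's user.data/segment structure.
def build_bid_request(user_id, placement_id):
    return {
        "id": "req-123",
        "imp": [{"id": "1", "tagid": placement_id, "banner": {"w": 300, "h": 250}}],
        "user": {
            "id": user_id,
            "data": [{"name": "publisher-dmp", "segment": fetch_dmp_segments(user_id)}],
        },
    }

# Forward the enriched request to a demand partner server-to-server.
def send_to_demand_partner(bid_request, endpoint):
    req = urllib.request.Request(
        endpoint,
        data=json.dumps(bid_request).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    return urllib.request.urlopen(req)

# Usage with a made-up endpoint:
# send_to_demand_partner(build_bid_request("shared-id-abc", "homepage_top"),
#                        "https://bidder.example.com/openrtb2")
```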

For example, imagine that Target wants to use its vast offline data from Acxiom to target a digital campaign to moms 25-34 years old with one child between the ages of 1 and 3. Acxiom, through its LiveRamp subsidiary, would find enough publishers to create the scale required for the budget through a cookie sync during the media-planning phase.

Then let’s say a publisher uses Krux for its own audience data. If LiveRamp and Krux sync, and if Krux syncs with the server-side header bidder, the circuit is complete, allowing media buyers and sellers to compare their audience data more accurately in real time.

Without server-to-server technology on the publisher side, LiveRamp is one of the few companies that can sync people-based marketing campaigns with publishers directly, but it requires a sophisticated business development effort spearheaded by the Arbor and Circulate teams that it recently acquired. Server-side technology will theoretically remove the friction and facilitate scale in a virtuous circle.

While a DigiTrust ID wouldn’t remove the need to sync user profiles, it would help everyone outside of stable ID environments, such as Google, Facebook or LiveRamp, achieve scale on these synced campaigns by unifying the ID.

Most of the top 100 digital advertisers are now deeply engaged in people-based marketing efforts. These campaigns evolve rapidly, with increasing sophistication and complexity in their taxonomy. The smart advertisers are learning from their spend and enhancing their practice in an iterative process.

Doing a manual cookie sync with a large publisher freezes the plan at the moment of that sync, which could become obsolete very quickly as the advertiser’s marketing plan evolves. Server-side technology at the publisher would enable advertisers to keep the sync live, so that expectations on price and scale can be gauged without any business development effort. There is real potential that it could automate away the last remaining friction between buyers and sellers of data-targeted inventory.

Publishers have the opportunity to measure revenue per user accurately for the first time ever, but be warned: A server-side header bidding solution isn’t an instant cornucopia of insight for publishers. Event-level reporting results in huge data sets that require technology and analytics capabilities to process into meaningful reporting.

Publishers will want to join their DoubleClick for Publishers data transfer file to their server-side bidding solution’s logs so they can roll up dimensions and metrics such as URL, placement, segment, advertiser, device and geo. From there, the next frontier is measuring ROI on paid content syndication, which has been an elusive goal.
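As a rough sketch of that join, the rollup might look like the snippet below. The file names and columns are hypothetical; real data transfer schemas differ.

```python
import pandas as pd

# Hypothetical files and columns; actual DFP data transfer schemas differ.
dfp = pd.read_csv("dfp_data_transfer.csv")        # KeyPart, UserId, AdUnitId, URL, Device, Geo
bids = pd.read_csv("server_side_bid_events.csv")  # KeyPart, Advertiser, Segment, ClearingPriceUSD

# Join the ad server log to the server-side bidder's event log on a shared impression key.
joined = dfp.merge(bids, on="KeyPart", how="inner")

# Roll up cleared revenue by the dimensions mentioned above.
rollup = (
    joined.groupby(["URL", "AdUnitId", "Segment", "Advertiser", "Device", "Geo"])["ClearingPriceUSD"]
    .sum()
    .reset_index()
)

# Revenue per user: total cleared revenue divided by distinct users seen.
revenue_per_user = joined["ClearingPriceUSD"].sum() / joined["UserId"].nunique()
print(rollup.head())
print(f"Revenue per user: ${revenue_per_user:.4f}")
```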

Follow James Curran (@james_curran), STAQ (@STAQ) and AdExchanger (@adexchanger) on Twitter.
