“The Sell Sider” is written by members of the media community and contains fresh ideas on the digital revolution in media.
Today’s column is written by Scott Messer, SVP of media at Leaf Group.
As the cookie crumbles, publishers are on the precipice of a power-balance reset, and the opportunity is theirs to lose.
In regione caecorum rex est luscus: In the land of the blind, the one-eyed man is king.
Publishers prize their understanding of their audiences, and they have strategically regulated access to foster direct relationships with advertisers. After a flurry of middlemen separated audiences from publishers and resold them to buyers, publishers finally kicked out most offenders and shored up data leakage. Buyers can still find “the right person,” but usurping valuable publisher data is no longer part of the playbook for doing so.
Today, crumbling cookies diminish the relevance of audience data leakage, but a lurking threat rises from the rubble: contextual scrapers. What’s more persistent than a cookie or even a hashed ID? The URL itself.
Typically, content is hosted at a specific URL, and rarely does that URL or the content topic within it change. Each URL or article has a context consisting of its categories, topics, verticals and more. Thus, each URL holds information about “the right place” for finding potential customers.
In the land of context, those who know it will be king
Contextual scrapers (also known as bots, spiders or crawlers) are already crawling publisher pages every day to figure out what is on those URLs. The crawlers grab the content at each URL, analyze it and save the resulting URL-topic match table for later. This contextual indexing requires no onsite integration. Publisher content is public facing, and while the content itself is copyrighted, it has no protection from being classified. Publishers cannot prevent this from happening.
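The crawl-and-classify loop described above can be sketched in a few lines. This is a minimal illustration only: real contextual vendors use ML classifiers and rich taxonomies, and the `TOPIC_KEYWORDS` lexicon, `classify` function and example URL here are invented for the sketch.

```python
import re
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collects visible page text, skipping <script> and <style> blocks."""
    def __init__(self):
        super().__init__()
        self.chunks = []
        self._skip = 0
    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1
    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1
    def handle_data(self, data):
        if not self._skip:
            self.chunks.append(data)

# Hypothetical topic lexicon; commercial classifiers use far larger
# taxonomies (e.g. the IAB content taxonomy) and statistical models.
TOPIC_KEYWORDS = {
    "home-garden": {"garden", "planting", "soil", "lawn"},
    "finance": {"mortgage", "savings", "interest", "loan"},
}

def classify(url, html):
    """Return one row of the URL-topic match table: (url, best topic)."""
    parser = TextExtractor()
    parser.feed(html)
    words = set(re.findall(r"[a-z]+", " ".join(parser.chunks).lower()))
    scores = {topic: len(words & kws) for topic, kws in TOPIC_KEYWORDS.items()}
    return url, max(scores, key=scores.get)

page = "<html><body><p>Preparing soil for spring planting in your garden.</p></body></html>"
classify("https://example.com/spring-planting", page)
# → ("https://example.com/spring-planting", "home-garden")
```

Because the page is public, nothing in this loop ever touches the publisher's servers beyond an ordinary HTTP fetch, which is why publishers cannot simply block it.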
So why is this important? Publishers are about to lose one of their greatest sales tools: knowing more about their pages than buyers do. If a DSP or platform can easily identify the content URLs it wants to buy, what does it need a contextual PMP for? What does it need a publisher's sales team for?
Surely a buyer’s knowledge of publisher inventory is important and drives unsolicited revenue through open auctions and DSPs, right? Perhaps, but what if some machine miscategorizes publisher content as something of lower value or relevance? How would a publisher even know? Further, this new-again middle layer is re-fattening the ad tech tax: What is the price of contextual targeting, and how much of it are publishers being paid?
Most importantly, the opportunity for publishers to reclaim power in the buyer/seller relationship is at stake. Third-party data – and its brokers – ushered in tremendous spend from DSPs looking for audiences, but it also disintermediated publishers from the sales process. Brands and DSPs became incredibly good at buying the right person at the right time without publisher guidance or assistance, and a mastery of contextual targeting would give them access to the third pillar: “the right place.”
Publishers may not be able to stop the contextual mapping of their sites, but they can demand transparency from vendors, the ability to correct inconsistencies or oversights, and the right to employ the very same data for themselves.
Further, to defend against DSPs armed with these tools of contextual value extraction, publishers must consider returning to contextual yield strategies. This shift will include mapping yield against context, reworking pricing rules to match context and realigning sales teams and buyers around contextual dealing.
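Reworking pricing rules to match context might, in its simplest form, mean keying floor prices to a publisher's own URL-topic table instead of to audience segments. The sketch below assumes exactly that; the categories, CPM floors and URLs are illustrative numbers, not real rates.

```python
# Hypothetical CPM floors keyed by content category rather than audience.
CONTEXT_FLOORS = {
    "finance": 8.00,       # high-intent vertical commands a higher floor
    "home-garden": 4.50,
    "default": 2.00,       # fallback for unclassified pages
}

def floor_for(url_topics, url):
    """Look up a URL's category in the publisher's own URL-topic table
    and return the matching CPM floor in USD."""
    category = url_topics.get(url, "default")
    return CONTEXT_FLOORS.get(category, CONTEXT_FLOORS["default"])

table = {"https://example.com/mortgage-rates": "finance"}
floor_for(table, "https://example.com/mortgage-rates")   # → 8.00
floor_for(table, "https://example.com/unknown-page")     # → 2.00
```

The point of owning the table is that the publisher, not the DSP's scraper, decides which categories its pages fall into and what each is worth.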
Publishers must stay alert and cannot allow themselves to be separated, once again, from the value that their content creates.