Charting the Path to Direct Sold RTB Advertising

“Data-Driven Thinking” is a column written by members of the media community and containing fresh ideas on the digital revolution in media.

Today’s column is written by Tom Chavez, founder and CEO of Krux Digital.

When he was running the Right Media Exchange, Ramsey McGrory posited what I call the McGrory Conjecture during a panel discussion at an IAB conference in 2009. He suggested that publishers and marketers were using the Right Media exchange not just to buy and sell impressions, but also to gain access to data. The circumstantial evidence was pretty convincing: remnant channels were clearing publisher inventory at south of fifty cents, while average prices through Right Media were north of a dollar.

The only thing that explained buyers’ willingness to pay for media at those rates was access to data without having to take the impression. Buyers would get a seat on the exchange to ensure access to the bid flow, buying at much lower volumes and thus at correspondingly higher average CPMs, in exchange for access to ‘free’ data. For sellers putting their media on the exchange, this represented a type of data leakage I like to call ‘cookie licking’: bidders lick the cookie, then put it back in the jar. What we’ve seen during the last three years in our industry abundantly confirms Ramsey’s early speculation. McGrory’s Conjecture is now McGrory’s Law.

Fast-forward now to August 2012. As publishers and marketers search for ways to build more valuable, more distinctive experiences for consumers, they naturally take heightened interest in exchanges, such as those from Google’s DoubleClick and AppNexus, and RTB-enabled SSPs like Rubicon, Admeld (Google) and PubMatic. What’s different is that savvy publishers and marketers have implicitly absorbed McGrory’s lesson and are seeking to build programmatic revenue without sacrificing control of their most valuable asset – their data.

  • For the average publisher, the fundamental question becomes, “How do I earn revenue from programmatic media while ensuring strategic control over my data both tactically and in the long haul?”
  • And for the marketer, “How do I bring my own first-party data to these new channels without scattering it to the winds?”

Consider a 2×2 whose axes are sales channel (direct-sold vs. programmatic) and data relationship (first-party vs. third-party). First-party, direct-sold media has been the province of the ad server in digital media for the last decade. In the principal-agent relationship that underpins exchanges and SSPs, publishers essentially cede control of the transfer and monetization of their inventory to the exchange, which is why most of us think of third-party and programmatic revenue as logically equivalent. Private exchanges introduce a new variation on the same theme, what you might consider second-party, i.e., the exchange of media and maybe data among trusted, named partners.

Publishers are long on supply, short on revenue.  Marketers need ROI and scale.  Moving data from a first-party system (whether it’s a vendor or homegrown DMP) into an exchange or SSP, either on a public or private basis, is the most urgent imperative for both publishers and marketers today.

SSPs and exchanges are receptive. Savvy SSPs I’ve talked to understand that to improve their own yields and to make their marketplaces stickier and more valuable for buyers, they need to make their pipes smarter. The only way for them to do that is to make their bid protocols more expressive. URLs and placement information were a nice start, but they’re not enough. Segment data (e.g., interest, engagement, propensity to purchase, demographic) needs to be put to work to improve the effectiveness and value of programmatic inventory. Ideally it needs to happen via a mechanism that keeps the data safely in the hands of its first-party owners and available to designated partners only on terms, times, and conditions of their choosing.
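As a concrete illustration of what a more expressive bid protocol can look like, here is a minimal sketch of an OpenRTB-style bid request that carries first-party segment data alongside the usual URL and placement fields. The segment IDs, segment names, data-source name, and page URL below are all hypothetical, used only to show the shape of the payload.

```python
import json

# A minimal OpenRTB-style bid request (sketch, not a complete request).
# The "user.data" block is where first-party segment data can ride
# alongside the URL and placement information the seller already sends.
# All names and IDs here are hypothetical.
bid_request = {
    "id": "req-001",
    "imp": [{"id": "1", "banner": {"w": 300, "h": 250}}],
    "site": {"page": "https://example-publisher.com/article"},
    "user": {
        "data": [
            {
                "name": "publisher-first-party-dmp",  # hypothetical data source
                "segment": [
                    {"id": "seg-123", "name": "auto-intenders"},
                    {"id": "seg-456", "name": "high-engagement"},
                ],
            }
        ]
    },
}

# Serialize for the wire, then read it back as a bidder would.
payload = json.dumps(bid_request)
decoded = json.loads(payload)

# The buyer sees only the segments the seller chose to expose in the bid,
# rather than harvesting them from a cookie store.
segments = [s["name"] for d in decoded["user"]["data"] for s in d["segment"]]
print(segments)
```

The design point is that the seller decides, per request, which segments travel with the bid; nothing in the protocol requires exposing the underlying cookie or profile.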

Should the data flow directly into the exchange, like ad inventory?

I don’t think exchanges and SSPs will become the central mechanism for streaming publisher and marketer data for three reasons.

First, principals with data want control.  When Neil Ashe was at CBS and they were considering participation in various data exchanges, he told his team:

  • They’d better know the lifetime value of the cookies they make available to third parties; and
  • More importantly, they’d better extract that full value upfront.

Ashe understood that, once that data was put into any RTB channel, others could use it at will without giving CBS credit or compensation. Data’s marginal cost is zero. Once a third party gets access to it, it can be put to millions of potential uses without wear or tear, all behind the veil of third-party cookie stores.

If you’re a publisher or marketer, the leakage risks and the widespread cookie licking that accompany RTB make pumping your data directly into the exchange a perilous prospect.  Among the alternate models is one from Centro that delivers data safety to media owners by weaving data protection into the fabric of their exchange.  It’s certainly a better mousetrap, but most exchanges do not yet offer such services.

Second, third-party brokering doesn’t play well in the current regulatory regime. The drums continue to beat, most recently with the DMA’s petition before Congress not to constrain data brokering on the web. This most recent kerfuffle – the latest in a long, unrelenting series – indicates that, even if the principals were game, the evolving regulatory regime maintains a watchful eye over third-party data exchange. If you want to take a contrarian view and bet that the whole issue will simply go away – well, good luck with that.

Third, and most important, I think there is a path to filling that dormant cell of our 2×2 to create direct-sold RTB revenue for publishers and direct-sold RTB advertising for marketers, all of it energized by first-party data. Portability, conceived generally as the ability to move data safely across systems, sources, and devices, is a defining feature of a fully web-enabled DMP. With the portability principle firmly in mind, I think it’s time we start channeling the many neurons and dollars sloshing around our industry to develop systems that intelligently connect data between first-party owners and their partners in an SSP or exchange context. It feels like a technically solvable problem.

In physics, after Conjectures and Laws come Devices, and it appears that consumer web data follows that pattern. Big ups to McGrory for posing the right question and getting the conversation started. I look forward to working with like-minded partners to build the Devices that bring his early observations to fruition.

Follow Tom Chavez (@tommychavez), Krux Digital (@kruxdigital) and AdExchanger (@adexchanger) on Twitter.



  1. Tom has an amazing ability to brand dark things like ‘data blood diamonds’ and ‘cookie licking.’ What a funny visual. I agree with the general place Tom ends up here but how I get there is different.

    My point in 2009 wasn’t specific to Right Media but a more general one – in the absence of clear language and accepted practices regarding data usage by buyers, silence in the IAB T and Cs at the time created ambiguity that more aggressive buyers took advantage of, claiming rights of re-usage to data associated with an ad served. This may have led buyers to bid higher b/c their lifetime value calculation changed with the subsequent usage of data. Right Media average yield was higher b/c of good bidding algos (developed in ’04) and b/c of Yahoo’s quality inventory, not b/c of rampant cookie licking. Remember that RTB was not at real scale in ’09, so ‘cookie licking’ wasn’t in the picture.

    Yahoo and others worked diligently with the IAB and with each other to come up with acceptable terms that allowed buyers and sellers to negotiate anonymous data usage (business terms consistent with privacy) for ‘normal’ and RTB ads served. I spent a lot of time on this in ’09 and ’10 so I believe ‘cookie licking’ is the exception rather than the rule among large buyers and sellers.

Media exchanges and ad servers (not data exchanges) are in a position to ensure technically that this doesn’t happen. It’s not there yet, but it will be, I believe. Here’s why – data in all its forms is the lifeblood of advertising and publishing, and this data must be available at the point of the ad server or content management system to inform what content will promote the most engagement with a consumer.

    What Tom calls devices, I call infrastructure, and the faster the ad server and a CMS are connected to the core data infrastructure that holds both 1st party data and access to permissible 3rd party data, the better we will make consumer experiences and the better marketers we will be. If the infrastructure is not in place to activate and protect this data, everyone suffers, so it’s going to happen.

    If there’s a law from what I said, it’s that innovation advances faster than the rules that govern it…in advertising, music, financial derivatives, war, everything.

2. Thanks, Ramsey. I certainly agree; while we might have taken different paths, we end up in the same place.

You’re right: innovation frequently advances faster than our ability to govern or even make sense of its consequences. The tricks and methods people are using to gain value from ad exchanges, either contractually or in the shadows, were quite likely not even anticipated when you and others at Right Media first willed them into existence.

    From what we can see, data collection on exchanges is more the rule than the exception.

Markets are efficient, which is why this makes sense. The fact that data collection has become so widespread is evidence of its inherent value, but, as you observe, it is also one of the hazards of the web’s openness and speed of innovation.

3. Thanks Tom. This is interesting tracking, but I’m not sure this proves that data collection is happening. It shows piggybacking is happening, but assuming illicit data collection is a sizable leap. The piggybacked pixel would have to know the criteria that targeted the user to be able to ‘lick’ anything. And it would really have to be JavaScript. Even if it’s JS, publishers are often double-iframing the ads to protect themselves.

    A lot of this could simply be explained as cookie syncing where one party is working with another party legitimately, and to target, track, report correctly, the cookie IDs have to be sync’d. This lack of proper syncing is a fairly big problem in the industry.

    I don’t have a dog (or exchange) in the hunt here anymore, but I’m not convinced this is a 3 alarm fire in a movie theater. I think it’s more a dude standing five feet outside the theater smoking a cigarette where the smoke is wafting in.

  4. I would question Ramsey McGrory’s suggestion that he no longer has a “dog in the hunt”, having read the AddThis (of which Ramsey is the CEO) Terms of Service. The following section makes interesting reading…

    “…you agree that we may collect data related to an End User’s use of the Services including an End User’s sharing of Publisher Content”

    See full copy here: