Why Prices Of Real-Time Bids Are Overinflated

“Data Driven Thinking” is a column written by members of the media community and contains fresh ideas on the digital revolution in media.

Today’s column is written by Rob Leathern, CEO of XA.net, an online advertising company.

It’s a convenient fiction: the CPMs garnered from real-time bidding (RTB) are high and going higher right now, and enthusiasts take this to mean that RTB is working and driving CPMs up for publishers. After a fair bit of personal experience with RTB bidding and buying, and the requisite analysis, I can confidently say that this view of the world is still fanciful. Certainly, higher CPMs are happening right now, with prices for RTB impressions exceeding those for similar non-RTB inventory, but the reason for this cuts to the heart of the fundamental problem with display inventory: advertising space is not the simple commodity we may think it is, where every impression from a given site (or even a given placement) is worth the same.

Let’s set aside all the oft-mentioned problems about cost, scalability, number of possible bidders in the marketplace and so on (statements by some market participants about these issues show a fundamental misunderstanding about how this stuff works – I’ll come back to these issues another time) and just look at how things are running right now. At Leadscon NYC in July 2010, Michael Barrett (CEO of Admeld) said that “all of [their] publishers have price floors in place” and that in no case is a publisher “making all of their inventory available” to real-time bidding. He also mentioned that their publishers don’t want more fill from ad networks and DSPs at low prices.

Here are the factors in turn that are leading to the over-inflated RTB prices:

  1. Publishers are still wary of RTB and have not given themselves over to it entirely.
  2. Publishers are therefore setting price floors in RTB systems and allocating only a portion of their inventory to them, mainly because that inventory still lives inside their own publisher ad server and coexists with other fixed-price or guaranteed inventory deals.
  3. RTB bidders are seeing an incomplete set of inventory and making decisions about what to bid on and when, sometimes based on unique user IDs used to gauge user frequency. But bidders don’t have the full picture of a user’s activity on a particular website; they have an incomplete picture of users across websites, and the lowest-hanging-fruit data is session depth within a single website.
  4. Because of 3, and because bidders don’t see all the inventory on a site, most RTB systems can’t tell the bidder which impression of the user’s session it is seeing.
  5. Because bidders don’t know which session-impression of a user they are seeing, the first “bidded” impression a bidder sees of a given user on a given site is indeterminate with respect to site session depth. (That first bidded impression may be the first, the tenth or the fiftieth impression in the queue. This matters. A lot.)
  6. The only way to measure and assess this issue (and hence fairly value similar-seeming traffic) is for both of the following to occur: (a) the RTB system is aware of every single impression shown to a user on the site, and (b) the RTB system passes along with each bid request a “local” depth count telling the bidder how many impressions that user has seen on that site so far. Neither is particularly difficult, but neither is trivial either, especially when it requires “seeing” all impressions for a publisher and coexisting with older publisher ad systems.
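To make point 5 concrete, here is a minimal simulation of the information asymmetry. All the numbers (the base CPM, the decay rate, the 30% exposure rate) are made up for illustration; the point is only that a bidder seeing a random slice of a session, and valuing every impression as if it were fresh, systematically overpays:

```python
import random

random.seed(0)

def impression_value(depth, base_cpm=2.0, decay=0.85):
    """Assumed model: an impression's value decays with session depth."""
    return base_cpm * (decay ** (depth - 1))

# The publisher exposes only a random subset of impressions to RTB,
# so the bidder cannot infer session depth from what it observes.
session_depths = range(1, 51)          # a 50-impression user session
exposed = [d for d in session_depths if random.random() < 0.3]

true_values = [impression_value(d) for d in exposed]

# A depth-blind bidder treats every exposed impression as "fresh"
# and bids the value of a first impression.
blind_bid = impression_value(1)

overpayment = sum(blind_bid - v for v in true_values)
avg_true = sum(true_values) / len(true_values)

print(f"exposed impressions: {len(exposed)}")
print(f"avg true value: ${avg_true:.2f} CPM")
print(f"flat blind bid: ${blind_bid:.2f} CPM")
print(f"total overpayment: ${overpayment:.2f}")
```

With the depth count from point 6(b), the bidder could call `impression_value(depth)` per request instead of bidding flat, and the overpayment disappears.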

The Doughnut Shop Analogy

It’s like going to a doughnut shop and seeing a bunch of doughnuts. There are fresh ones just out of the oven, day-old ones, and everything in between. But if they are not arranged so as to make it obvious, we can’t really tell what the quality is until we buy one and bite into it. The shopkeeper can make up a box of fresh ones for us to buy, but we don’t quite know at that point whether they really are fresh. Impressions for a given site are just like these doughnuts, except that I’m bidding for the doughnuts without knowing whether I’m bidding on a two-minute-old one or a two-day-old one, and with no way to tell, even if I buy them one by one, how old the next one I buy will be.

If someone sticks around the shop and sees the shopkeeper take the doughnuts out of the oven and place them on the rack, they would have a way to predict which are the newer ones, but they have to watch like a hawk (analogous to having to see/be aware of all ad inventory).

Perhaps we doughnut-buyers could get together and either make the shopkeeper tell us how old each one is, or make sure someone sticks around to watch the doughnut-making process constantly! Right now there is a novelty effect around these doughnuts, which unfortunately means a lot of us are perfectly fine eating stale ones and not knowing any better.

Follow Rob Leathern (@robleathern), XA.net (@xa_net) and AdExchanger.com (@adexchanger) on Twitter.



  1. Mr. Anderson

    Rob, good stuff. Nothing to detract from it, the usual suspects will do that later today.

    All of these DSPs that are all-in on RTB are going to have to reach into their deep pockets and plead with these pubs to get ALL of their traffic RTB-enabled so we can really see what happens. Yes, I know there is a lot of work involved; just do it. The demand-side dollars I control won’t go after RTB until it is done.

    Until then, rock on in the API development world, Yahoo and Google are globally liquid, plenty of good fruit is being picked and that will continue.

  2. togilvie

    Rob –
    Great insights. One further piece of evidence is the pricing impact of DSP buying recently reported at both RMO and Leadscon. They both indicated a 3X pricing increase on the _same inventory_ that was previously sold to ad networks.

    This doesn’t include the cost of the data. If the data and the media cost the same, there’s a 6X premium being paid to reach that audience. Plus the fee to the DSP, plus the fee to the agency.

  3. Good post Rob. I’ve been hearing more about RTB inventory quality, and this piece helps me understand the issue better. thx

  4. Noel McMichael

    Thanks, Rob. Very helpful. Sounds like additional evidence that publishers should bring more data to the table for their own good. Whether it be audience data or session data, they need to help us to help them. How else are we going to get better prices for those doughnuts?

    Also, I’d like to add another bullet to your list for reasons of inflated pricing:

    7. Hyped-up advertisers. I know, hard to believe, but I personally know advertisers participating in RTB trials via another well-known DSP, and they are doing most of their buying without any assistance from the DSP’s expert staff. In other words, they are out there bidding excitedly without much control (certainly not enough controls to remove bias in the trial), so the early reports that RTB is showing success because of much higher CPMs clearly aren’t telling the whole story.

    At RMO, it was said that higher CPMs were a result of better audience targeting. That doesn’t add up to me. I’ve been targeting pixel-ed audiences for many years on RMX, and I wouldn’t expect a different lift on these audiences just because I was buying them in real time. I would give more credit to frequency controls than to audience targeting, especially if I am going beyond 1×24. My two cents.

  5. Varoujan Bedirian

    Wonderful insights, Rob. And great analogy.

    Some publishers will have certain pages they don’t put into RTB, as those are sold differently. E.g. site and channel homepages are more easily packaged, sold and bought as roadblocks. A bidder will not see those impressions. Also, publishers will probably have another set of impressions that they’d rather not put into RTB because those would expose the depth of a user’s session on the site, say the 100th daily impression (the deeper, the less valuable the impression). Publishers are better served chunking up these impressions and selling them in bulk to, say, 20 non-RTB third parties, each of whom sees 5 impressions/user; or, if the publishers did put them out to RTB, rotating the bidders so that no one bidder finds out the true session depth.

    So, outside of these impressions (roadblocks and session-depth-obfuscated inventory), a critical factor in a bidder getting an understanding of depth is whether, after choosing to serve a direct-sold guaranteed ad, the publisher’s ad server still takes the impression into RTB (with the guaranteed price as the floor). If it doesn’t, visibility is lost. Assuming that’s the case, another thing that skews session-depth visibility is the mechanism by which the publisher’s ad server optimizes direct-sold guaranteed ads. It might a) spread the guaranteed ads across the hours of the day based on simple pacing, or b) allocate them so as to maximize gross margin. Since a bidder doesn’t know the details of this mechanism, it won’t have a full, true picture of session depth.

    And let’s not forget that all else being equal, an impression’s worth is also predicated on total user session depth across multiple sites, and not just one publisher. That’s why this becomes a tough nut to crack.

    What the bidder will always have, though, is the session depth for the requests it has access to. So the technical problem becomes how to get more request access, while the business risk remains that unless one owns and operates the primary ad server utilized by a publisher (e.g. DFP), one can be cut off from access at any time.
    Either way, we might be heading into a future of inventory consolidation onto owned primary ad servers to mitigate the risk of losing inventory, which would also address the issue of session-depth visibility.

    Basically, since one can’t know whether the doughnuts are fresh or stale just standing at the counter (bidder), one still can build and franchise doughnut stores complete with ovens and trays so as to have visibility into their processes (primary ad server) and know which doughnuts are fresh.

  6. Noel – to your point (from RMO), I am very skeptical of broad statements like “the CPMs coming from DSPs were 3X the CPMs coming through ad networks,” because it’s not clear how much data this covered, who the participants were (some of the announced pilot participants may not have been running yet, for example), over what time period the campaigns ran, what the comparison period was for the “ad network” data, whether it was normalized by site, etc. But I’m also sure that this is part of the point: the very newness of this medium, and the hacky way in which RTB is bolted onto more traditional tag-based serving, is going to lead to these kinds of higher-priced outcomes that may not accord with actual value.

    On the other hand perhaps it should make publishers happy to know that buyers are sometimes going to overpay for the same stuff while they’re still figuring out what they’re getting 🙂

  7. Eddie L

    Thanks Rob, interesting piece. Two comments:
    1) What makes you think that session depth is such an important data point in deciding whether to serve an impression? Of all the impression-level data points available, I find it hard to believe that this one is really all that important relative to what else is available about an individual user and her online behavior.

    2) There is one DSP that is uniquely positioned to see every single impression – InviteMedia + Google + DoubleClick (DFA/DFP) is going to understand every impression purchased and served in/outside of the exchanges. Google, DFA, DFP, and InviteMedia will all share the same cookiespace and see everything.

    Guess that’s what happens when the shopkeeper gets acquired by the company that also owns all the ovens, sugar, baking trays, storefronts, and coffee. Doesn’t bode well for a vibrant, transparent doughnut ecosystem…

    • Thanks Eddie – good questions to ask.

      1) Frequency and session depth have been proven in the market (in our experience buying billions of media impressions via exchanges, site-direct, and both at the same time for some pubs) to provide differential performance. Think about it this way: say a site indexes high for widget-lovers. If advertiser A has a product appealing to widget-lovers and shows their ad first, they are more likely to get a response. Advertiser B may have a similar product and will have a lower response rate by being second in the queue. The real question is what price uptick is needed to jump to the next position; just like in search, sometimes I’d rather be fifth in the queue if the CTR delta < the price delta.
      2) Throw Google Analytics in there too 🙂 but potentially, if you have code on every page of a publisher, ad tag or otherwise, you could pull this off. They are clearly best positioned to do that, and the question is whether anyone else in the market will have anything to say about it. The challenge has been thrown down, and so far I don't see anyone else stepping up to counter it.
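The queue-position tradeoff in Rob's reply can be sketched with hypothetical CTRs and clearing prices (none of these figures come from the article): a later position wins whenever the drop in CTR costs less than the drop in price saves.

```python
# Hypothetical CTRs and CPM clearing prices by position in the
# user's impression queue for some site/audience combination.
positions = {
    1: {"ctr": 0.0020, "cpm": 4.00},   # first impression: fresh user, priciest
    5: {"ctr": 0.0015, "cpm": 1.50},   # fifth impression: lower CTR, much cheaper
}

VALUE_PER_CLICK = 2.50  # assumed advertiser value of one click, in dollars

def profit_per_1000(pos):
    """Expected profit per 1000 impressions bought at a given position."""
    p = positions[pos]
    revenue = p["ctr"] * 1000 * VALUE_PER_CLICK  # expected clicks x value
    return revenue - p["cpm"]                    # minus media cost (CPM)

for pos in sorted(positions):
    print(f"position {pos}: ${profit_per_1000(pos):.2f} per 1000 impressions")
# With these numbers, position 5 is more profitable: the CTR delta
# (0.0005) is worth $1.25 per 1000, but the price delta is $2.50.
```

Of course, without the session-depth signal the article argues for, a bidder can't tell which position it is actually buying, which is the whole problem.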

  8. DSP Jockey

    But couldn’t a stale doughnut become much more enticing for that same hungry shopper if the icing (data) is plentiful and fresh?

  9. @DSPJockey: I think a lot of people are slathering on all-too-sweet data on top of stale doughnuts (crappy inventory) and coming up with something that is okay for now but is going to cause indigestion for many, and has probably led a few companies to raise some millions of dollars 12-24 months too early. Or maybe not — sugar is powerful!

  10. mike pubrealist

    2 points.
    1. If you want the freshest food, go to the source and pay the premium for the first round. Quality ingredients and people who care. I hardly think that if you went direct, publishers wouldn’t want to hear from you. What makes you think you have a right to demand premium but pay for discount? That extra cost is what you pay for peace of mind.
    Another view on that is that you might not even pay such a premium, since the publisher is getting paid based on reporting from that same baker. The baker comes back day after day with the same story that the public isn’t really paying much for your food, so here’s your tiny share of it… but there’s no audit or accounting or third party. Just these coins.
    Everyone would be best served with full transparency from end to end if that’s the direction. Too many deals go through multiple black boxes of networks, split and horsetraded among the data insiders such that it’s impossible to track back or view through the money and in the end 3/4 or more of the spend ostensibly to gain audience goes to folks adding zero value. You’d get more if more made it to the supplier of what you’re buying. This is not just a publisher, but also an advertiser issue…not to mention an issue of cash not driving enough production in the economy. That spend could be driving more advertising, and get the economy going again. The skim is keeping us all down.
    When does the internet get rid of the middlemen and make the markets efficient?

    Again, with all those stacked fees, going direct might take a little more in-house cost to manage the bundling, or look for publishers that bundle themselves and stay transparent. If the multiples you mention are really there, I’ll bet it’s not a greedy publisher sucking it up, so you might find a partner depending on the arena you’re buying in.
    Another bonus is you get clean data and know where you stand, and a partner you know and that has the ability to provide special service, non-standard inventory and things tailor-made for your client. The networks don’t want truly valuable inventory they can’t hide, horsetrade and commoditize. I bet your client does. A publisher can do special things for you with access to all her own inventory as well as your true objectives and a real idea of the client and the client’s goals.
    And for the real bonus, you’ll be ahead of the social game. As folks rightfully and finally question the silver-bullet BS of retargeting, lookalikes and extended audience networks, it’s becoming clear that while an audience can certainly be extended, it is nowhere near the same to hit a CEO spending time on Forbes as to retarget her when she’s ordering pizza. The social relationships translate to ones between a publisher and a user. And hopefully publishers are seeing how they can deepen that trust so that it’s not easily mistaken for a random skyscraper on the landscape.
    Seems like we’d also learn a lot if anyone is testing the difference in response between junk ads served on a primary site and junk ads served on a junk extended network site to a prime candidate. I bet that falls off big time.
    Overall, this is a great discussion. I encourage everyone to think of the party at the other end as perhaps having more shared interest than they think.