Beware Of The Risks Of Using A Single Demand-Side Platform (Part 1)

“Data Driven Thinking” is a column written by members of the media community and containing fresh ideas on the digital revolution in media.

Today’s column is written by Pascal Bensoussan, VP of Products at Aggregate Knowledge, a buy-side optimization platform.

This is Part 1 of a 2-part series. In Part 1, I will make the case for why agencies and large advertisers should spend more time thinking of deploying a media buying architecture that can accommodate multiple Demand-Side Platforms (DSPs). In Part 2 (my next post), I will provide specific guidelines on how they can do it.

Today, there are about 8 billion impressions available daily on RTB-enabled ad exchanges (I am excluding ~5 billion impressions from Right Media since it is not yet RTB-enabled), and this number is expected to double or triple in the next 24 months. At that point, that means roughly 300,000 biddable impressions every second, with peaks of 3x-7x, on the order of 1 to 2 million impressions per second. That’s a lot of biddable inventory for a real-time bidder to scan through to cherry-pick impressions!
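As a back-of-the-envelope check of the figures above (the 3x growth multiplier is the projection from the paragraph, not a measured number):

```python
SECONDS_PER_DAY = 24 * 60 * 60  # 86,400

daily_impressions = 8e9  # RTB-enabled impressions available daily today
growth = 3               # projected 2x-3x growth over 24 months (upper bound)

avg_per_sec_today = daily_impressions / SECONDS_PER_DAY
avg_per_sec_future = avg_per_sec_today * growth

print(f"today:  ~{avg_per_sec_today:,.0f} impressions/sec")
print(f"future: ~{avg_per_sec_future:,.0f} impressions/sec")  # ~300K/sec
# peaks at 3x-7x of the average land in the 1-2 million/sec range
print(f"peaks:  ~{avg_per_sec_future * 3:,.0f} to ~{avg_per_sec_future * 7:,.0f}")
```

The ~300,000/sec figure only holds after the projected tripling; today’s 8 billion daily impressions average closer to 90,000/sec.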

Here is the catch. Most DSPs operate in the 6,000 to 12,000 bid requests per second range, a fraction of the inventory available. This number will certainly grow, but not as fast as the inventory available on the exchanges.

Don’t expect full visibility into all biddable impressions from one single buying platform anytime soon

I don’t expect DSPs to achieve much greater scale in the near future for multiple reasons:

  1. Early adopters of real-time bidders want cheaper access to larger pools of targeted inventory. As such, they use DSPs as retargeting platforms, *not* as trading platforms. An illustrative example can help make the point. Take a campaign delivering 100 million impressions to retargeted cookies over 4 weeks. That means buying an average of 40 impressions per second, far from the hundreds of thousands available.
  2. RTB-enabled ad exchanges and DSPs are just getting started. Their technology is relatively young and they are still toying with very exploratory bidding and optimization algorithms (if any). None of them have the large-scale experience and the analytical infrastructure to draw reliable and statistically significant lines separating the good inventory from the bad inventory by campaign, by audience segment, and by publisher.
  3. Retargeted inventory is not differentiated beyond standard dimensions such as time of day, day of week, ISP, or connection speed, which means that the grid used to scan inventory is still too coarse to warrant accessing larger pools of seemingly equivalent inventory.
  4. The volume of biddable inventory will keep growing faster than the technology capable of scanning all of it. Although raw talent is absolutely necessary, it takes a lot of resources to build a scalable infrastructure capable of processing up to a million impressions per second. So far, DSPs have not had the time or the money for this.
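The retargeting example in point 1 is simple arithmetic, using the campaign figures from that example:

```python
impressions = 100_000_000       # campaign delivery goal
weeks = 4                       # flight length
seconds = weeks * 7 * 24 * 60 * 60  # 2,419,200 seconds

rate = impressions / seconds
print(f"~{rate:.0f} impressions/sec")  # ~41/sec, vs. hundreds of thousands biddable
```

A retargeting buyer consuming ~40 impressions per second simply has no need to scan the full firehose, which is why early DSP usage has not pushed bid-request capacity upward.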

Be prepared to test and use multiple demand-side buying platforms in parallel

For agencies, betting on one DSP so early in the game involves significant risk, particularly:

  1. Coverage risk: DSPs will only access 5 to 15 percent of all biddable impressions available on ad exchanges every second.
  2. Platform/technology risk: There is no clear winner, with platforms still evolving significantly, and new DSPs entering the market monthly.
  3. Competitive risk: Hedge your bets against DSPs being acquired by a direct competitor or by your “frenemy,” or against DSPs starting to compete directly with you. In his latest post, Rise of the Demand-Side Service Layer, Michael Walrath writes that even though their relationship is very symbiotic today, DSPs may come to look more like today’s digital agencies, while digital agencies will act more like DSPs.
  4. CPM risk: The more clients DSPs have, the higher they will pay to win exchange impressions. Google must love that! This is a direct result of having more demand compete for the same thin slice of inventory with an optimization grid too coarse to differentiate any one bidder. Demand-Side Platforms are not set up to protect individual clients by scanning separate pools of bid requests (each at a rate of 6K to 12K per second) for the exclusive benefit of each client individually. Instead, they scan a single pool of bid requests and decide which agency, advertiser, or campaign competing for the same impression wins the bid, based on bid parameters, pacing constraints, and inventory quality. Is this fundamentally different from how an ad network arbitrates media across competing sources of demand?
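The CPM-risk dynamic can be illustrated with a toy second-price auction simulation (all numbers below are illustrative assumptions, not figures from this post): as more of a DSP’s clients chase the same thin slice of inventory, the second-highest bid, and therefore the clearing price, rises.

```python
import random

def clearing_price(n_bidders, rng):
    """Second-price auction: the winner pays the second-highest bid."""
    bids = sorted(rng.uniform(0.5, 5.0) for _ in range(n_bidders))
    return bids[-2]  # second-highest bid sets the price

rng = random.Random(42)
for n in (2, 5, 20):
    avg = sum(clearing_price(n, rng) for _ in range(10_000)) / 10_000
    print(f"{n:>2} bidders competing -> avg clearing CPM ~${avg:.2f}")
```

With uniformly distributed bids, the average clearing price climbs steadily with the number of competing bidders, which is exactly the effect of a coarse optimization grid funneling many clients onto the same impressions.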

To increase inventory coverage, minimize platform/technology risk, and better position themselves for future arbitrage opportunities, agencies should be prepared to deploy a media buying architecture that is able to support multiple concurrent DSPs with low switching costs.

Understand what it takes to deploy a multi-DSP business and system architecture

Implementing a business/system architecture that supports multiple DSPs is not easy. There are numerous challenges to overcome. Here is a list of six critical goals that you should set for yourself as you embark on this journey:

  1. Think plug-in from the get-go: Define a plug-in architecture allowing low switching costs across data providers, media channels (including direct publishers, ad networks, and DSPs) and dynamic creative vendors without silo-ing the “who” (your audience), the “where” (your media), and the “what” (your creative).
  2. Centralize audience management: Have a coherent data strategy across all data providers and centralized audience management across all media channels (including all DSPs).
  3. Remove friction across multiple media channels: Deploy your DSP partners in a way that minimizes duplicate work, allows for global frequency capping, and avoids overpaying for exchange inventory.
  4. Track every penny, whether spent on data or media: Implement a robust accounting system to track meticulously and allocate all your media buying and data buying costs whether you make bulk buys across advertisers or campaign-specific buys.
  5. Unify reporting: Integrate your own ad server data (the only one that really counts) with detailed user profiles and a complete picture of all your data and media buying costs.
  6. Close the loop: Define your own attribution models and the key metrics allowing audience ROI analysis, campaign delivery tracking, and apples-to-apples comparisons across all your data providers and media channels.
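Global frequency capping (goal 3) is the most concrete of these requirements: every DSP’s buying decisions must consult one shared counter per user and campaign rather than each platform capping independently. A minimal sketch, assuming a hypothetical in-memory store and illustrative names (no vendor API is implied):

```python
from collections import defaultdict

class GlobalFrequencyCap:
    """Toy cross-DSP frequency cap: all DSPs consult one shared counter
    per (user, campaign), so the cap holds globally, not per platform."""

    def __init__(self, max_impressions):
        self.max_impressions = max_impressions
        self._counts = defaultdict(int)  # (user_id, campaign_id) -> impressions

    def try_reserve(self, user_id, campaign_id):
        """Return True and count the impression if the user is under cap."""
        key = (user_id, campaign_id)
        if self._counts[key] >= self.max_impressions:
            return False
        self._counts[key] += 1
        return True

cap = GlobalFrequencyCap(max_impressions=3)
# Two different DSPs bidding on the same user would share the same cap:
decisions = [cap.try_reserve("cookie-123", "campA") for _ in range(5)]
print(decisions)  # [True, True, True, False, False]
```

In practice this shared state would live in a low-latency store consulted at bid time; the point is architectural: the counter sits above the DSPs, not inside any one of them.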

Look for specific guidelines on how to address those challenges in my next post. In the meantime, I would love to hear your thoughts on this topic.



  1. Interesting post Pascal. We’ve heard talk of the exchange of exchanges model, and of the metanetwork, now we’ve got the DSPs’ DSP. If this is an issue I’d like to think that the well funded DSPs would be raising the additional funds necessary to build the required scale to handle the millions of ad calls and apply algorithmic optimisation across multiple data points. They will be rewarded with additional business for doing so.

    As a business firmly sitting in Walrath’s service layer we do aim to keep our back end systems ‘DSP agnostic’ and we also continue to run our Right Media seat which will run through our DSP so we are not totally dependent on our partner.

    I don’t like to think of integrating multiple DSPs though. We’d be so far into managing the complexity of the set up that our ability to service our agency and direct advertiser clients would be affected.

    As I see it the service layer is all about using existing technology effectively and being able to change easily to a new provider if one draws ahead on the product development front. Yes, we need to keep risk to a minimum but building another adserver and reporting system on top of multiple DSPs seems a step too far.

    Look forward to part 2.

    • This is getting ridiculous. Daisy chaining DSPs and creating more layers between the buy and sell side? WTF? Does anybody else think that the space is getting too convoluted?

      • Ciaran – You’re absolutely right. Daisy chaining DSPs is a bad idea. Ad ops complexity, ad serving latency and reporting discrepancy would be out of control. Note that my post does not say or recommend daisy chaining DSPs.

  2. Curious

    Just a simple dumb question here Pascal – if you use 2 DSP’s, aren’t you actually setting yourself up to bid against yourself in any auction where those 2 DSP’s share the same data point?

    Seems like a poor decision to me…

  3. Pascal, your numbers are about 6 months out of date. The QPS range of a good DSP is now 150-300,000. The whole point of using a DSP is to have a single platform for buying media to manage global frequency, unified reporting, and a single data integration point. Using multiple DSPs removes all the value. You might as well just use a flight of ad networks.

  4. Hilarious

    This column has turned into a bogus soapbox for everyone in the display 2.0/3.0 world. These articles have lost their credibility, as has this website.

    I do thank Adexchanger for all of the great content, but these recent articles are a complete joke; a higher editorial standard needs to be set so we can actually trust what we read.

    After talking with many DSPs and hearing all of the vaporware claims and BS that was spouted on the phone, it now seems to have made its way onto this website. Once you spend a dime in this space, you realize what’s true and what’s not and most of what’s written and said during pitch calls is complete BS.

    Seriously, guys, please stop this crap and provide good content. It’s no secret that what you are doing is just shilling for your own company — that’s ok, that’s what content PR is about, but at least you could provide some more value along with it.

    Having demo’d all of the top DSPs’ platforms, I suggest you guys focus more on improving your own product vs. spouting all of this nonsense — the industry is young, yes, but you guys have A LOT more work to do for it to become actually usable with ease. The ball is in your court to clean up this mess and get your editorial path back on track. We all will benefit from it, as will you.

  5. John Were – You make a good point, but I don’t think there will be a dominating DSP capable of accessing and servicing *all* media available anytime soon. Existing DSPs will differentiate, verticalize their offering, become the best in one particular market. As such, DSPs will become new media channels for agencies. I also agree that you don’t want another ad server inserted in the ad flow. As for reporting, until DSPs can implement industry-standard click filtration and robust cross-channel attribution, a more integrated reporting system is fully justified.

    Curious – Your example is right on. Giving yourself the flexibility to use multiple DSPs comes with a few challenges. My next post will offer a few ideas for how to address them.

    Hilarious – What you are saying is exactly my point. This industry is very young and there is way too much noise and inaccurate statements to invest in one DSP too early. We are in the middle of a transitional period, where early adopters have tried one DSP on a few campaigns. Now what? Two options: (1) Sign a master agreement with that DSP because you have proven you can buy from multiple exchanges or (2) continue to work with multiple DSPs to get smarter about what it takes to be successful in the long term. My next post will offer ideas to help you decide.

  6. Christian M

    In all fairness, no one is doing 150-300,000 QPS. If they are, then they are taking way too few variables into their decisioning. Heard good things about Aggregate Knowledge, but this is clearly content PR. A true DSP wouldn’t ever make the case for using more than 1 DSP unless they are getting cut out of conversations as being the main provider.

  7. ecosystemadvocate

    This article is so well written, Pascal. Working with one DSP, which is probably the best thing that can happen to agencies, will put middlemen that position themselves as DSP aggregators out of business. I agree, it is in the best interest of the ecosystem to have more mouths to feed. Having DSP aggregators in the mix and giving them a % of the spend will employ more people in this industry.

  8. Competition

    Hi there:

    “3. Remove frictions across multiple media channels: Deploy your DSPs partners in a way that minimizes duplicate work, allows for global frequency capping, and avoids overpaying for exchange inventory.”

    Sounds like a great idea. I have no idea how you’d get competitors to all agree to avoid the same best performing inventory though…

    Also, many DSPs provide the service you describe in 4, 5, and 6 of the last segment. Sounds like you’re suggesting agencies build a DSP to account for their sub-DSP ad delivery… when does the nesting doll stop?