The Algorithm Is the New Decision Maker: Communicating with the New Demand Side

“Ad Agents” is a column written by members of the agency side of the digital media community.

Greg Hills is platform manager at Varick Media Management, a media trading agency.

My work focuses on the economics of advertising, but recently I’ve been thinking about the political economy of advertising. After all, advertising dollars don’t have a mind of their own. They need industry professionals to push them around from one company to another. Trusted personal relationships have historically been the conduits through which ad dollars flow.

This relationship-driven world of advertising is now being replaced by the data-driven world of advertising. RTB, DSP, SSP… these acronyms have sales leaders thinking. How do you staff up for this alphabet soup of new business models? Whom do you call on, and what do you tell them? What does the advertiser need, and how do you win their business as a publisher?

When I was a media planner, I had the answer to the last question. I could tell you what the advertiser needed and I would decide whether or not you met that need. This is the type of arbitrary power that brings a recent college graduate towering seafood platters, custom sneakers, and a taste for fine scotch.

Mad Men vs. Algos

Working on the new demand side, I still meet with major publishers and tell them what my clients need. The needs themselves don’t change that much, in fact. Brands still want effectively priced advertising units in safe environments that will get consumers to engage with and buy their products. The key shift is that I no longer decide whether any specific publisher or web page fits that need. The algorithm is the new decision maker.

The algorithm is a better decision maker. It bids rationally by fully incorporating learnings from past performance. It can value tens of thousands of individual impressions per second based on multiple data points.
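To make that concrete, here is a minimal sketch of what "valuing an impression from multiple data points" looks like in code. Everything here is illustrative: the feature names, weights, and CPM figures are invented for the example, not Varick's actual model.

```python
# Minimal sketch of per-impression valuation for algorithmic bidding.
# The weights stand in for "learnings from past performance"; a real
# bidder would fit them from click and conversion logs.

def value_impression(features, weights, base_cpm=0.50):
    """Predict an impression's value (CPM) from its data points."""
    score = sum(weights.get(name, 0.0) * val for name, val in features.items())
    return max(0.0, base_cpm * (1.0 + score))

def bid(features, weights, max_cpm=4.00):
    """Bid the predicted value, capped at the campaign's max CPM."""
    return min(value_impression(features, weights), max_cpm)

# Hypothetical learned weights: above-the-fold placement, the domain's
# historical click-through rate, and audience-segment membership.
weights = {"above_fold": 0.4, "domain_past_ctr": 2.0, "in_target_segment": 1.5}

good = {"above_fold": 1, "domain_past_ctr": 0.8, "in_target_segment": 1}
poor = {"above_fold": 0, "domain_past_ctr": 0.1, "in_target_segment": 0}

print(bid(good, weights))  # well above the base CPM
print(bid(poor, weights))  # near the base CPM
```

The point of the sketch is the scale, not the arithmetic: a human planner can apply this kind of judgment a few dozen times a day; the algorithm applies it tens of thousands of times a second.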

Algorithms also remove the physical constraints that limit agencies to a small list of publisher partners. An actual media planner using phone, fax, and email can evaluate proposals from a couple dozen properties at most. From a time-management standpoint, it is inefficient to actually go through with buying and optimizing more than a dozen sites or networks. But with algorithmic buying, the agency can buy and optimize in real time across thousands of sites.

Advertising is the art of persuasion, but personal persuasion has now been taken out of the media buying process. Is it fair to make publishers compete on raw performance and not give them any appeals process when they lose? I would argue yes. It's certainly a better deal for the brand, whose marketing dollar is now working harder. I'd also argue that it makes things more fair for publishers, since they are no longer competing on the basis of access to decision makers. The algorithm is the decision maker; it will evaluate all publishers in the secondary channel, and it's unbiased.

The traditional sales conversation — scheduling a conversation, determining if the product is a fit, then negotiating price — still happens but it occurs between agencies and technology vendors, not between agencies and publishers.

Now, the agency-publisher conversation is less of a sales conversation and more of a collaborative problem-solving conversation. Both agency and publisher are solving for the same thing: getting as much inventory as possible in front of the true decision maker, the buy side's bidding algorithm. Below is the complex equation – the now-infamous ecosystem slide:

[Ecosystem slide: GCA Savvian]

Ideally, this chart would be much simpler. There would be one big pool of inventory that everyone plugged into, and bidding optimization would be entirely automated. Unfortunately, this is not the case. Between the brand and the publisher, there are many different TradingDesk+DSP+Exchange+PubOptimizer+Publisher permutations, some of which may leave buy-side and sell-side actors disconnected. Coarse-grained optimization, like eliminating entire contextual channels or entire exchanges, also removes individual publishers from consideration. So the conversation becomes about managing the supply chain to minimize these disconnects.

To use the old media-buying paradigm as a metaphor, it's as though agencies and publishers are administrative assistants, working together on logistics so that the publisher's inventory can get in front of the ultimate decision maker, the algorithm. That doesn't sound glamorous, but getting the supply chain right offers much greater rewards than even the biggest direct deal.

Follow Greg Hills (@gregoryhills) and (@adexchanger) on Twitter.



  1. Matt Barash

    Long before Al Gore invented the internet, Leonardo da Vinci made a prophetic statement: “Simplicity is the ultimate sophistication.” And while simplicity might be the ultimate sophistication, data and technology afford the ultimate leverage in a very complicated market. We all spend a lot of time getting to know one another in a dynamic media playground, and the digital sandbox is a crowded place. Most of you reading this would agree that John Wanamaker’s statement, “Half the money I spend on advertising is wasted; the trouble is I don’t know which half,” sums up the traditional means of transacting. Now that we’ve evolved into the era of the Acronym (RTB, DSP, etc.), I’m not sure we’ve solved 100 percent of the question, but as a marketplace we have certainly made tremendous strides to trim the fat. Does automation solve a complex problem with a simple solution? Or is it a simple problem with a highly sophisticated solution? Welcome to the digital renaissance. Nicely said, Greg.

  2. “But with algorithmic buying, the agency can buy and optimize in real time across thousands of sites.”

    Ah, would that it were that simple. It is going to take more time than I had hoped for this to happen. Right now what we have is a series of algorithms, some very old and living in legacy publisher ad servers, unaware of one another (if they were aware of each other, you could say they were fighting for impressions, but alas, no) and struggling to match consumer attention with advertising content befitting of said attention.

    Per my count, less than a third of US online impressions are biddable in any way, and a large portion of the biddable impressions are perforce of lower quality. The data we’ve seen buying direct vs. buying on exchanges proves it out again and again.

    Until publishers put premium user impressions at the very top of the session queue up for bid (and/or make premium inventory easy to buy in an automated way), we’re going to be feeding our carefully crafted algorithms the grass-fed filet mignon data along with the commercial-grade ground chuck inventory.

    • Greg – it seems like every new entrant in the data-centric market is focusing on harnessing and evaluating the data side of the equation (including me), yet media is just media, categorized in a few big clumps. Are any of the DSPs or exchanges working toward further identifying the media side, so intelligent decisions can be made about impression quality too and not just the data? It would be great if every time a publisher put an impression on an exchange, the impression had identifiers built in that could tell the buyer information like what type of page the ad is on, how many other ads the visitor had seen in the surfing experience on the site so far, how many other ads are on the page where the impression will appear, above the fold, below the fold, etc.

      It seems to me that if we had deeper impression info available at the time of a bid, we could match up the filet mignon data with the filet mignon impressions (I prefer NY strip, but thanks for the analogy, Rob) and find extremely effective inventory for the advertisers.

      • Interesting, Alan, I tend to think the opposite. I think media valuation is much more advanced than data valuation. Even though the impression metadata available on the bid request may be incomplete, it’s still better than the metadata attached to data.

        Agreed though, more impression metadata would be a lot better.

      • Alan, that data on the impression is available now through RTB. Most good exchanges now pass a unique tag ID for each placement. A DSP can use that to break out all the possible placements along with URLs and domains to ascertain the media value. You can also track the user ID as it moves around a site to get a sense of where in the session the user is. Some of the exchanges pass above- and below-the-fold data. A good DSP should be able to use all that data to bid differentially on each impression according to its predicted media value as well as the user value.

        Rob, but it is getting better… Remember the crap we were buying on the exchanges this time last year? As prices have been de-averaged with RTB, good publishers and good impressions are being rewarded.
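A rough sketch of that differential bidding, in Python. The field names (`tagid`, `domain`, `fold`, `uid`) and the multipliers are illustrative only, loosely modeled on the kinds of signals exchanges pass on a bid request, not any exchange's actual spec:

```python
# Sketch: bidding differentially on the impression metadata a bid
# request carries. All field names here are hypothetical examples.

def parse_bid_request(req):
    """Pull out the media signals described above: placement tag,
    URL/domain, fold position, and the user ID."""
    return {
        "placement_id": req.get("tagid"),
        "domain": req.get("domain"),
        "url": req.get("url"),
        "above_fold": req.get("fold") == "above",
        "user_id": req.get("uid"),
    }

def media_multiplier(imp, domain_scores):
    """Scale the bid by predicted media value: domains with good
    observed performance and above-the-fold placements earn a
    premium; unknown domains are discounted."""
    mult = domain_scores.get(imp["domain"], 0.5)
    if imp["above_fold"]:
        mult *= 1.5
    return mult

# A known-good domain, above the fold, on a $1.00 base CPM:
req = {"tagid": "728x90_hp", "domain": "example.com",
       "url": "http://example.com/", "fold": "above", "uid": "abc123"}
domain_scores = {"example.com": 1.2}

imp = parse_bid_request(req)
print(round(media_multiplier(imp, domain_scores) * 1.00, 2))  # prints 1.8
```

The same logic bids down an unrecognized domain below the fold, which is what "de-averaging" prices amounts to in practice.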

      • Seni Thomas


        My company, Last Mile Networks, does exactly what you suggest. We are in stealth mode at the moment, but if you want to chat offline I’d be happy to share some details.

        Drop me a line,



    • Rob — Agreed, we haven’t reached the end state.

      A greater supply of higher-quality biddable impressions will bring us closer. You set biddable in opposition to direct (agency?) sales, but there are also networks with publisher relationships. My thesis is that impressions that used to go to networks will go to biddable inventory sources.

      Also, agreed, we have lots of algorithms working independently of each other and sometimes against each other. That is the “supply chain management” problem that I focus on. I don’t think it’s insurmountable.

      Also, you identify structural issues with exchanges where the bad might force out the good. I don’t really think that’s a structural market issue. The transparency problem is in many ways a technology problem that lots of companies are working on. Granted, if at the end of the day the sell side believes that the sales channel conflict presented by transparency outweighs the increased eCPM that transparency brings in the exchanges, this is going to be a big uphill battle.

      Also, I think we’re in a transitional state where price volatility can make exchanges a less desirable sales channel. I suspect that volatility will decline as everything matures. Regardless, the volatility represents a business opportunity for SSPs.

      I agree that providing richer impression information on the bid request is the solution. What I’m concerned about is setting up highly static publisher-direct contracts for data-driven display. Regressing to manual processes for the sake of “premium” inventory access is a Faustian bargain.

  3. Great post again, Greg! It would be interesting to get your thoughts on data-centric companies, like ours, to the extent that making data more optimal is more important than optimizing data processing or ad delivery. How do you think the future of data will play into the efficacy of algorithms, and how will algorithms be valued in a world where the differences in quality of data may be enormous?

    • Thanks, Albert. I don’t write algorithms, but you’d think that the heterogeneity and inadequate metadata associated with any given cookie-based audience segment would be a hindrance to developing effective buy-side data-valuation algorithms.

      But who knows… publishers may start passing more audience data on bid requests to increase their yield, effectively rebundling audience and impression. This would inject more data into the ecosystem, since the buy side wouldn’t have to decide to pay for the data upfront, independent of media. By reducing the number of data aggregators and intermediaries, this could improve data quality and homogeneity.

  4. A nice, well articulated view, Greg. Whilst the role the media buying executive plays is clearly changing, so too is the overall ‘value proposition’ that media buyers need to pitch. ‘Processing power’ is the new ‘buying power’, and ‘aggregated data’ is the new ‘aggregated spend’.

    • ‘Processing power’ is the new ‘buying power’, and ‘aggregated data’ is the new ‘aggregated spend’. — I like it!

      Here’s how I see the evolution:

      First – Direct Buys: Aggregated Spend = Pricing Power = Competitive Advantage

      Then – Non-RTB Exchanges: “Egalitarian” landscape where having the impression level valuation occur within the exchange black box puts players of all sizes on an equal playing field with regard to impression valuation.

      Now – RTB: Your aphorism demonstrates how RTB is causing scale advantages to make a comeback in data driven display.