In display media, direct response tactics are gaining momentum, pushing the industry toward the concept of audience buying. This approach relies on cookie-based targeting rather than using content as a proxy for audience, and it directs attention squarely toward third-party data aggregators, the core providers of this service. Most press and commentary on the data marketplace has focused on online privacy and self-regulation, but another area worth examining is the gap between the cost and the value of targeting data as a product, and how that disparity may shape pricing in the future.
To think clearly about data cost, it helps to recognize that in the DSP landscape, ad impressions and targeting cookies are two separate entities, frequently provided by two wholly separate companies. A key benefit of DSPs is combining the two in a bidded environment in the interest of cost efficiency and granular testing. Typically, while the impression comes from an ad exchange like Right Media or Google AdEx, the targeting cookie comes from a provider like BlueKai, eXelate, or TARGUSInfo.
Adding data to target a specific type of individual within the exchanges incurs an incremental fee on top of the inventory cost; the data provider is compensated separately from the inventory provider. This is a positive, because it allows very granular testing of specific data segments relative to the inventory alone. In the past, when data and inventory were sold as a coupled product, it was difficult to tell whether the targeting or the media was having the greater impact on performance.
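The mechanics above can be sketched in a few lines. This is a deliberately minimal illustration, not any exchange's or DSP's actual API: the segment membership set, the data fee, and all numbers are hypothetical, but it shows how a bidder can keep the data cost and the inventory cost separately accountable.

```python
from typing import Optional

# Hypothetical example values; real data providers and exchanges each
# expose their own interfaces and pricing.
SEGMENT = {"cookie123", "cookie456"}  # ids the data provider flags as in-segment
DATA_FEE_CPM = 0.75                   # paid separately to the data provider

def bid_cpm(cookie_id: str, base_value_cpm: float) -> Optional[float]:
    """Bid only on in-segment users, netting the data fee out of the bid
    so inventory cost and data cost remain separate line items."""
    if cookie_id not in SEGMENT:
        return None                   # no targeting match: skip the impression
    net = base_value_cpm - DATA_FEE_CPM
    return net if net > 0 else None

print(bid_cpm("cookie123", 2.00))  # → 1.25
print(bid_cpm("cookie999", 2.00))  # → None
```

Because the fee is modeled as its own term rather than baked into the media price, a buyer can run the same inventory with and without the segment and attribute any performance difference to the data alone.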
Within a test, however, the data must produce enough incremental lift in performance to justify that fee; in other words, it must deliver sufficient value. The disparity between the cost and the value of third-party data is an ongoing debate in the landscape, as the two are not always equal. Many within the industry are pushing to reconcile them in the interest of greater overall performance for advertisers.
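The break-even arithmetic behind "sufficient value" is worth making explicit. With hypothetical numbers (these CPMs are illustrative, not market rates), the lift in conversion rate that targeting must deliver just to hold cost-per-conversion flat follows directly from the fee:

```python
# Illustrative break-even math for third-party data fees.
# All figures are hypothetical and chosen for round numbers.

def breakeven_lift(inventory_cpm: float, data_cpm: float) -> float:
    """Fractional conversion-rate lift at which cost-per-conversion
    with the data fee equals cost-per-conversion without it.

    CPA is equal when baseline_cvr / inventory_cpm ==
    lifted_cvr / (inventory_cpm + data_cpm), so the required ratio
    lifted_cvr / baseline_cvr is total_cpm / inventory_cpm."""
    total_cpm = inventory_cpm + data_cpm
    return total_cpm / inventory_cpm - 1.0

# A $1.00 data fee on a $2.00 CPM buy must lift conversion rate by 50%
# just to break even on cost-per-conversion.
print(breakeven_lift(2.00, 1.00))  # → 0.5
```

Anything below that break-even lift means the advertiser paid more per conversion for the "smarter" buy, which is precisely the cost-versus-value gap the debate centers on.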
Solving this problem is something technology providers in the space are increasingly making a priority. Zach Weinberg, President & COO of Invite Media, the Google-owned DSP, asserts that “Data providers and DSPs need to get a little closer. What Invite/Google has been focusing on has been to make the purchasing process and evaluating data segments in our interface easy, so users of the product can freely experiment and test.”
For their part, data companies appear willing to entertain the notion of marrying measures of effectiveness with the cost of their product. Mark Zagorski, CEO of eXelate, asserts, “If we really want to put our money where our mouth is, performance models that take into account quality aren’t out of the question. If we’re generating lift, it opens the door for us to get paid more, and will help the business grow. That way, everything moves towards a market driven and transparent pricing point, where it’s no longer a process of a sales person trying to get the most they can for a product.”
This momentum is also being driven by the agencies and advertisers themselves. Technology startups emerging around deeply granular measurement and reporting allow agencies to ask questions they couldn’t before. For example, just how precise are targeting segments? Can we be sure that a buy for ‘males 18-24’ is reaching only those young men? “We have found that there are some new companies such as Aggregate Knowledge that have promise in this space. We’ve also found the data companies themselves realize their models are evolving and are opening up to these discussions,” says Brett Mowry, Vice President/Group Director of Strategy and Analytics at Digitas. With regard to impact on cost, Mowry asserts that, “We believe the data industry will move towards a more transparent model that accounts for bleed in the data buys. Ultimately, though, quality will likely be defined by performance and data will be priced relatively.” The ‘bleed’ he refers to is instances where the data is imprecise.
Zagorski asserts that data companies are already addressing these issues. “The way to combat bleed is to look at frequency. We’re capturing data constantly, and create frequency and recency against each data point.” In other words, the more often and consistently the data is refreshed, the more accurate it is likely to be. Regular auditing is important as well. “We can determine what anomalies may exist, and can determine relevancy. We typically show less than 5% of anomalies,” he continues.
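The frequency-and-recency idea can be sketched as a simple membership rule. Everything here is a hypothetical illustration of the general technique, not eXelate's actual model: the 30-day window and the two-sighting minimum are invented thresholds for the example.

```python
from datetime import datetime, timedelta

def keep_in_segment(observations: list,
                    now: datetime,
                    max_age: timedelta = timedelta(days=30),
                    min_frequency: int = 2) -> bool:
    """A cookie stays in a segment only if it was observed recently
    and often enough; stale or one-off signals are treated as bleed
    and dropped. Thresholds are illustrative assumptions."""
    recent = [t for t in observations if now - t <= max_age]
    return len(recent) >= min_frequency

now = datetime(2011, 6, 1)
fresh = [now - timedelta(days=3), now - timedelta(days=20)]
stale = [now - timedelta(days=90)]
print(keep_in_segment(fresh, now))  # → True
print(keep_in_segment(stale, now))  # → False
```

Pruning on recency and frequency like this is one way a provider could keep the anomaly rate low before segments ever reach a buyer.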
As this targeting methodology matures, agencies tend to recommend prudent investment in testing, analysis, and infrastructure. It is also important to align a campaign’s performance indicators with the tactics that make up the overall strategy. Many are beginning to find that the most appropriate applications of this data are against mid-funnel performance indicators: the best use of this targeting may be to build awareness and preference among a target audience. Driving users in a given category or segment to an advertiser’s site or offer can feed the funnel and grow a base of potential converters. After all, finding a logical way to fit all available and appropriate tactics into a media plan is the art within digital advertising, an ongoing process relying on testing and optimization.