Data is everywhere. There are iconic ecosystem slides dedicated to it. Billions of dollars in capital have been invested to make sense of it all. As a result, we have seen maturity to the point of hyper-specialization and very attractive exits. But even with all of this advancement, we still haven’t solved the billion-dollar problem of determining which data drives the best marketer outcomes. It’s the reason certain media dinosaurs dismiss data, targeting, and analytics all too quickly. Solving this problem requires completely new thinking about the challenge.
It’s important to start upstream and look at the digital media planning process, where the problem originates. The way digital media is planned is archaic; it is based on the way traditional media has been purchased, where consumer data capture is extremely limited. Popular planning and measurement tools leverage demographic data to describe audiences that index high on certain sites, and while this information is directionally helpful, these tools fail to address the challenge properly.
To highlight this, take a look at pretty much any RFP sent or received. There is a good chance that the target audience defined in the RFP is a set of demographic targets (e.g., Males 25-54, HHI $100k+). Simply put, the problem is that not all consumers who look alike will act alike. These targets are placeholders, and it is left up to the vendor to figure out who to actually target. As a result, billions of dollars in media are wasted every year because we are applying an analog approach to a digital problem.
A number of different audience solutions attempt to address this problem. We can separate them into four categories.
- Audience description solutions. These platforms were built to collect and analyze clickstream data and use descriptive statistics to highlight what audiences look like demographically. In the analog media model, demographics have served as a reasonable proxy for how to allocate media spend - mainly because there is no better option. Offline, there is no way to track someone opening the refrigerator door, realizing they don’t have their favorite food, and then driving to the store to buy it. So marketers attempt to create proxies that describe the consumer, but again, the problem is that not all consumers who look the same act the same. Online, the story is very different: just about anything a consumer does is captured and made available to marketers. The challenge online is therefore to identify the behaviors consumers exhibit that signify impending action, and then build predictive models based on actual interaction.
- Offline models applied online. Clients often try to apply their offline models to their digital swim lane, but there are a couple of fundamental differences that must be considered. First, in offline models, data capture happens far less frequently, so segment membership can take up to six months to update. By then, the consumer may have already purchased the product or service they were looking for, and the derived data is usually no longer actionable. Online, data capture is real-time and far more frequent across many more data players. And because users can see an ad, interact with it, and transact immediately, online models must take real-time behaviors into account, and segment membership must be updated in near real-time. Additionally, in real-time environments there is heavy competition for a scarce number of highly valuable consumers, which makes prediction even more important; this dynamic is non-existent in offline models. Finally, digital models can take into account data captured from interactions between consumers and brands as well as between consumers themselves (social).
- One dimensional audiences. Many have built predictive solutions based on online data, or even a combination of online and offline data. After all, “behavioral targeting” is nothing new to digital marketing. But most of the solutions in market were built to construct audiences from a single data type or source, so they have a very limited understanding of consumers - people aren’t one dimensional. Because of this limited data set, these approaches have failed to provide scalable audience solutions. The optimal solution requires an in-depth understanding of the consumer, which is contingent upon: 1) a huge data set of many diverse data points; 2) the right data structure.
Combining a bigger and more diverse data set with the right structure allows for better treatment of data and ultimately more accurate models - models that deliver better performance, no matter how performance is defined, and that scale.
- Modular Tech Stack. The popular setup involves a DMP, a DSP, and an analytics suite. This has been the solution of choice at many agencies and trading desks for the past couple of years. Since the DNA of an agency is to manage concentration risk among vendors, why wouldn’t they want a solution where they can plug-and-play any number of vendors? The reason is that this rationale doesn’t apply to technology partners. The lack of deep integration actually creates more risk and opportunity cost than finding a competent solution provider and developing a true partnership.
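To make the behavioral approach above concrete, here is a minimal sketch of real-time propensity scoring in Python. The event names, weights, and threshold are all hypothetical; a real system would fit its weights on historical clickstream-to-conversion data and update segment membership on every incoming event rather than on a months-long offline refresh.

```python
import math

# Hypothetical behavioral signals with illustrative weights. In practice,
# weights would be learned from historical clickstream and conversion data.
WEIGHTS = {
    "viewed_product_page": 1.2,
    "added_to_cart": 2.5,
    "read_review": 0.8,
    "bounced_homepage": -0.6,
}
BIAS = -3.0

def propensity(events):
    """Logistic score: a 0-1 propensity from a user's recent behaviors."""
    z = BIAS + sum(WEIGHTS.get(e, 0.0) for e in events)
    return 1.0 / (1.0 + math.exp(-z))

def in_market(events, threshold=0.5):
    """Segment membership recomputed from the latest events, not a stale batch."""
    return propensity(events) >= threshold

events = ["viewed_product_page"]   # browsing alone keeps the score low
events.append("added_to_cart")     # the very next event can flip membership
```

The point of the sketch is the update cadence: because membership is a pure function of the latest event stream, a single new behavior changes the score immediately, which is the property offline models lack.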
The problems are numerous, so I will attempt to highlight the most salient.
- When relying exclusively on programmatic buying opportunities, it is important to keep in mind that the inventory available does not reflect the total universe of available inventory. As such, the availability of brand-safe, high-quality inventory is inconsistent, and the most valuable users may be unreachable, or reachable only in sub-optimal environments - for example, pages with excessive ads where the placement is below the fold. The inventory may be cheap, but cheap doesn’t translate to value.
- Without investing considerably in data partnerships, knowledge about consumers and the behaviors that drive them will be extremely limited. As discussed earlier, the resulting models will be flawed.
- The culmination of the first two points is extreme sample bias, which translates to a high error rate and either poor performance or a lack of scale.
- Independent platform integrations, which are largely pixel-based, will have high rates of data loss.
- Historical segment-level performance analysis on users is limited, so models are incomplete. It’s important to know what triggered the ad call, to follow that user from impression to action, and to analyze the data construct of the users who successfully converted. Even with a server-to-server (S2S) integration, the DMP will not receive all of the DSP’s event-level data, or vice-versa, which makes it impossible to get a complete understanding of why things happened.
- Finally, workflow inefficiencies between two separate and unrelated systems create operational barriers to success.
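The impression-to-action analysis described above can be sketched as a simple join of event-level logs. This assumes both systems share complete logs keyed on a common user ID, which is exactly what a loosely integrated DMP/DSP pair often cannot provide; the log fields and segment names here are hypothetical.

```python
# Hypothetical event-level logs: impressions from the buying system,
# conversions from the measurement system, keyed on a shared user ID.
impressions = [
    {"user_id": "u1", "ts": 100, "trigger_segment": "in_market_auto"},
    {"user_id": "u2", "ts": 110, "trigger_segment": "demo_m25_54"},
]
conversions = [{"user_id": "u1", "ts": 500}]

converted = {c["user_id"] for c in conversions}

# Attribute each converting user back to the segment that triggered the ad call.
segment_conversions = {}
for imp in impressions:
    if imp["user_id"] in converted:
        seg = imp["trigger_segment"]
        segment_conversions[seg] = segment_conversions.get(seg, 0) + 1
```

If either side withholds event-level data, the join above cannot be computed, and segment-level attribution degrades into guesswork - which is the operational cost of the loose integration described above.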
As I highlighted, many solutions fail to address the challenges in audience targeting holistically. To be successful you need the right combination of data, media, and technology. Data - you need a large and diverse data set, along with the right data structure. Media - in addition to accessing automated buying platforms, you also need direct access to inventory on websites. Technology - you need a completely integrated tech stack purpose-built to ingest any type of data, tying users and outcomes back to these data points.
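As a toy illustration of the data requirement above, here is a sketch of a unified profile assembled from several hypothetical sources rather than a single data type; the source names and attributes are invented for the example.

```python
# Hypothetical per-source tables, each keyed by user ID. A real stack would
# ingest these from site analytics, CRM/purchase systems, and social data.
sources = {
    "site_behavior": {"u1": {"pages_viewed": 12}},
    "purchase_history": {"u1": {"last_purchase_days_ago": 9}},
    "social": {"u1": {"brand_mentions": 2}},
}

def unified_profile(user_id):
    """Merge every source's view of a user into one multi-dimensional profile."""
    profile = {}
    for table in sources.values():
        profile.update(table.get(user_id, {}))
    return profile
```

A model trained on the merged profile sees the consumer across dimensions, whereas a single-source model sees only one slice - the "one dimensional audience" failure mode described earlier.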