“Data-Driven Thinking” is written by members of the media community and contains fresh ideas on the digital revolution in media.
Today’s column is written by Nathan Woodman, senior vice president of strategic development at IPONWEB.
There’s a scene in the 2002 Tom Cruise movie, “Minority Report,” that has become legendary in marketing and ad tech circles.
In the scene, Cruise’s character, John Anderton, walks through a crowded mall. Retina scanners and other technologies identify Anderton in real time and serve him entirely personalized ad experiences and product messages. Upon walking into a Gap store, a holographic shopper bot immediately recognizes him, asks about previous transactions and recommends additional items he might like.
For advertisers, this looks like the holy grail of marketing: true one-to-one communication – though in a more dystopian, Big Brother kind of way. For mar tech companies, it represents the challenge everybody is trying to solve, and fast.
The average enterprise uses more than 12 different marketing technology solutions to power its digital marketing efforts, with some companies using more than 30 unique tools, according to a 2015 report from Winterberry Group and the IAB. With 70% of companies planning to increase their mar tech spending in 2017, one can only assume those numbers will climb even higher.
Most of these tools are smart, powerful and effective at doing what they’re designed to do. Most leverage machine learning to create value, drive performance or both. What happens, though, when they aren’t talking to each other? How smart can they be when they don’t see what the other tools are doing?
Take a DSP, for example. A standard machine-learning application in a DSP may record and use a variety of attributes to decide which impression to buy and how much to bid on it. One such attribute might be the length of time since a user last saw an ad impression.
If an advertiser is using multiple DSPs – as most are – how would any single DSP know the last time a user was exposed to an ad without understanding what the other DSPs are doing? Now imagine compounding this situation with social, search, email, web visits, TV and all other forms of media exposure.
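To make the problem concrete, here is a minimal sketch of a recency-based bid adjustment of the kind described above. The function name, thresholds and multipliers are all illustrative assumptions, not any real DSP's logic – the point is only that the same rule produces opposite decisions depending on whether the DSP sees its own impression log or the advertiser's full cross-DSP log:

```python
import time

def bid_multiplier(seconds_since_last_ad: float) -> float:
    """Hypothetical rule: bid more the longer since the user last saw an ad."""
    if seconds_since_last_ad < 60 * 60:        # seen within the last hour
        return 0.5                             # back off
    if seconds_since_last_ad < 24 * 60 * 60:   # seen within the last day
        return 1.0                             # bid normally
    return 1.5                                 # not seen recently: bid up

now = time.time()
# What DSP A knows: its own last impression for this user was 3 days ago.
seen_by_dsp_a = now - 3 * 24 * 60 * 60
# Ground truth across all DSPs: DSP B served this user an ad 10 minutes ago.
seen_anywhere = now - 10 * 60

print(bid_multiplier(now - seen_by_dsp_a))   # DSP A bids up (1.5x)...
print(bid_multiplier(now - seen_anywhere))   # ...when full data says back off (0.5x)
```

With only its own log, DSP A happily overbids on a user who was just reached elsewhere – exactly the blind spot that compounds once search, social, email and TV exposures are added.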
This lack of communication between systems creates massive data gaps, where each system is blind to the data of the other systems being used. This data blindness results in each system’s unique machine-learning applications making decisions and learning behaviors based on incomplete or inaccurate data.
DMPs came along to fill this void, but the application of the data they contain has largely failed. The preferred method of distribution has been heuristic rules that guide segmentation, followed by one-way pushes of one-dimensional audience data into existing DSPs. This process flattens the contrast of the underlying data set and degrades performance.
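That "flattening" can be shown in a few lines. In this sketch the intent scores, user IDs and the 0.7 threshold are made-up illustrations of a heuristic segmentation rule; the point is that only binary membership survives the push into the DSP, not the underlying score:

```python
# Rich DMP view: a continuous purchase-intent score per user (illustrative values).
users = {"u1": 0.95, "u2": 0.71, "u3": 0.69, "u4": 0.10}

IN_MARKET_THRESHOLD = 0.7  # hypothetical heuristic segmentation rule

# One-way push into a DSP: only segment membership survives, not the score.
segment = {uid for uid, score in users.items() if score >= IN_MARKET_THRESHOLD}

print(sorted(segment))  # ['u1', 'u2']
```

After the push, u1 (0.95) and u2 (0.71) look identical to the DSP's bidder, while u3 (0.69) is dropped entirely despite being nearly as valuable – the contrast in the data set is gone.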
One approach to solving the data gap problem, at least in the programmatic channel, is to use a single stack with an integrated DMP and DSP. This seems viable, but most brands and agencies use multiple DSPs to achieve mass reach and drive efficiencies. It also doesn’t solve the problem of nonprogrammatic media channels.
Alternatively, a brand or publisher could build its own stack, tightly integrated with its own data sets. This would require an enterprise to build, replace and maintain the 12-plus tools it currently uses with homegrown technology – a daunting task, to say the least.
Perhaps the most viable path lies somewhere in the middle, between consolidation and building one’s own stack. With this approach, the same data that is already being captured in a DMP can be used to build a holistic decisioning engine that looks across an increasing number of digital marketing tools and consumer touch points to decide what message to present to what user in what channel.
The result is an enterprise-centric learning system that sits atop a series of ad tech stacks and executes against marketing directives in as close to real time as the points of distribution allow.
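What might such a decisioning layer look like in miniature? The sketch below is a hedged illustration, not a product design: the channel names, frequency caps and the "pick the least-saturated open channel" rule are all assumptions. The key idea it demonstrates is merging exposure logs from multiple sources (both DSPs, email, search) into one user-level state before deciding what to do next:

```python
from collections import defaultdict

# Unified exposure log across systems: (user, source system, channel).
# Sources and channels here are illustrative.
exposure_logs = [
    ("u1", "dsp_a", "display"),
    ("u1", "dsp_b", "display"),
    ("u1", "esp", "email"),
    ("u2", "search_engine", "search"),
]

# Hypothetical per-channel frequency caps set by the brand.
FREQUENCY_CAP = {"display": 2, "email": 1, "search": 3}

def next_channel(user_id: str):
    """Pick the least-exposed channel still under its cap, given unified data."""
    counts = defaultdict(int)
    for uid, _source, channel in exposure_logs:
        if uid == user_id:
            counts[channel] += 1
    open_channels = [c for c in FREQUENCY_CAP if counts[c] < FREQUENCY_CAP[c]]
    if not open_channels:
        return None  # user is saturated everywhere
    return min(open_channels, key=lambda c: counts[c])

print(next_channel("u1"))  # 'search' -- display is only known to be capped
                           # once BOTH DSPs' logs are merged
```

A single DSP looking at its own log would see one display impression for u1 and keep buying display; the merged view shows the cap is already hit and routes the next message to an untouched channel instead.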
This is not yet another mar tech product, but rather a process. It requires analyzing current available data sets, including gaps between systems, as well as the distribution interfaces of the existing media channels. Think of it as marketing stack management, powered by enterprise machine learning (bring on the acronyms).
Wider adoption of open machine learning – or of brand-controlled decisioning algorithms (“brandgorithms”) that are portable across distribution channels – would only accelerate this process. But first, brands and agencies need to demand this level of openness from their mar tech and ad tech vendors.
Only when all channels are considered holistically can media decisions be made intelligently, bringing the promise of “Minority Report” and one-to-one marketing closer to reality.