Ad network InterCLICK announced last week the “second production version” of the company’s audience targeting platform known as Open Segment Manager (OSM). According to the release, “OSM can create any audience that a client can articulate, and determine how that audience impacts their campaigns’ objectives.”
InterCLICK president Michael Katz discussed the product release and its differentiation within the marketplace.
AdExchanger.com: How do you differentiate Open Segment Manager (OSM) 2.0 from other audience targeting solutions in the marketplace?
MK: It’s first important to clarify the types of solutions that exist in the marketplace currently:
- First, there are earlier-generation solutions that were purpose-built to consume vast quantities of a single type of data (e.g., content consumption or shopping data). Simply stated, these platforms were never architected to be highly flexible, which ultimately prevented the business application from achieving any significant scale.
- Second, there are a handful of platforms that use basic Boolean logic to build audiences by combining data sources. These are certainly a step in the right direction, but they do not address how to effectively value data, which is fundamental in determining which sources to combine.
- Third, there are the usual vaporware providers. Great marketers but not technologists… “High definition audiences”, “3D audiences”, etc.
As I alluded to, the biggest issue with audience segmentation in the past has been the inability to truly deliver scale. OSM was built to address the shortcomings of other solutions by ingesting any type of data, quantifying the influence of data on media, and, lastly, analyzing the results properly. All of this ultimately leads to far greater efficiency – finally at scale.
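The contrast Katz draws – Boolean audience combination versus actually valuing each data source – can be sketched in miniature. This is an illustrative sketch only; all user IDs, source names, and weights below are hypothetical, and interCLICK has not published OSM’s actual model.

```python
# Hypothetical third-party data: user IDs tagged by each data source.
shopping_data = {"u1", "u2", "u3"}
content_data = {"u2", "u3", "u4"}

# Second-generation approach: basic Boolean logic (intersection / union).
audience_and = shopping_data & content_data   # users seen in BOTH sources
audience_or = shopping_data | content_data    # users seen in EITHER source

# The Boolean result says nothing about how much each source is worth.
# A value-aware approach weights sources by their measured influence on
# campaign performance (the weights here are invented for illustration).
source_weights = {"shopping": 0.7, "content": 0.3}

def user_score(user):
    """Score a user by the weighted value of the sources that observed them."""
    score = 0.0
    if user in shopping_data:
        score += source_weights["shopping"]
    if user in content_data:
        score += source_weights["content"]
    return score

scores = {u: user_score(u) for u in audience_or}
```

Under this toy weighting, a user seen in both sources outranks one seen only in the weaker source – a distinction plain Boolean membership cannot express.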
Maybe the biggest differentiator of OSM is that it actually does the math completely differently from any other solution on the market. We believe that everyone we have spoken to in the industry has been applying a flawed treatment and analysis to data and inventory. We have uncovered some groundbreaking findings that have led to tremendous campaign performance, and we will be releasing case studies in the coming months.
We are very fortunate to have an amazing product team that architected this solution over 20 months ago based on a wealth of industry experience, and a tech team that has successfully delivered its third major platform within the past 12 months.
Are there any sweet spots for the platform in terms of target markets? Is it best for e-commerce marketers, brand/DR marketers, etc.?
Actually, I think this is a question based on assumptions from earlier-generation solutions that ingested a single type of data. These solutions were limited in scope because of flawed design principles (as well as improper analytics), so oftentimes they would align well with a very specific type of campaign, such as automotive or shopping, based on the single type of data that was consumed. OSM was designed to ingest any type of data, allowing us to build a very complete understanding of the users that we see within our network. This enables us to develop solutions for a wide range of clients, no matter what the campaign objective or vertical may be.
What is the platform able to do with creative?
Again, everything that the platform does centers on quantifying and optimizing the influence that certain inputs have on a campaign’s performance given any campaign objective. In some cases we’ll help the marketer develop messaging strategies based on acquisition or retention objectives while other times we’ll analyze creative exposure data to see what type of impact that has on the campaign.
How have real-time bidding considerations impacted the development of OSM?
I think you have the question backwards. OSM and the successful application of data is what drives inventory procurement, not vice versa. Real-time or not, it actually makes no difference to us or our clients for that matter.
Matt Greitzer wrote a great article on ClickZ last month on how “real-time” is not what is driving successful innovation in the marketplace. It would appear he and I are on the same page these days… His point is that micro-segmentation and de-averaged pricing are the keys to driving success. I would add that it’s actually more important to de-average the value of impressions rather than the price, for a number of reasons. That said, in order to successfully determine the value of each impression, you must understand the combined value of inventory and data. Deciding what to pay for an impression is simply a bidding strategy, and is very different from determining what an impression is worth.
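The value-versus-price distinction can be made concrete with a toy model. This is a minimal sketch under invented assumptions – the function names, the multiplicative conversion model, and every number are hypothetical, not OSM’s actual methodology.

```python
def impression_value(inventory_quality, data_signal, cpa_target=5.0):
    """Estimate what an impression is WORTH: expected conversions times
    the target cost-per-acquisition. Conversion probability is modeled
    (hypothetically) as the product of an inventory factor and a data factor."""
    p_conversion = inventory_quality * data_signal
    return p_conversion * cpa_target

def bid_price(value, margin=0.2):
    """A bidding strategy layered on top of the value estimate:
    shade the bid below the estimated worth to preserve margin."""
    return value * (1 - margin)

# De-averaged valuation: two impressions on the same inventory (same
# "average" price) can differ sharply in worth once data is factored in.
v_high = impression_value(inventory_quality=0.02, data_signal=0.5)
v_low = impression_value(inventory_quality=0.02, data_signal=0.1)
```

The point of the separation: `impression_value` answers what the impression is worth, while `bid_price` is merely one strategy for deciding what to pay given that worth – swapping the bidding strategy does not change the valuation.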
This is one of the reasons that the DSPs are failing as a whole, and yes, they are failing. Inventory aggregation by itself does not fully address marketers’ needs; it makes no difference whether you access inventory in real time via an API call or whether you swap ad tags. Proper inventory aggregation must be led by a coherent data strategy. Such a strategy requires both a platform and an organization with domain expertise.
By John Ebbert