Adometry Takes a Step to Integrate Attribution Data With DSPs

It's a common refrain that last-click attribution is broken, but what should replace it has yet to be resolved. Algorithmic attribution modeling is perhaps the leading candidate, and vendors such as Visual IQ, Convertro, and Adometry are competing to provide solutions in this area. But these attribution specialists still struggle to make fractional attribution data available to marketers in ways that are actionable on their ad buying platforms of choice, without giving away too much data to those platforms.

This week Adometry took a step that may ease this problem. It released the Attribute Catalyst Framework, allowing clients to share data feeds indicating the conversion credit assigned to specific media placements. The capability is currently available through Advertising.com, Google's DoubleClick Bid Manager, MediaMath, and Videology, and Adometry is actively reaching out to other buying platforms.

We asked MediaMath CEO Joe Zawadzki about the Adometry framework, and he had this to say: "If you half-ass attribution then you end up with competing metrics and yardsticks, and that makes it hard to make rational decisions in terms of reallocating budgets and getting CMOs to approve plans. If you get attribution in place and it's simple enough to implement and everyone can rally around a model, then you can do the right stuff."

Zawadzki said Adometry's Attribute Catalyst Framework is unique among attribution vendors in that it doesn't require MediaMath to do custom, non-standard integrations. "It is equivalent to OpenRTB standards in exchange-based buying. It makes it easier for other people to consume it. It's first mover in the sense that it's standardized and syndicated," he said.

Adometry CEO Paul Pellman spoke with AdExchanger about the framework.

What's the Attribute Catalyst Framework about?

PAUL PELLMAN: This is our first effort to allow clients to leverage the various transaction platforms that they are already using. We never want to be a transaction engine; we never want to sell or transact data. Our goal is to be an agnostic analytics platform that advertisers and agencies can fully rely on, one that houses their most sensitive, most valuable data – which is, what's actually working and what's not actually working.

Our clients want to make sure that we can push just the right amount of insight out to their various partners so that they can do what they do best, which is buy the right media, transact the right media, and find better performing media for their clients.

Can you share a use case on how this works?

Sure. We're gathering all of this user-level data on all the media that an advertiser is buying, and providing very specific insights on what's performing using our fractional attribution methodology. What this implementation allows an advertiser or client to do is push those attribution results to these platforms. Typically, MediaMath only sees the media that it is selling to an advertiser. So its ability to do attribution itself is hindered, and on top of that, most advertisers are unwilling to give more of that data to someone like MediaMath. They only want to give the right data.

We're providing data to them with specific fractional attribution insights. Of the media that MediaMath bought, what were the conversion credits of all of those media elements at a very granular level? MediaMath can replace the last-click conversion data that it is using in its bidding with far more accurate fractional attribution insights. But the only data that MediaMath sees is the conversion credit -- the fractional credits that its media got. So it allows them to see the accurate data to make decisions, but the advertisers are only sending them data back on the media they bought.
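The mechanics Pellman describes can be sketched in a few lines. This is a hypothetical illustration, not Adometry's actual model or feed format: the platform names and placement IDs are invented, and the even (linear) credit split stands in for whatever proprietary fractional methodology the vendor actually runs. The key idea it demonstrates is the filtering step: credit is computed over the full conversion path, but each buying platform's feed contains only the credits for media it bought.

```python
# Hypothetical sketch of the feed model described in the interview:
# fractional credit is computed across ALL touchpoints in a conversion
# path, but each DSP receives only rows for its own placements.
# The linear (even-split) credit rule is an illustrative assumption.
from collections import defaultdict

def fractional_credits(touchpoints, conversion_value):
    """Split one conversion's value evenly across its touchpoints."""
    share = conversion_value / len(touchpoints)
    credits = defaultdict(float)
    for tp in touchpoints:
        credits[(tp["platform"], tp["placement_id"])] += share
    return credits

def feed_for_platform(credits, platform):
    """Emit only what a given buying platform is allowed to see:
    the fractional credits earned by the media it bought."""
    return {placement: round(credit, 2)
            for (plat, placement), credit in credits.items()
            if plat == platform}

# One converting user's path across several buying platforms (invented data).
path = [
    {"platform": "MediaMath", "placement_id": "mm-001"},
    {"platform": "Videology", "placement_id": "vid-9"},
    {"platform": "MediaMath", "placement_id": "mm-002"},
    {"platform": "Advertising.com", "placement_id": "adv-7"},
]
credits = fractional_credits(path, 100.0)
print(feed_for_platform(credits, "MediaMath"))
# → {'mm-001': 25.0, 'mm-002': 25.0}
```

Note that under a last-click model MediaMath's placements here would have received zero credit (Advertising.com served the final touch), which is exactly the distortion the fractional feed is meant to correct.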

Which of your competitors are supporting standardized data sharing with DSPs?

There are plenty of folks that are doing integrations with DSPs. That integration is typically pushing out cookie lists, which is a good thing. A lot of DMPs do that level of integration. As far as we know, and this is based on the discussions we have had with all of these platforms, we’re the first to be able to push this level of attribution insight within these platforms. And we know that because we’ve worked with all of them on building the specs and the process for doing this. It’s not like any of these folks came to us with the standard spec and said, here’s how we’ve done this with other folks.

We’ve created that spec, and we’ve done it a few times. That’s why it’s a "framework," because it’s a standard certification level we’ve been able to build. There’s still effort involved, but now it’s really straightforward for them to implement.

Adometry does not do media execution. Do you see that changing?

No. We draw a bright red line between providing analytics and transacting media. I think most big marketers are going to want to have that segregation. This is not to denigrate any of these folks. All the partners we’re working with now and others we’re talking to, they’re all great partners. They want to do what’s right. They want to apply more of the right media to advertisers, and if they get better data that helps them do a better job of doing that, they win. They show better results for the advertiser.

I just think, as a prior marketer myself, that you want to have segregation between the [attribution] platform that is giving you agnostic insight on really valuable conversion data versus the transaction platforms that you buy media from. But you don't want to give them more information than they need.

If you have a company that is trying to do both, it has dual purposes, a divided focus in how it makes money.

What else?

One thing that sets us apart is using a data-driven approach, versus simple predetermined rules. Folks that use simple predetermined rules could do an integration like this with various DSPs more easily. But to do it with a data-driven approach is really a challenge. The reason is that we're providing this insight on a daily basis, which means we're taking all of the data we're collecting from our customers and running attribution every day.

Running attribution with a data-driven approach is a big data processing challenge to begin with. When you start running it on a daily basis, that becomes even more difficult. You can't provide this data on a weekly or monthly basis. You have to do it daily in order to be effective.
