
Event-Level Data Enters The Spotlight


“Data-Driven Thinking” is written by members of the media community and contains fresh ideas on the digital revolution in media.

Today’s column is written by Aram Chekijian, senior vice president of analytics and insights at Accordant Media.

The recent purchases of Adometry and Convertro by Google and AOL, respectively, signify an industry that is finally taking big steps toward moving beyond last-interaction and rule-based attribution models.

But there’s another exciting story behind this news that involves event-level data. The big media companies are squarely focused on the value of being able to access, manage and evaluate event-level data, captured and logged on a rolling basis, in an analytical tool kit never before possible in traditional media.

Top Down Vs. Bottom Up

The majority of media measurement in the past – even full-portfolio, econometric marketing mix modeling (MMM) – has been a mostly theoretical practice. “Actual” media impression delivery is estimated, statistically fit to sales patterns, controlled for other factors when possible, and then scaled up to a total population level to simulate the real world. These models are top down. Due to the nature of the buys, and the slow “actualization” of the data, it usually takes weeks – even months – to interpret the results and execute against them. They are macro by design, big picture in application and thus relatively academic in practice.

On the other hand, the current ubiquity of event-level data, which captures media exposure at the near-subatomic level, enables better analyses and faster actionability. The benefits of this bottom-up approach built on granular, observed data are numerous: Delivery estimates are no longer necessary; models become more stable; statistical significance is easier to attain; and again, results are actionable in a tight, closed-loop implementation.
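To make the contrast concrete, here is a minimal sketch (entirely synthetic numbers, invented for illustration) of what observed, user-level data allows: incremental lift measured directly from an exposure log, with no delivery estimates or population adjustments.

```python
import random

random.seed(0)

# Synthetic event-level log: one row per user, with an exposure flag and a
# conversion outcome. All rates below are invented for illustration only.
def make_user():
    exposed = random.random() < 0.5            # half the users see the ad
    p = 0.02 + (0.03 if exposed else 0.0)      # "true" incremental lift: 3 pts
    return exposed, random.random() < p

log = [make_user() for _ in range(100_000)]

def conv_rate(rows):
    return sum(1 for _, converted in rows if converted) / len(rows)

exposed_rows   = [r for r in log if r[0]]
unexposed_rows = [r for r in log if not r[0]]

# Bottom-up measurement: observed lift, not a statistically fit estimate.
lift = conv_rate(exposed_rows) - conv_rate(unexposed_rows)
print(f"observed incremental lift: {lift:.4f}")
```

With enough observed events, the measured lift converges on the true per-user effect – the “statistical significance is easier to attain” point above.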

Now, the bottom-up paradigm is taking hold holistically across media as new digital channels emerge and programmatic exchanges become more widespread, standardized and scaled. Moreover, as investment dollars continue flowing into digital in general, and programmatic in particular, the bottom-up model will capture ever more information about the user as the “sample” grows toward 100%.

From Accurate Accreditation To Budget Simulation

Bottom-up analysis may be the inevitable replacement for top-down macro models. Hopefully the fallacy of last-interaction attribution is by now crystal clear: Would you attribute every unit sold at Walmart to the person greeting customers at the door? Why this method took hold as a standard is a question for the ages, but it is a poor proxy for real-world analytics.

The “rule based” successor to last-interaction is an improvement only in the sense that it acknowledges that something else may have occurred, prior to that last click, which led to conversion. Unfortunately, the rules that determine the weights (first/even/last/etc.) are self-fulfilling prophecies and arbitrarily assigned.
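To see how arbitrary those weights are, here is a sketch of the common first/even/last rules (the path data is hypothetical; this is not any vendor's actual implementation):

```python
# Illustrative sketch of rule-based attribution. Each rule assigns the
# same fixed weights regardless of what actually drove the conversion.
def attribute(path, rule):
    """Split one conversion's credit across the touchpoints in `path`."""
    n = len(path)
    if rule == "last":       # last-interaction: all credit to the final touch
        weights = [0.0] * (n - 1) + [1.0]
    elif rule == "first":    # first-interaction: all credit to the first touch
        weights = [1.0] + [0.0] * (n - 1)
    elif rule == "even":     # even (linear): equal credit to every touch
        weights = [1.0 / n] * n
    else:
        raise ValueError(f"unknown rule: {rule}")
    return dict(zip(path, weights))  # assumes each channel appears once

path = ["display", "social", "search"]
print(attribute(path, "last"))  # {'display': 0.0, 'social': 0.0, 'search': 1.0}
print(attribute(path, "even"))  # every touch gets 1/3
```

Note that the weights are chosen before any data is examined – the “self-fulfilling prophecy” problem in a nutshell.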


Proper fractional attribution is the first major step toward accurately accrediting conversion. This is also a key first step toward a bottom-up analog of full-scale top-down MMM analysis.
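The column doesn't prescribe a specific fractional method, but one widely used data-driven approach splits credit by Shapley value: each channel earns its average marginal contribution to the conversion rate across every possible ordering. A toy sketch, with invented conversion rates for each exposure combination:

```python
from itertools import permutations

# Hypothetical observed conversion rates per exposure combination.
# These numbers are invented for illustration only.
rates = {
    frozenset(): 0.00,
    frozenset({"display"}): 0.02,
    frozenset({"search"}): 0.05,
    frozenset({"display", "search"}): 0.09,
}

def shapley(channels, v):
    """Average each channel's marginal contribution over all orderings."""
    credit = {c: 0.0 for c in channels}
    perms = list(permutations(channels))
    for order in perms:
        seen = frozenset()
        for c in order:
            credit[c] += v[seen | {c}] - v[seen]  # marginal lift of adding c
            seen = seen | {c}
    return {c: credit[c] / len(perms) for c in channels}

print(shapley(["display", "search"], rates))
```

Unlike the rule-based weights, these fractions come from observed outcomes, and they always sum to the full conversion rate of the complete path.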

But two things hold back its full potential as the end-all solution. First, it is generally confined to media that is already trackable. Second, while it accurately values assists, it stops short of being a budgetary simulation tool.

The first point is changing fast as more media becomes trackable. New channels, such as addressable TV, though not necessarily bid in an RTB environment, can report impressions back with actual time stamps and geocoordinates – potentially even IP addresses – as opposed to traditional estimates. This facilitates complete fractional attribution of user exposure and enables full-pathway understanding of media’s contribution to conversion.

As for the second point, event-level data’s predictive stability will be refined, and in the future, budget simulation or “what if” scenario planning should be possible. The implementation of this practice in near real time provides for on-the-fly optimization, as well as budget forecasting exercises that can be tested and verified more responsively.
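As a sketch of what such “what if” planning could look like once response curves are fit from event-level data (the curve shapes and coefficients below are entirely invented):

```python
import math

# Hypothetical diminishing-returns response curves: predicted conversions
# as a function of channel spend. Coefficients are made up for illustration.
curves = {
    "display": lambda spend: 120 * math.log1p(spend / 10_000),
    "search":  lambda spend: 200 * math.log1p(spend / 25_000),
}

def simulate(plan):
    """Predict total conversions for a budget plan {channel: spend}."""
    return sum(curves[ch](spend) for ch, spend in plan.items())

baseline = {"display": 50_000, "search": 50_000}
shifted  = {"display": 30_000, "search": 70_000}

# Compare "what if" scenarios before committing a dollar.
print(f"baseline: {simulate(baseline):.1f} conversions")
print(f"shifted:  {simulate(shifted):.1f} conversions")
```

Because the underlying events arrive continuously, a plan chosen this way can be executed, measured and re-fit in the same tight loop described above.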

This new paradigm of fractional attribution and econometric modeling is no longer a science experiment. In the historical absence and subsequent underutilization of this granular data, estimates, assumptions and rules were the best available practice. These inferential analyses are no longer necessary to understand media efficacy; marketers now can invest, measure, simulate and optimize in closer to real time than ever previously possible.

Follow Accordant Media (@Accordant) and AdExchanger (@adexchanger) on Twitter.
