
No More Sketchy Data: Why ‘Close Enough’ Won’t Work In Marketing’s AI Era

Maor Sadra, CEO & Co-Founder, INCRMNTAL 

For a long time, marketers have gotten away with flying blind. Campaigns have run on a patchwork of incomplete inputs across platforms, with missing key fields and rarely aligned goals. Brands didn’t question performance. As long as the top line and bottom line were fine, who cared how clean the data was?

The consequence? Multimillion-dollar decisions were made based on the wrong data. Siloed teams were operating on different definitions of success. Media and measurement stacks didn’t speak the same language. And no one was truly accountable for whether the data being fed into marketing systems was structured, accurate or even usable.

But that approach is no longer sustainable.

The industry has entered what I’ve termed the “AI-volution,” an era where campaigns are designed, distributed and optimized by machines. AI is now embedded across every step of the media value chain. While this unlocks new potential, it also comes with a critical dependency: AI is only as good as the data it’s fed.

When the input data is messy, misaligned or just plain wrong, AI doesn’t correct that. On the contrary, it amplifies the problems. Bad data no longer just creates inefficiencies. It also leads to bad decisions at scale.

That kind of structure might have been acceptable five years ago. Today, it’s a strategic liability.

Attribution is not measurement

Attribution has long been treated as a form of measurement. But, in reality, it often amounts to storytelling. Multiple platforms claim credit for the same conversions, each applying its own logic with very little accountability, and sometimes inflating their impact by as much as 200%. What one platform classifies as a “view-through” might be ignored entirely by another. A spike in direct traffic goes unexplained, while campaign-level inputs remain fuzzy at best.

The illusion of precision provided by attribution is seductive, especially when budgets are under pressure. But it’s not precision; it’s correlation dressed up as causality. And the more channels and platforms are introduced, the more distorted the view becomes.

This wasn’t a huge issue when campaign shifts happened quarterly and reports were cobbled together after the fact. But today, marketing systems – especially those powered by AI – are optimizing in real time, often with zero human intervention. That means the quality of the input data has never mattered more.

Flawed data has led marketers down the wrong path. Marketers have been known to log TV and CTV spend as a single lump sum with no breakdown by date, campaign, country or creative, making it impossible to link performance to actual media tactics. Clients tag up to 50% of their conversion data as “unknown” country, meaning half of their audience geography is effectively invisible. Companies work with a dozen ad platforms across a dozen different time zones and settings, then wonder why the data doesn’t add up.
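Problems like these are easy to surface with even a basic audit pass before the data goes anywhere near a model. As an illustrative sketch only (the field names, sample records and thresholds below are assumptions, not drawn from any specific platform), a few lines of Python can flag exactly the issues described above:

```python
# Hypothetical campaign-level spend records; field names are illustrative only.
records = [
    {"date": "2025-06-01", "channel": "CTV", "country": "US",
     "campaign": "launch", "spend": 12000},
    {"date": None, "channel": "TV", "country": "unknown",
     "campaign": None, "spend": 250000},  # lump-sum TV spend, no breakdown
    {"date": "2025-06-01", "channel": "CTV", "country": "unknown",
     "campaign": "launch", "spend": 8000},
]

REQUIRED = ("date", "channel", "country", "campaign", "spend")

def audit(rows):
    """Return simple data-quality flags for a batch of spend records."""
    incomplete = [r for r in rows
                  if any(r.get(f) in (None, "") for f in REQUIRED)]
    unknown_geo = sum(1 for r in rows if r.get("country") == "unknown")
    return {
        "incomplete_records": len(incomplete),
        "unknown_country_share": unknown_geo / len(rows),
    }

print(audit(records))
```

On this sample, the audit flags one incomplete record and an “unknown” country share of two-thirds, the kind of visibility gap the column describes.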


These aren’t unusual cases. They’re almost the norm. And they highlight how broken inputs don’t just muddy results; they actively encourage bad decisions.

Marketing has a data infrastructure problem

What’s clear now is that this is not just a measurement issue; it’s a foundational one. And it starts with how data is collected, structured and handled internally across marketing organizations.

AI and automation have rapidly scaled campaign execution. What hasn’t scaled, at least not in parallel, is the infrastructure needed to support reliable, privacy-safe, cross-channel data. The result is that many marketers are still making critical decisions using data pipelines that would fall apart under even basic scrutiny.

This is where a new kind of capability is needed: the marketing data engineer. Not a media buyer, not an analyst buried in spreadsheets, but someone who bridges the technical understanding of data architecture with the strategic awareness of how media is planned and measured. A role tasked with ensuring that campaign inputs are complete, structured and consistent, before optimization tools and machine learning models are allowed anywhere near them.

This person isn’t building dashboards. They’re building the systems that determine whether results can be trusted in the first place.
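One way to picture that work is as a schema contract enforced before any record reaches an optimization model. The sketch below is a minimal illustration under assumed conventions (the field names and rules are hypothetical, not a reference to any real pipeline): it rejects records that are incomplete, geo-blind or not normalized to UTC.

```python
from datetime import datetime, timezone

# Illustrative schema contract: every record must carry these fields,
# and timestamps must already be normalized to UTC.
REQUIRED_FIELDS = {"timestamp", "channel", "country", "campaign",
                   "creative", "spend"}

def is_valid(record: dict) -> bool:
    """Gatekeeper check run before data is handed to any model."""
    if not REQUIRED_FIELDS <= record.keys():
        return False
    ts = record["timestamp"]
    if not isinstance(ts, datetime) or ts.tzinfo != timezone.utc:
        return False  # naive or non-UTC timestamps are rejected
    return record["country"] != "unknown" and record["spend"] is not None

ok = {
    "timestamp": datetime(2025, 6, 1, tzinfo=timezone.utc),
    "channel": "CTV", "country": "DE", "campaign": "launch",
    "creative": "v2", "spend": 5000,
}
bad = dict(ok, country="unknown")

print(is_valid(ok), is_valid(bad))  # True False
```

The design point is that the check runs upstream of optimization: a record that fails the contract never reaches the model, rather than being silently averaged into its training data.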

Time to clean house

If marketing wants to thrive in an AI-powered, privacy-first environment, it can no longer afford to operate on “close enough” data. Not because expectations are higher, but because automation has removed the buffer zone. AI will not ask whether the data makes sense; it will act on it immediately and at scale.

That means the burden now falls on marketing leaders to treat data not as a byproduct of campaigns, but as infrastructure. Clean, structured, privacy-compliant data isn’t a nice-to-have; it’s the only thing standing between strategic clarity and algorithmic chaos.

It’s time to build the roles, processes and systems that ensure the data driving AI is fit for purpose. Because sketchy data isn’t just inefficient anymore; it’s expensive, it’s misleading and it’s holding the industry back.

And in the AI-volution, the only thing worse than no data … is bad data that looks good.

“Data-Driven Thinking” is written by members of the media community and contains fresh ideas on the digital revolution in media.

