
Data Quality: In Demand But Hard To Define


“Data-Driven Thinking” is written by members of the media community and contains fresh ideas on the digital revolution in media.

Today’s column is written by Jason Downie, senior vice president and general manager of data solutions at Lotame.

Marketers can have all the data they could ever want, but if that data is of low quality, the only thing they’ll scale is the size of their mistakes.

Every business leader knows that data quality is important. What’s concerning is that 84% of CEOs worry about the quality of data [PDF] inside their organizations.

Unfortunately, data quality has so many definitions that it has no real meaning. One reason for this confusion is that ad tech has historically been an incredibly crowded and fragmented space.

But as consolidation continues and margins shrink, the remaining ad tech players have the opportunity to lead a more meaningful discussion around quality.

Here, a little common sense can go a long way. Consider the way we talk about ZIP+4 data, for example. For many advertisers, ZIP+4 – a five-digit ZIP code plus four digits that pinpoint a segment within the delivery area – is a mark of quality, but to what end? If marketers want income insights, ZIP+4 can yield quality results, but the same data set is useless for gender, because neighborhoods aren’t segregated by gender.
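To make the fit-for-purpose point concrete, here is a toy sketch in Python (all values and field names are hypothetical): the same ZIP+4 field supports a reasonable income enrichment but offers nothing for gender.

```python
# A toy illustration of "fit for purpose" (all values hypothetical):
# ZIP+4 maps cleanly to neighborhood-level income estimates, but there
# is no meaningful ZIP+4 -> gender mapping, because people of every
# gender live in the same delivery segment.
ZIP4_MEDIAN_INCOME = {"21046-1756": 98_000, "10001-2062": 112_000}

profiles = [
    {"id": "a1", "zip4": "21046-1756"},
    {"id": "b2", "zip4": "10001-2062"},
]

for profile in profiles:
    # Sensible enrichment: neighborhood income as a household proxy.
    profile["est_income"] = ZIP4_MEDIAN_INCOME.get(profile["zip4"])
    # Nothing in ZIP+4 predicts this; it requires profile-level data.
    profile["gender"] = None

print(profiles)
```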

Intellectually, we can all understand how the purpose for which we use a particular data set influences the way we think about the quality of that data. But as a practical matter, we often bypass common sense questions because we know that, above all else, we must scale.

Marketers often miss out on the right data because they need scale.

Changing Marketers’ Mindsets About Data Quality

Common sense only gets you so far because most marketing challenges are so specific. Here, it’s helpful to think about quality as a process, rather than as a result.


Imagine you want to know the gender of an audience with 10 million profiles, sourced from multiple vendors. What accuracy rate counts as a quality outcome? Perfection is unrealistic at that scale, but 50% accuracy is no better than a guess.

The challenge in navigating that territory between 50/50 and perfection comes down to investigative prowess and the client’s needs. In addition to asking common sense questions about sourcing, collection and chain of custody, marketers need to test the data.
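What might such a test look like? Here is a minimal sketch, assuming you can assemble a small survey-verified “truth” panel to check a vendor’s labels against (the identifiers and the 70/30 split below are hypothetical):

```python
# A minimal data-quality spot check: compare a vendor's gender labels
# against a small survey-verified "truth" panel and report accuracy
# with a rough 95% confidence interval (normal approximation).
import math

def accuracy_with_ci(vendor_labels: dict, truth_panel: dict, z: float = 1.96):
    """Return (accuracy, low, high) over profiles present in both sets."""
    overlap = [pid for pid in truth_panel if pid in vendor_labels]
    if not overlap:
        raise ValueError("No overlap between vendor data and truth panel")
    hits = sum(vendor_labels[pid] == truth_panel[pid] for pid in overlap)
    p = hits / len(overlap)
    margin = z * math.sqrt(p * (1 - p) / len(overlap))
    return p, max(0.0, p - margin), min(1.0, p + margin)

# Hypothetical example: the vendor labels 70% of a 1,000-profile
# all-female truth panel correctly.
vendor = {f"id{i}": ("F" if i % 10 < 7 else "M") for i in range(1000)}
truth = {f"id{i}": "F" for i in range(1000)}

acc, low, high = accuracy_with_ci(vendor, truth)
print(f"Vendor accuracy: {acc:.1%} (95% CI: {low:.1%}-{high:.1%})")
```

Running the same check against each vendor makes variations visible; a panel of even a few thousand verified profiles is enough to separate a 70%-accurate source from a coin flip.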

When marketers see wide variations in accuracy, they need to have discussions with their vendors about methodology. Quality is process above all else.

Emphasizing process requires more work than simply assembling the biggest data set, because quality and utility are tougher to gauge than scale. The more rigorous the standards and processes for verifying the integrity of the data, the better.

Quality Must Mean Something Specific To An Organization

Marketers allocate budget based on ROI, which forces us to put data quality in a larger context. That’s a good thing, because marketing is a lot more than the accuracy of data inputs. Having the right data isn’t the same thing as deploying it for maximum effect in the real world, where strategy, creative and media budget all contribute to the overall outcome.

For the gender example, a 70% accuracy rate isn’t great, especially when compared to site targeting, which requires no data at all because content is used as a proxy for gender. But is 70% accuracy acceptable? The answer depends on the relative cost and efficacy of the alternative options. Targeting by site is accurate, but it’s also expensive. In some cases, it’s possible that 70% – what we might call lower-quality data – performs well enough.
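A quick back-of-the-envelope comparison shows why (the CPMs below are made up; the arithmetic is the point): divide the price by the accuracy to get the cost per correctly targeted impression.

```python
# Hedged comparison with hypothetical CPMs: at what price does
# 70%-accurate audience data beat pricier but near-certain site
# targeting? Compare the cost per *correctly* targeted impression.
def cost_per_on_target(cpm: float, accuracy: float) -> float:
    """Effective cost per 1,000 on-target impressions."""
    return cpm / accuracy

data_driven = cost_per_on_target(cpm=4.00, accuracy=0.70)  # 70% accurate
site_target = cost_per_on_target(cpm=9.00, accuracy=0.95)  # content proxy

print(f"Data-driven:    ${data_driven:.2f} per 1,000 on-target impressions")
print(f"Site targeting: ${site_target:.2f} per 1,000 on-target impressions")
```

Under these assumed prices, the “lower quality” data costs about $5.71 per thousand on-target impressions versus $9.47 for site targeting, even before counting the extra scale it unlocks.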

That’s not to suggest that we should settle for low-quality data – we shouldn’t, but we do need to be pragmatic. There is no one-size-fits-all definition of data quality. Each organization has unique challenges, needs and goals.

Marketers who can embrace the idiosyncratic nature of the data quality question put themselves in the best possible position to deploy their data in a way that’s most meaningful to their organizations. If improvements in data quality aren’t leading to attributable increases in ROI, marketers are just spinning their wheels.

Follow Lotame (@Lotame) and AdExchanger (@adexchanger) on Twitter.
