
Data Quality: In Demand But Hard To Define


“Data-Driven Thinking” is written by members of the media community and contains fresh ideas on the digital revolution in media.

Today’s column is written by Jason Downie, senior vice president and general manager of data solutions at Lotame.

Marketers can have all the data they could ever want, but if that data is of low quality, the only thing they’ll scale is the size of their mistakes.

Every business leader knows that data quality is important. What’s concerning is that 84% of CEOs worry about the quality of data inside their organizations.

Unfortunately, data quality has so many definitions that it has no real meaning. One reason for this confusion is that ad tech has historically been an incredibly crowded and fragmented space.

But as consolidation continues and margins shrink, the remaining ad tech players have the opportunity to lead a more meaningful discussion around quality.

Here, a little common sense can go a long way. Consider the way we talk about ZIP+4 data, for example. For many advertisers, ZIP+4 – a five-digit ZIP code plus four digits that pinpoint a segment within the delivery area – is a mark of quality, but to what end? If marketers want income insights, ZIP+4 can yield quality results, but the data set is useless for gender, because neighborhoods aren’t segregated by gender.

Intellectually, we can all understand how the purpose for which we use a particular data set influences the way we think about the quality of that data. But as a practical matter, we often bypass common sense questions because we know that, above all else, we must scale.

In that rush for scale, marketers often miss out on the right data.

Changing Marketers’ Mindsets About Data Quality

Common sense only gets you so far because most marketing challenges are so specific. Here, it’s helpful to think about quality as a process, rather than as a result.


Imagine you want to know the gender of an audience with 10 million profiles, sourced from multiple vendors. What accuracy rate counts as a quality outcome? Perfection is unrealistic at that scale, but 50% accuracy is no better than a guess.

The challenge in navigating that territory between 50/50 and perfection comes down to investigative prowess and the client’s needs. In addition to asking common sense questions about sourcing, collection and chain of custody, marketers need to test the data.

When marketers see wide variations in accuracy, they need to have discussions with their vendors about methodology. Quality is process above all else.
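To make “test the data” concrete, here is a minimal sketch of the kind of spot check a marketer might run: comparing vendor-supplied gender labels against a small, independently verified truth set, such as a survey panel. The file names, columns and vendor list below are hypothetical, not tied to any particular platform.

import csv

def load_labels(path):
    # Read rows of profile_id,gender into a dict keyed by profile ID.
    with open(path, newline="") as f:
        return {row["profile_id"]: row["gender"] for row in csv.DictReader(f)}

truth = load_labels("panel_truth.csv")            # small set of verified labels
vendor_files = ["vendor_a.csv", "vendor_b.csv"]   # vendor-supplied segments

for path in vendor_files:
    labels = load_labels(path)
    overlap = truth.keys() & labels.keys()
    if not overlap:
        print(f"{path}: no overlap with the truth set")
        continue
    correct = sum(1 for pid in overlap if labels[pid] == truth[pid])
    print(f"{path}: {correct / len(overlap):.1%} accurate on {len(overlap)} matched profiles")

A check like this proves nothing on its own, but it surfaces the vendor-to-vendor variation worth interrogating; the truth set should be collected independently of any vendor being evaluated.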

Emphasizing process requires more work than simply assembling the biggest data set, because quality and utility are tougher to gauge than scale. The more rigorous the standards and processes for verifying the integrity of the data, the better.

Quality Must Mean Something Specific To An Organization

Marketers allocate budget based on ROI, which forces us to put data quality in a larger context. That’s a good thing, because marketing is a lot more than the accuracy of data inputs. Having the right data isn’t the same thing as deploying it for maximum effect in the real world, where strategy, creative and media budget all contribute to the overall outcome.

For the gender example, a 70% accuracy rate isn’t great, especially when compared to site targeting, which requires no data at all because content is used as a proxy for gender. But is 70% accuracy acceptable? The answer depends on the relative cost and efficacy of the alternative options. Targeting by site is accurate, but it’s also expensive. In some cases, it’s possible that 70% – what we might call lower quality data – performs well enough.
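One way to make that trade-off concrete is to compare the cost of reaching someone who is actually in the target audience. The CPMs and accuracy rates below are hypothetical, purely to illustrate the arithmetic.

# Hypothetical CPMs and accuracy rates, for illustration only.
options = {
    "site targeting (content as gender proxy)": {"cpm": 12.00, "accuracy": 0.95},
    "third-party gender segment":               {"cpm": 4.00,  "accuracy": 0.70},
}

for name, o in options.items():
    # Effective cost to serve 1,000 impressions to people actually in the target gender.
    effective_cpm = o["cpm"] / o["accuracy"]
    print(f"{name}: ${effective_cpm:.2f} per 1,000 on-target impressions")

Under these made-up numbers, the “lower quality” 70% segment is still cheaper per on-target impression, which is the pragmatic point: quality only matters relative to the alternatives and what they cost.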

That’s not to suggest we should settle for low-quality data – we shouldn’t – but we do need to be pragmatic. There is no one-size-fits-all answer for data quality. Each organization has unique challenges, needs and goals.

Marketers who can embrace the idiosyncratic nature of the data quality question put themselves in the best possible position to deploy their data in a way that’s most meaningful to their organizations. If improvements in data quality aren’t leading to attributable increases in ROI, marketers are just spinning their wheels.

Follow Lotame (@Lotame) and AdExchanger (@adexchanger) on Twitter.
