
3 Reasons Publishing And Ad Data Loses Value


“The Sell Sider” is a column written for the sell side of the digital media community.

Today’s column is written by Michael Manoochehri, chief technology officer at Switchboard Software.

Why do data projects continue to fail at an alarming rate?

Gartner estimates that through 2017, 60% of data projects never get past preliminary stages.

There are common reasons these projects struggle, but when it comes to data, the advertising industry is anything but common.

Our industry is experiencing a historic surge in data volume. Complexity is growing due to the success of programmatic advertising, and publishers are demanding access to larger numbers of data sources.

From a data perspective, I believe there are three main barriers that prevent publishers from maximizing the value of their data, and it’s time we start talking about them. While there’s no quick fix to solve these problems, there are practical steps publishers can start taking now to prevent their data from losing value.

Granularity barrier

Obtaining a unified view of how content relates to revenue requires a combination of multiple, programmatic data sources. Most supply-side platforms (SSPs) provide reporting data in some fashion, but there remains a daunting lack of standard granularity in available data.

Some SSPs provide data broken out to the individual impression, while others only provide daily aggregates. Granularity mismatches become an even greater challenge when each source generates different versions of currencies, date formats, time formats, bid rate formats and so on. These differences add up fast, and when they do, the inability to build a unified view of all data lowers its overall value.

To solve data granularity issues, publishers must apply business rules to normalize their data. These rules can be defined by options in a vendor’s UI, SQL code in the data warehouse or pipeline code by an engineer. Business rules describe how data “should” look – bid rates as decimal values versus percentages, for example – from a given source. Lack of visibility into where business rules are defined can cause costly problems.
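To make the idea concrete, here is a minimal sketch of business rules applied in pipeline code. The field names, date formats and sample values are illustrative assumptions, not any specific SSP's schema; the point is that two sources reporting the same day in different conventions normalize to one unified record.

```python
from datetime import datetime, timezone

def normalize_bid_rate(value):
    """Business rule: bid/fill rates are stored as decimal fractions (0.42, not 42%)."""
    if isinstance(value, str) and value.endswith("%"):
        return float(value.rstrip("%")) / 100.0
    rate = float(value)
    # Treat bare numbers above 1 as percentages expressed without the "%" sign.
    return rate / 100.0 if rate > 1 else rate

def normalize_date(value, source_format):
    """Business rule: all dates become ISO 8601 strings in UTC."""
    parsed = datetime.strptime(value, source_format).replace(tzinfo=timezone.utc)
    return parsed.date().isoformat()

# Two hypothetical SSP feeds describing the same day:
source_a = {"date": "2018-05-01", "fill_rate": "42%"}
source_b = {"date": "05/01/2018", "fill_rate": 0.42}

row_a = {
    "date": normalize_date(source_a["date"], "%Y-%m-%d"),
    "fill_rate": normalize_bid_rate(source_a["fill_rate"]),
}
row_b = {
    "date": normalize_date(source_b["date"], "%m/%d/%Y"),
    "fill_rate": normalize_bid_rate(source_b["fill_rate"]),
}
assert row_a == row_b  # both normalize to the same unified record
```

Wherever rules like these live – vendor UI, warehouse SQL or pipeline code – the value comes from having them written down in one auditable place rather than scattered invisibly across the supply chain.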


Time and time again, I’ve observed engineering teams debate with C-level executives about inaccuracies in revenue reporting. The reason is often that somewhere along the data supply chain, a business rule changed in an undetectable way. To prevent granularity from becoming an issue, there must be transparency for business rules.

To get started, I suggest going through the exercise of simply accounting for all steps in the data supply chain to document how rules are being applied, and by whom. Knowing how business rules are used for normalization is essential to preserving the value of the data.

API barrier

SSPs often promote how accessible their data is through application programming interfaces (APIs), but sometimes that’s not the reality.

Publishers and advertisers rely on a network of multiple, heterogeneous data sources, but many are completely unprepared for the rate of change, problems and quirks exhibited by SSP APIs. SSP APIs can and will change or break. As the number of APIs under management grows, the possible points of failure multiply, creating a distributed systems problem. Laws like the General Data Protection Regulation also require that API credentials and data access be managed securely using best available practices. Ultimately, any time a team can’t contend with errors or downtime from a given source, its data loses value.

I’ve met many talented engineering teams who struggle to understand how much of their API-based reporting works, so it should come as no surprise that line-of-business leaders often feel in the dark as well. To prevent API challenges from becoming too daunting, I suggest that engineering teams proactively develop both processes to monitor API health and data operations playbooks to react to changes and outages. They should ensure that API credentials are not tied to user accounts and that they are stored with secure password management software.
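One building block such a playbook might standardize is how transient API failures are retried before escalating. The sketch below wraps any report-fetching call in retry-with-exponential-backoff; the flaky fetch function and its return value are made-up stand-ins for a real SSP endpoint.

```python
import time

def call_with_retry(fn, max_retries=3, backoff_seconds=1.0):
    """Call fn(), retrying transient failures with exponential backoff."""
    for attempt in range(max_retries):
        try:
            return fn()
        except Exception:
            if attempt == max_retries - 1:
                raise  # out of retries: escalate per the data operations playbook
            time.sleep(backoff_seconds * (2 ** attempt))

# Example: a hypothetical source that fails twice before succeeding.
calls = {"n": 0}

def flaky_report_fetch():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient API error")
    return {"rows": 1280}

result = call_with_retry(flaky_report_fetch, backoff_seconds=0.01)
```

The design choice worth noting is that failures which survive all retries are raised, not swallowed, so monitoring and the on-call playbook see the outage instead of silently missing a day of data.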

Playbooks are crucial for engineering to diagnose problems and for the line-of-business manager to understand what’s going on. They also serve as excellent handover documentation if there is engineering churn.

Scale barrier

Advertising data volume is exploding, and heterogeneous data sources are proliferating. This one-two punch creates a scalability barrier that’s difficult to break through.

Ultimately, delivering scalability comes down to smart infrastructure planning. However, many businesses assume there’s no work to be done until they actually need to scale, which is a dangerous mistake. Keep in mind that not all infrastructure products – however helpful they may be – are suitable for scaling past a certain size. Scale problems hit unexpectedly, and when they do, queries slow to a crawl, dashboards fail and data is lost.

To prepare for scalability challenges, publishers should start measuring now. Specifically, they must understand how much data they have, how queries are performing and how responsive their dashboards are. Can their infrastructures handle unexpected spikes in volume, whatever those spikes might look like? They should walk through a use case of what they would do, should their data exceed current capacity.
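"Start measuring now" can be as simple as timing routine reporting queries and logging the results, so performance trends become visible as data volume grows. This sketch uses an in-memory SQLite table as an illustrative stand-in for a real warehouse; the table, query and row counts are assumptions for the example.

```python
import sqlite3
import time

# Stand-in for a warehouse table of impression-level revenue data.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE impressions (day TEXT, revenue REAL)")
conn.executemany(
    "INSERT INTO impressions VALUES (?, ?)",
    [("2018-05-01", 0.002)] * 10_000,
)

# Time a routine reporting query the way a dashboard would run it.
start = time.perf_counter()
total = conn.execute("SELECT SUM(revenue) FROM impressions").fetchone()[0]
elapsed = time.perf_counter() - start

# In practice, (query name, row count, elapsed) would be written to a
# metrics store, so slowdowns surface long before the scale barrier hits.
```

Tracking numbers like these over time also gives a concrete trigger for the capacity walkthrough described above: when elapsed times trend past an agreed threshold, the migration conversation starts.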

Publishers should also know that there are significant performance differences between standard on-premise databases and cloud data warehouses. In my experience, once the data required for daily analysis approaches 10 GB, standard databases become slow and costly to maintain. Publishers must understand the tradeoffs when they eventually need to migrate to new infrastructure.

One final thought: Keep in mind that breaking through these barriers will always require some engineering. Publishers should try to get proactive about aligning the goals of engineering and business teams. The more closely aligned they are, the faster they’ll get more value from their data.

Follow Switchboard Software (@switchboardsoft) and AdExchanger (@adexchanger) on Twitter.
