Hearst Newspapers’ Battle To Keep Programmatic From Breaking Its Site

Susan Parker, Hearst

Programmatic advertising makes it difficult for publishers like Hearst Newspapers to ensure that users enjoy fast-loading pages and quality ad experiences.

Hearst closely monitors its site templates, optimizing for fast load times and ad viewability. But once it lets third parties – programmatic advertisers – run scripts on its website, the experience can slow and readers abandon pages.

If third parties slow down the site, traffic dips and so does revenue. But it also impacts the reader experience, affecting loyalty and the future of Hearst’s newspapers. With so much on the line, the publisher is experimenting with solutions that give it greater visibility into content and ad performance, while advocating for vendors to do more to help the sell side.

Lurking Problems

Slowdowns can be malicious or accidental. In-banner video companies constantly find new exploits that allow them to sneak onto publishers’ sites, causing them to slow down or crash. Others try to skirt quality control guidelines.

Creative assets wrapped with multiple tags can also cause problems. An ad may work fine in isolation, but not when other ads are running on the same page. Agencies and tech vendors often test in isolation, without considering how their technology might interact in real publisher environments.

At Hearst Newspapers, isolating and fixing these problems is the job of Susan Parker, VP of digital revenue and analytics. She takes a vigilant stance, poring over reams of data from disparate sources in order to identify problems.

A big issue for publishers is a lack of visibility into what’s going on with inventory. Important data can be siloed, preventing her from seeing how content and advertising perform in concert with each other. She’s recently become a fan of Burt, a tool that marries those analytics, bringing together site-experience data from Adobe Analytics with ad data from AdX and DFP.

“As a large publisher, we have a lot of content, templates and users,” Parker said. “Our ability to know what’s happening on every single page is impossible without good reporting, because no one user can see how the entire audience is viewing our pages.”

Previously, if revenue or impressions declined, an extensive process was required to figure out why.

“We had to merge data from different systems,” which was a labor-intensive, manual process, Parker said. “Now it’s being pulled and processed every day, instead of every couple of weeks or at the end of the month.”

Less time pulling and cleaning data means more time for analysts to interpret it. Hearst has also pre-created dashboards within Burt that allow non-analysts to clearly see and understand site trends.

If a web developer makes a change, he or she can quickly check that it isn’t degrading the site. Previously, unexpected changes could take days to detect, and other hidden issues could lurk far longer before anyone noticed something amiss.

Lack Of Communication

Having more usable, actionable data is starting to put Hearst back in control of tracking down problems from external sources. When Parker reaches out to partners about issues, they are often eager to fix them or equally frustrated about someone farther down the chain causing the problem.

Within the industry, there’s no standardized way for publishers to alert vendors to problems with programmatic ads. She can’t tell advertisers that a tag they’re running is so slow that the ad isn’t even being served 50% of the time. Or that two ad creatives are conflicting and causing the page to crash, which can happen for esoteric reasons, such as both ads using an exchange’s default naming convention in their code.
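The kind of naming collision Parker describes can be sketched in a few lines of JavaScript. The object and function names below are illustrative, not from any real exchange template; the point is that two creative wrappers generated from the same default template clobber each other’s shared global state.

```javascript
// Hypothetical sketch of two ad creatives built from the same exchange
// template. All names here are made up for illustration.
const page = {}; // stands in for the browser's window object

// Creative A's wrapper script writes to a default global name
function creativeA(win) {
  win.adUnit = { id: 'A', render: () => 'banner A' };
}

// Creative B's wrapper, built from the same template, uses the same name
function creativeB(win) {
  win.adUnit = { id: 'B', render: () => 'banner B' };
}

creativeA(page);
creativeB(page); // silently overwrites creative A's state

// When creative A's async callback later fires, it is now operating on
// creative B's object instead of its own.
console.log(page.adUnit.id); // 'B' — creative A's handle is gone
```

Because each creative works fine on its own, the failure only surfaces in a real publisher environment where both run on the same page.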

Verification tags, designed to measure whether ads are viewable, can actually make ads less viewable.

“They’re the slowest ads on the page to load,” Parker said. “You want the ad to load quickly because you want to be there while the reader is engaged, for as long as possible.”
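One common mitigation – sketched below under assumed names and timings, not Hearst’s actual setup – is to give a slow third-party tag a time budget so it can’t hold up the ad render indefinitely.

```javascript
// Hypothetical sketch: race a slow verification tag against a deadline
// so the ad renders while the reader is still engaged.

function loadVerificationTag(delayMs) {
  // Stand-in for injecting a third-party <script> and waiting for it
  return new Promise(resolve =>
    setTimeout(() => resolve('verified'), delayMs));
}

function deadline(ms) {
  return new Promise(resolve =>
    setTimeout(() => resolve('timed-out'), ms));
}

async function renderAd() {
  // Give the tag a 150ms budget (an illustrative figure); whichever
  // promise settles first wins, and the ad renders either way.
  const result = await Promise.race([
    loadVerificationTag(1000), // a slow tag
    deadline(150),
  ]);
  return result;
}

renderAd().then(r => console.log(r)); // 'timed-out' — render proceeds
```

The trade-off is that an impression rendered past the budget may go unmeasured, which is exactly the discrepancy publishers then have to troubleshoot.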

While Hearst can troubleshoot viewability discrepancies on directly sold campaigns, Parker knows that dysfunctional scripts can make its site look like a poor performer when the technology is the real problem.

Serving ads in real publisher environments is tough, something many vendors could do better, Parker said. Industry guidelines don’t look at how ads interact on a real page. “When technologies are layered and bundled, there are interactions that aren’t always understood.”

She advocates that tech vendors start paying attention to the publishers showing the ads, not just the advertisers buying the technology.

With strong data and analytics to show how advertisers impact their site, Parker wants to be part of the conversation and solution for better ads and user experiences.

“We’re reaching a point,” she said, “where these technologies have to be tested with the help of publishers, and not around them.”



  1. Does not Google Tag Manager solve the issue of multiple ad tags slowing down the site loading and related issues?

    • @James- I think it has to do mainly with impressions being served through 3rd party ad tags (like via exchanges/DSPs/networks). From working on the buy side, it’s not uncommon to have one creative run with over a dozen separate 3rd/4th party trackers.

      I went to sfgate.com with Ghostery and got 61 separate trackers.

      Now, a bit of this is due to having a ton of unneeded content on their site (a dozen social media share buttons and a few ‘Sponsored’/’From our partners’ widgets), but the majority are through the 5 ad units on their homepage.