
The Max Effect; No Keeping Up With Core Updates

Comic: Black Boxes

A Fine Performance

Ecommerce consultancy Smarter Ecommerce – or smec, as it’s known – just published its annual State of PMax report. 

Performance Max campaigns are tough to aggregate and study. It’s a black box, after all. So it’s useful to have smec’s tracking of thousands of PMax advertisers as a benchmark. 

And the trends are interesting, although not always easily explainable. 

For instance, PMax advertisers can set campaigns to optimize either for profitable revenue per sale or for maximum overall sales volume. No surprise, then, that ecommerce advertisers tend to prefer revenue per sale over total volume. 

“However, this preference is eroding slightly over time, due to unknown reasons,” writes Mike Ryan, the author of smec’s report (and the PMax whisperer, as AdExchanger calls him).

Sometimes it takes a Goldilocks test to figure out the right campaign structure. For instance, an advertiser running only one broad PMax campaign is likely missing out, although that’s a relatively common approach. But running narrowly targeted campaigns – like setting ROAS targets for each individual product, which would seem to make sense given how PMax works – can backfire. 

PMax needs a critical mass of regular conversions to effectively optimize. By parsing campaigns into wafer-thin targets, advertisers can accidentally prevent all of their campaigns from taking off. 

Shaken To The Core

In late 2024, Google announced (confirmed, more like) that it would be releasing more core updates per year.

What’s more, these core updates – which are so named because they substantially change search rankings – are increasingly volatile for publishers and SEO pros. In other words, these updates cause dramatic, unexpected swings in search traffic (usually going down). 

Google has always been somewhat mysterious about its core updates. It doesn’t explain in detail what’s changing and what publishers should do to adapt. Industry pros just sort of feel out what types of traffic are up or down and use their gut to determine what Google’s algo seems to prefer following a change. 

For jaded publishers, the emerging best practice is to simply absorb the hits and wait out the losses when a core update strikes traffic. Making rapid, reactionary changes to try to regain former rankings can muddy the waters for publishers. And it’s not as if Google’s algo will take heed anyway. 

“You can’t just quickly fix and recover,” SEO consultant Jason Hennessey tells Digiday. “People go into panic mode and think there’s a quick fix but there’s not.”

Wiki’s Woes

Everyone’s copying from Wikipedia, even the generative AI machines. 

Increased activity coming from generative AI crawlers is straining Wikipedia’s web infrastructure, Wikimedia claims.

Wikipedia makes sense as a repository for generative AI scraping. The same goes for Wikimedia Commons, which hosts 144 million openly licensed images, videos and other rich media files.

Since January 2024, Wikimedia’s bandwidth usage from downloading multimedia files has grown 50%. Most of this traffic is from automated scraping of its image database.

As a result, Wikipedia has less available bandwidth to support traffic surges during high-profile news events.

“The load on our infrastructure is not sustainable and puts human access to knowledge at risk,” Wikimedia wrote in its annual plan for 2025. It also noted the difficulty of identifying bots “in the world of unauthenticated traffic.”

To make matters worse, two-thirds of Wikimedia’s most expensive traffic comes from bots, which bulk read pages that aren’t visited regularly by humans, meaning those pages aren’t cached on local servers. Instead, they must be retrieved from Wikimedia’s core servers, which carries higher costs. 

Wikipedia isn’t the only publisher dealing with this burden. But generative AI’s ravenous appetite must be sated.

But Wait! There’s More

Amazon has put forward an offer to buy TikTok US. [NYT]

DoubleVerify will invest in FirstPartyCapital, a fund that itself invests in ad tech and mar tech startups. [release]

Tumblr is seeing an uptick in young users who are fleeing X, which has gotten yucky, and the lifestyle influencer feeds of Instagram and TikTok. [Business Insider]

AI search is the new arms race for retailers. [The Information]

You’re Hired!

Pixability hires Maria McCarthy as SVP of sales for the Midwest and Western regions. [release]

Thanks for reading AdExchanger’s daily news round-up… Want it by email? Sign up here.
