The Max Effect; No Keeping Up With Core Updates

Comic: Black Boxes

A Fine Performance

Ecommerce consultancy Smarter Ecommerce – or smec, as it’s known – just published its annual State of PMax report. 

Performance Max campaigns are tough to aggregate and study. It’s a black box, after all. So it’s useful to have smec’s tracking of thousands of PMax advertisers as a benchmark. 

And the trends are interesting, although not always easily explainable. 

For instance, PMax advertisers can set campaigns to optimize either for profitable revenue per sale or for maximum overall sales volume. No surprise, then, that margin-minded ecommerce advertisers tend to prefer revenue per sale over total volume. 

“However, this preference is eroding slightly over time, due to unknown reasons,” writes Mike Ryan, the author of smec’s report (and the PMax whisperer, as AdExchanger calls him).

Sometimes it takes a Goldilocks test to figure out the right campaign structure. For instance, an advertiser running only one broad PMax campaign is likely missing out, although that’s a relatively common approach. But running narrowly targeted campaigns – like setting a separate ROAS target for every individual product, which would seem to make sense given how PMax works – will backfire. 

PMax needs a critical mass of regular conversions to optimize effectively. By slicing campaigns into wafer-thin segments, advertisers can accidentally starve every one of their campaigns of the conversion data it needs to take off, as the sketch below illustrates. 
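For the arithmetic-inclined, here’s a quick back-of-the-envelope sketch of that starvation effect in Python. The numbers are purely illustrative – the 30-conversion monthly learning threshold is our assumption for the sake of the example, not a figure from smec’s report or from Google:

```python
# Hypothetical illustration: splitting a fixed pool of monthly conversions
# across more and more PMax campaigns. Both numbers below are assumptions
# for illustration, not figures from smec's report or Google documentation.

MONTHLY_CONVERSIONS = 240   # total conversions the account generates
LEARNING_THRESHOLD = 30     # assumed minimum per campaign to optimize well

for num_campaigns in (1, 2, 4, 8, 16):
    per_campaign = MONTHLY_CONVERSIONS / num_campaigns
    status = "learning" if per_campaign >= LEARNING_THRESHOLD else "starved"
    print(f"{num_campaigns:>2} campaigns -> {per_campaign:>5.1f} conversions each ({status})")
```

One broad campaign pools all the signal; by the time the same account is sliced 16 ways, each campaign sees 15 conversions a month and none of them clears the (assumed) threshold.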

Shaken To The Core

In late 2024, Google announced (confirmed, more like) that it would be making more core updates per year.

What’s more, these core updates – so named because they substantially change search rankings – are increasingly volatile for publishers and SEO pros. In other words, the updates cause dramatic, unexpected swings in search traffic, usually downward. 

Google has always been somewhat mysterious about its core updates. It doesn’t explain in detail what’s changing or what publishers should do to adapt. Industry pros just sort of feel out which types of traffic are up or down and use their gut to determine what Google’s algo seems to prefer after a change. 

For jaded publishers, the emerging best practice is to simply absorb the hit and wait out the losses when a core update tanks their traffic. Making rapid, reactionary changes to try to regain former rankings can muddy the waters. And it’s not as if Google’s algo will take heed anyway. 

“You can’t just quickly fix and recover,” SEO consultant Jason Hennessey tells Digiday. “People go into panic mode and think there’s a quick fix but there’s not.”

Wiki’s Woes

Everyone’s copying from Wikipedia, even the generative AI machines. 

Increased activity from generative AI crawlers is straining Wikipedia’s web infrastructure, Wikimedia claims.

Wikipedia is an obvious target for generative AI scraping. The same goes for Wikimedia Commons, which hosts 144 million openly licensed images, videos and other rich media files.

Since January 2024, Wikimedia’s bandwidth usage for multimedia downloads has grown 50%. Most of that traffic comes from automated scraping of its image database.

As a result, Wikipedia has less available bandwidth to support traffic surges during high-profile news events.

“The load on our infrastructure is not sustainable and puts human access to knowledge at risk,” Wikimedia wrote in its annual plan for 2025. It also noted the difficulty of identifying bots “in the world of unauthenticated traffic.”

To make matters worse, two-thirds of Wikimedia’s most expensive traffic comes from bots, which bulk-read obscure pages that humans rarely visit. Because those pages aren’t sitting in caching servers near users, each request must be retrieved from Wikimedia’s core data centers, which carries higher costs. 
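To make the cost asymmetry concrete, here’s a toy model in Python. The hit rates and relative costs are invented for illustration – they’re not Wikimedia’s actual numbers – but they show the shape of the problem: human traffic mostly lands on popular pages already sitting in nearby caches, while crawlers sweep the long tail and fall through to the core:

```python
# Toy cache-cost model. All numbers are illustrative assumptions,
# not Wikimedia's actual traffic mix or serving costs.

EDGE_HIT_COST = 1    # relative cost of serving a page from a nearby cache
CORE_MISS_COST = 10  # relative cost of fetching it from core servers

def serving_cost(requests: int, cache_hit_rate: float) -> float:
    """Total relative cost for a stream of requests at a given hit rate."""
    hits = requests * cache_hit_rate
    misses = requests * (1 - cache_hit_rate)
    return hits * EDGE_HIT_COST + misses * CORE_MISS_COST

# Humans mostly read popular (cached) pages; crawlers sweep the long tail.
human_cost = serving_cost(requests=1_000_000, cache_hit_rate=0.95)
bot_cost = serving_cost(requests=500_000, cache_hit_rate=0.20)

print(f"human traffic cost: {human_cost:,.0f}")
print(f"bot traffic cost:   {bot_cost:,.0f}")
print(f"bot share of cost:  {bot_cost / (human_cost + bot_cost):.0%}")
```

In this made-up mix, bots account for a third of the requests but roughly three-quarters of the serving cost – the same flavor of imbalance Wikimedia is describing.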

Wikipedia isn’t the only publisher dealing with this burden. But generative AI’s ravenous appetite must be sated.

But Wait! There’s More

Amazon has put forward an offer to buy TikTok US. [NYT]

DoubleVerify will invest in FirstPartyCapital, a fund that itself invests in ad tech and mar tech startups. [release]

Tumblr is seeing an uptick in young users who are fleeing X (yucky, in their view) and the lifestyle influencer feeds of Instagram and TikTok. [Business Insider]

AI search is the new arms race for retailers. [The Information]

You’re Hired!

Pixability hires Maria McCarthy as SVP of sales for the Midwest and Western regions. [release]

Thanks for reading AdExchanger’s daily news round-up… Want it by email? Sign up here.

Must Read

6 (More) AI Startups Worth Watching

The founders of six AI startups offer insights on the founding journey and what problems their companies are solving.

Nielsen and Roku Renew Their Vows By Sharing Even More Data With Each Other

Roku’s streaming data will now be integrated into Nielsen’s campaign measurement and outcome tools, the two companies announced on Monday.

Broadcast Radio Is Now Available Through DSPs

Viant struck a deal with iHeartMedia and its Triton Digital advertising platform that will make iHeart’s broadcast radio inventory available through Viant’s DSP.


Lionsgate Enters The Ads Biz With An Exclusive Ad Server

The film and TV studio Lionsgate has chosen Comcast’s FreeWheel as its exclusive ad server to help manage and sell the growing volume of ad inventory Lionsgate creates with new FAST channels.

Layoffs

The Trade Desk Lays Off Staff One Year After Its Last Major Reorg

The Trade Desk is cutting its workforce, a company spokesperson confirmed to AdExchanger. The layoffs affect less than 1% of the company.

A Co-Founder Of DraftKings Wants To Help Creators Monetize Content

One of the DraftKings founders now leads HardScope, parent of FaZe Clan, aiming to bring FaZe’s content and distribution magic to creators beyond gaming.