The biggest privacy-related story of 2024 is actually about something that didn’t happen.
To the shock of some and the knowing, somewhat smug told-ya-so nods of others, Google announced over the summer that it was scrapping its longstanding plan to deprecate third-party cookies in Chrome.
Instead of getting rid of them, Google will now introduce what Anthony Chavez, VP of the Privacy Sandbox, referred to in a July blog post as “a new experience.” A new experience? How exciting.
The mechanism will allow people to “make an informed choice that applies across their web browsing history,” Chavez wrote.
Development of the Chrome Privacy Sandbox APIs, meanwhile, proceeds apace.
But if the steep reduction in the number of cookie-related pitches in my inbox since July is any indication, the urgency among ad tech vendors, publishers and advertisers to test the Sandbox has virtually evaporated.
“Marketers relying on programmatic ad buying found a bit of a reprieve,” says Mike Froggatt, an analyst and senior director at Gartner.
But a reprieve is only a stay of execution. Signal loss is real.
Still, the Privacy Sandbox no longer dominates the headlines. I’m even considering turning off my Google Alert for “Privacy Sandbox” – and maybe creating one for “Andrew Ferguson.” He’s a current Republican FTC Commissioner and President-elect Donald Trump’s pick to replace Lina Khan as chair.
But cookies aside – and don’t forget to leave a few real ones out for Santa – there were lots of other big privacy developments in 2024, which we predict will continue to be important storylines in 2025. Here are some of the highlights.
PET projects
Over the past few years, privacy-enhancing technologies have steadily moved from the theoretical to the practical.
All of the large ad platforms have invested in PETs; the IAB Tech Lab has a working group dedicated to PETs, and all of the APIs in the Chrome Privacy Sandbox are built on different PETs.
I’m sure the readers of this newsletter are very familiar with the concept, but it’s worth defining terms.
A PET is a solution that can accomplish complex data-processing functions without revealing individual-, household- or user-level personal information to unauthorized parties. Put simply, it’s technology designed to be privacy-enhancing from the start rather than jury-rigged with protections after the fact.
One example is differential privacy, which adds carefully calibrated random “noise” to a data set to hide individual data points. Another is secure multi-party computation, a cryptographic technique that allows multiple parties to jointly analyze data while keeping their inputs private. It’s what most data clean rooms are built on.
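For readers who like seeing the mechanics, here’s a minimal, illustrative Python sketch of both ideas — not a production DP or MPC library, and the function names (`dp_count`, `share`, `reconstruct`) are made up for this example. The first part masks a count query with Laplace noise; the second splits private values into random additive shares so only the sum can be recovered.

```python
import math
import random

# --- Differential privacy (illustrative sketch) ---
def laplace_noise(scale: float) -> float:
    """Sample Laplace(0, scale) noise via inverse-CDF sampling."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))

def dp_count(records, predicate, epsilon: float) -> float:
    """Count matching records with epsilon-DP.

    A count query has sensitivity 1 (adding or removing one person
    changes the count by at most 1), so the noise scale is 1/epsilon.
    """
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1.0 / epsilon)

# --- Secure multi-party computation via additive secret sharing ---
PRIME = 2**61 - 1  # all arithmetic is done modulo a large prime

def share(value: int, n_parties: int):
    """Split a private value into n random shares that sum to the value."""
    shares = [random.randrange(PRIME) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % PRIME)
    return shares

def reconstruct(shares):
    """Recombine shares; individually each share looks like random noise."""
    return sum(shares) % PRIME
```

The point of the secret-sharing half: each party learns only meaningless random shares of the others’ data, yet summing everyone’s shares yields the correct joint total — the aggregate is computed without any input ever being revealed.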
PETs will only become more mainstream in the years ahead, especially as regulators continue scrutinizing and pressuring the digital advertising industry.
But it’s also important to remember this: PETs are not a panacea.
The Federal Trade Commission recently made that point very clear.
Not-so-clean rooms
In a blog post published in November, the FTC warned that despite their potential benefits, data clean rooms don’t automatically preserve privacy.
As a matter of fact, a “close examination of DCRs” – I refuse to acknowledge the acronym, but this is a quote, so let us proceed – “yields an evergreen lesson: even if privacy-enhancing technologies alone can’t protect privacy and even if they address some privacy risks, they can contribute to others.”
For example, if a data clean room isn’t configured properly, it could set the stage for unauthorized data sharing and increase the risk of leaks and breaches.
Meanwhile, all too often, “data that enters a clean room is dirty,” as in “unconsented, inaccurate, unreliable and perhaps even unlawful,” says Jamie Barnard, CEO of privacy compliance software startup Compliant.
“If there is too much trust, too little verification and careless controls, there is a serious risk of cross-contamination and leakage,” Barnard says. “Moreover, if this all happens inside a black box, the parties may be oblivious to the harm until it’s too late.”
The FTC, however, is not oblivious and doesn’t take a company’s privacy claims at face value.
As agency staff put it in the blog post: “Liability for violations of the FTC Act isn’t magically mitigated by clever technology.”
Them’s fightin’ words – but will that fight continue?
Trading places
With a Republican majority under Trump, the FTC is set to shift its focus and enforcement priorities.
The commission will still actively pursue consumer and privacy protection, because that’s its mandate. But expect less rulemaking, more sympathy for the business community and a move away from pejorative terminology like “commercial surveillance” and “surveillance advertising.”
Still, it’s hard to say exactly what we’ll see in terms of privacy enforcement over the next four years.
In his current role as a commissioner, Andrew Ferguson, Trump’s nominee for chair, voted in at least partial support of every privacy-related action since he joined in April, including the release of the FTC’s staff report examining the data practices of social media and video streaming services and the FTC’s settlements with Gravy Analytics and Mobilewalla.
But in his pitch to Trump to serve as chair – the memo was leaked and obtained by Punchbowl News – Ferguson wrote that under his charge, the agency would “stop abusing [its] enforcement authorities as a substitute for comprehensive privacy legislation.”
So I guess we’ll just have to wait and see. 🤷‍♀️
🙏 Thanks for reading! The most consistent feedback I got this year is that people like the cat videos. I rarely get comments on the articles themselves, just reactions to the cat stuff. And I’m okay with that. It hopefully means people are reading (or at least scrolling) all the way to the bottom. 😹
This is the last issue of 2024, so let me wish you all a happy New Year, and I hope you have something fun planned. As always, feel free to drop me a line at allison@adexchanger.com.