
What Regulators Talk About When They Talk About Ad Tech


If you want to know what privacy regulators think about online advertising, it’s not a mystery. Just listen to what they’re saying.

Federal policymakers, state attorneys general and California’s new privacy watchdog are all hammering the same points: protect kids, honor opt-outs, back up your privacy promises, stop collecting more data than you need and don’t make it a hassle for people to exercise their privacy rights.

“The expectation is that consumers shouldn’t have to jump through a bunch of hoops,” said Tom Kemp, executive director of the California Privacy Protection Agency, speaking during the IAB’s Public Policy & Legal Summit in Washington, DC, last week.

“The guidance I have for any business,” he added, “is to walk a mile in the shoes of a consumer.”

No kidding

And when that consumer is a kid or a teen, regulators are intolerant of any ambiguity. Children’s privacy is where they’re drawing some of their brightest lines – and they’re increasingly skeptical of companies that insist they “don’t know” when minors are using their products.

That excuse is wearing thin, said Delaware Deputy Attorney General John Eakins.

Eakins and his team will ask companies outright, “Do you process kids’ or teens’ data?” and the answer is always, “Oh no, of course not.”

“Yet we know for a fact that the people using [their product or service] are actually kids and teens,” he said.

It’s implausible for companies to claim they lack “actual knowledge” of who’s on their platform – the trigger for children’s privacy obligations under COPPA – when their business model depends on selling hyper-specific audiences, Eakins said, like “an adult who’s between 25 and 30 years old and likes to read The New York Times and enjoys having a prime steak.”

You can’t have it both ways.

“Putting your head in the sand is just not going to stand up anymore,” Eakins said. “I’m not sure what the case will be that breaks through, but, from our perspective, it’s going to bring a lot of inquiry.”

When someone opts out, believe them

One area where there’s been more than just inquiry is how companies handle opt-outs – especially the Global Privacy Control (GPC), a universal opt-out mechanism that lets consumers automatically opt out of data sales and sharing across sites without having to make individual requests to multiple controllers.

Although some, like privacy attorney Alan Chapell, warn that the GPC could have anticompetitive side effects if browsers turn it on by default, Kemp described the signal as “incredibly important” because it enables “privacy at scale” without the maddening task of having to opt out of tracking one company at a time.

“Oftentimes, exercising your privacy rights is a never-ending set of chores,” Kemp said.

He also made it clear that honoring the GPC isn’t a side note; it’s the through line.

The majority of California’s recent enforcement actions, including Disney (a $2.75 million fine), Healthline ($1.55 million), Tractor Supply ($1.35 million) and Jam City ($1.4 million), involved opt-out failures, including ignoring GPC signals, misconfigured opt-outs and inadequate consumer choice mechanisms.

From here, the pressure will only increase.

In February, the California Privacy Protection Agency launched an Audits Division and hired a chief privacy auditor, Sabrina Ross, to lead it. Before coming to the CPPA, Ross spent nearly six years as the privacy and AI policy director at Meta and was the global head of policy at Uber before that.

The division plans to conduct technical tests in the wild to make sure that the GPC and other preference signals are being honored in live environments and not just on paper.
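Those in-the-wild tests are straightforward to reason about because the signal itself is simple: per the Global Privacy Control specification, participating browsers attach a `Sec-GPC: 1` HTTP header to requests (and expose `navigator.globalPrivacyControl` to page scripts). A minimal sketch of the server-side check auditors would expect a site to perform might look like this (the function name and request dictionary are illustrative, not from any regulator’s tooling):

```python
def gpc_opt_out(headers: dict) -> bool:
    """Return True if the request carries a valid Global Privacy Control
    signal, i.e. the browser sent the header "Sec-GPC: 1".

    A site honoring GPC should treat such a request as an opt-out of the
    sale or sharing of that user's personal data.
    """
    return headers.get("Sec-GPC", "").strip() == "1"


# Illustrative request headers from a GPC-enabled browser
request_headers = {"Sec-GPC": "1", "User-Agent": "ExampleBrowser/1.0"}

if gpc_opt_out(request_headers):
    # Suppress data sale/sharing downstream for this user,
    # just as if they had clicked a "Do Not Sell" link.
    opted_out = True
```

The point the CPPA keeps making is that this check has to actually gate downstream data flows, not merely exist; a signal that is parsed but ignored is exactly what live audits are designed to catch.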

No love for mixed messages

But getting the mechanics right is only part of the job.

Companies also need to make sure their data practices line up with what they’ve told people in their privacy policies. Offering privacy controls doesn’t mean much if, behind the scenes, you’re collecting, sharing or retaining data in ways that contradict those promises.

The Federal Trade Commission’s recent case against dating app OkCupid and its parent company, Match, is a good example of what happens when promises and practices drift apart.

Although OkCupid’s privacy policy described limits on who could access user profile photos and other personal information, the company shared that data with a third party anyway. There was no fine, but OkCupid and Match are now subject to 10 years of compliance reporting and privacy monitoring.

“If you make privacy promises to consumers, you’ve got to hold the line on those,” said Ben Wiseman, an associate director of the FTC’s Division of Privacy & Identity Protection. “Don’t make promises that you’re not going to be able to keep.”

Portion control

And those promises don’t just cover where data goes, but also how much data companies gather and how long they retain it.

Most new state privacy laws include a data minimization provision that says companies should only collect personal data that’s adequate, relevant and reasonably necessary for the purposes they’ve disclosed to people.


Which is just a wordy way of saying: Be transparent.

“Take a look at what your privacy policy is saying,” said Kashif Chand, chief deputy attorney general in the Data Privacy & Cybersecurity Section of the New Jersey AG’s office.

If a privacy policy is vague and doesn’t clearly state what a company is doing with data, then people have no hope of understanding how the data is being minimized. “They both flow together,” Chand said.

And speaking of flows, Delaware Deputy AG Eakins stressed that the minimization obligation doesn’t stop with the first party; it has to travel with the data as it moves through the ad tech supply chain, and there also needs to be a contract to prove it.

“When we’re talking about a long value chain of people passing personal data, what have you contracted?” Eakins said. “Are you making sure that the person you’re handing that data off to is also going to be bound by those same terms?”

Tools like the IAB’s Multi-State Privacy Agreement aim to make that process easier by baking data minimization and purpose limitation language into contracts across the ad tech ecosystem.

But even with standardized contracts, the target keeps moving. The standard for data minimization isn’t consistent from state to state, which is a challenge for controllers, acknowledged Chandler Crenshaw, assistant attorney general and consumer privacy unit manager in the Virginia AG’s office.

Maryland, for example, requires that collection be “strictly necessary,” while neighboring Virginia – just across the Potomac River, Crenshaw quipped – uses a “reasonable” test.

“If we had a signal from Congress,” he said, “this is one of the areas where I think it would be good for a decision to be made.”
