Every era of advertising has been defined by the moment its infrastructure became the bottleneck.
In the late 1990s, the bottleneck was the fax machine. Media buyers negotiated insertion orders over phone calls, signed paper contracts and waited days to confirm a campaign was live. The industry did not abandon that process because it was bad; it abandoned it because the internet created a volume of inventory that human-speed negotiation could no longer serve.
Programmatic auctions emerged not as an upgrade, but as an evolutionary necessity—the only architecture that could match the velocity of digital supply.
We are at a similar inflection point again. This time, the bottleneck is the audience segment.
The architecture that got us here
Give Boolean segment targeting and its yes/no binary logic its due. For the better part of a decade, it was the right tool for the environment it inhabited.
A data provider would model an audience – e.g., “auto intenders” – package it as a segment ID and ship it to a brand. The brand would load that ID into a DSP. The DSP would match it against an impression and bid accordingly.
One partner, one buyer, one signal, one handshake. The bilateral model was elegant because the ecosystem was simple enough to be bilateral: If the audience profile fits the campaign strategy, then bid on the impression; if it doesn’t, don’t.
But evolution does not ask permission. Publishers began contributing real-time contextual signals. Measurement companies layered in exposure data. CRM platforms pushed first-party purchase history into the bidstream.
And now, autonomous AI agents are entering the decisioning layer as primary orchestrators that evaluate 10 million impressions per second.
The segment was built for a world where a human could define the logic on a Tuesday and let it run for a quarter. The agents operate in a world where logic must be composed in four milliseconds, from signals that have never been combined before, for an impression that will never exist again.
The combinatorial wall
In a bilateral model, you can precalculate your audience. The segment is a flat file. You build it; you ship it; it works.
Now imagine an agent that needs to fuse signals from a publisher’s contextual feed, a measurement company’s exposure graph, a weather API, an auto configurator’s behavioral intent and a retailer’s purchase data – all at impression time, all for a single bid.
With just 100 data providers, the number of possible segment combinations is 2¹⁰⁰ – roughly 1.3 × 10³⁰. Merely enumerating them at a billion per second would take thousands of times the age of the universe. You cannot prebuild those intersections. You cannot store them. You cannot retrieve them in time.
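The arithmetic behind the wall is simple. A back-of-envelope sketch, with the enumeration rate chosen generously as an assumption:

```python
# Back-of-envelope: why precomputing segment intersections hits a wall.
# Assumes 100 independent data providers, each signal either present or absent.

providers = 100
combinations = 2 ** providers            # every possible subset of signals

# Even enumerating (never mind storing) one combination per nanosecond:
rate_per_second = 1_000_000_000          # assumed enumeration rate
seconds_needed = combinations / rate_per_second
age_of_universe_seconds = 4.35e17        # ~13.8 billion years

print(f"{combinations:.3e} combinations")
print(f"{seconds_needed / age_of_universe_seconds:.1e}x the age of the universe")
```

And that is before counting the storage and retrieval those intersections would also require.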
This is not a technology failure you can solve with faster servers. It is a structural limit – the point where human-defined Boolean targeting logic can no longer keep pace with the dimensionality of the machine. The combinatorial wall is not a glitch; it is a deadline.
Closing the blind spot
Inside every segment, there is a quieter failure costing buyers real dollars on every impression.
Consider two individuals, both labeled “auto intender.” Person X configured a vehicle online 72 hours ago, returned to the dealership site twice and was exposed to a competitor’s connected TV campaign last night. Person Y browsed a single automotive article three weeks ago. Under current DSP targeting logic they are identical, because the decision is a simple binary: in the segment or not. Same segment. Same bid. Same five dollars.
But they are not the same, and binary decisioning cannot account for the nuances. There is no encoding for how much, how recently or in which direction. Every invisible gradient is revenue a buyer will never capture and a seller will never earn.
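The failure mode is easy to see in code. A minimal sketch – not any DSP’s actual logic, with a hypothetical segment name – of how Boolean membership flattens both users to one price:

```python
# Illustrative sketch: Boolean segment targeting collapses very
# different users into the same flat bid.

def boolean_bid(user_segments: set, target: str, bid: float) -> float:
    """Bid a flat price if the user carries the segment ID; otherwise pass."""
    return bid if target in user_segments else 0.0

person_x = {"auto_intender"}   # configured a car 72 hours ago, returned twice
person_y = {"auto_intender"}   # read one automotive article three weeks ago

print(boolean_bid(person_x, "auto_intender", 5.0))
print(boolean_bid(person_y, "auto_intender", 5.0))
```

Both calls return the same five dollars; recency, intensity and competitive exposure never enter the function.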
There is, however, a way to measure relevance as a spectrum, not a switch.
User Context Protocol (UCP), often referred to as Agentic Audiences, replaces the segment ID with a dense, self-contained embedding: roughly four kilobytes of compressed mathematical meaning riding the same SSP-to-DSP infrastructure we already operate.
The publisher contributes a context vector. The data provider contributes an intent vector. The measurement company contributes an exposure vector. At impression time, these compose into a single point in high-dimensional space – a live, continuous portrait of the moment.
The buyer’s agent trains on its own conversion data to learn an “ideal outcome” vector. The bid decision becomes a dot product: How similar is this impression’s composite vector to my ideal? High similarity, high bid. Low similarity, pass. Millisecond decision-making. No combinatorial explosion. No segment taxonomy. Just math.
Person X’s vector encodes relevance, intensity, recency and competitive exposure. Person Y’s encodes a fading signal three weeks stale. The agent bids fourteen dollars on Person X and four on Person Y.
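Compressed to toy scale, the decision above can be sketched as follows. Everything here is invented for illustration – real embeddings are opaque high-dimensional floats, not four labeled features, and the base CPM is an assumed scaling constant:

```python
# Illustrative sketch (not UCP's actual format): a dot-product bid decision
# on tiny 4-dimensional vectors standing in for ~4 KB embeddings.
# Hypothetical dimensions: [relevance, intensity, recency, competitive_exposure].

def dot(a, b):
    """Similarity between an impression's composite vector and the ideal."""
    return sum(x * y for x, y in zip(a, b))

# "Ideal outcome" vector, learned from the buyer's own conversion data
# (hypothetical values).
ideal_outcome = [0.9, 0.8, 0.9, 0.7]

person_x = [0.95, 0.90, 0.98, 0.80]  # configured a car, returned twice, saw rival CTV ad
person_y = [0.40, 0.10, 0.05, 0.00]  # one article, three weeks stale

base_cpm = 5.0  # assumed price per unit of similarity
bids = {name: round(base_cpm * dot(vec, ideal_outcome), 2)
        for name, vec in [("X", person_x), ("Y", person_y)]}
print(bids)
```

High similarity yields a high bid; a stale, weak vector yields a low one – the gradient a Boolean segment cannot express.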
The format shift
The fax machine was not replaced by a better fax machine. The insertion order was not replaced by a faster insertion order. And the segment will not be replaced by a better segment. It will be replaced by a format native to the machines that now make the decisions.
The insertion order had its era. The segment had its era. The vector’s era is beginning. Evolution, as always, is not optional.
“Data-Driven Thinking” is written by members of the media community and contains fresh ideas on the digital revolution in media.
Follow Evgeny Popov and AdExchanger on LinkedIn.
