"Data-Driven Thinking" is written by members of the media community and contains fresh ideas on the digital revolution in media.
Today’s column is written by Mark Shedletsky, founder and CEO at Vertical Mass.
Many consumers clamor to get their hands on the most sought-after brands – lining up for days for Apple’s new iPhone, spending thousands of dollars on concert tickets – before giddily sharing the experience on social media.
But bad actors regularly knock off these premium brands with the goal of profiting from their hard-earned audience. These counterfeiters haven’t paid for the name or the design; they’ve stolen it and packaged it as the real product, when it’s far from it.
This isn’t just a problem for physical goods. Audience data is also falling into the wrong hands and being used for retargeting, unbeknownst to consumers. The General Data Protection Regulation (GDPR) and fallout from the Facebook/Cambridge Analytica scandal have put consumer privacy squarely in the spotlight. However, that’s only part of the larger issue, which also includes the less-discussed concept of brand privacy.
Brands spend billions of dollars to cultivate one-to-one relationships with consumers and create audience profiles in exchange for discounts and other rewards. There’s a give and take here, with a direct relationship between the data provided with consent and the benefit received.
Where things have gone awry, however, is with the gaggle of third parties accessing and using that data without permission. Data companies gain access to publisher and brand audiences through mobile analytics SDKs, social media scraping, programmatic ad serving and more.
Apps may request access to the microphone or the location of our device even when it seems irrelevant to the casual game being played or the service being downloaded. Often the real purpose is to pirate a consent-based relationship between a consumer and a brand. For instance, microphones may be deployed to help ad tech platforms identify audience signals in your living room and sell retargeting opportunities.
These kinds of practices have created a growing need for a transparency framework, such as the one the IAB recently introduced, that audits where data comes from and helps media buyers trust what they’re getting and from whom they’re getting it. It’s a potential fix for a universal intellectual property issue that’s at the crux of the need for greater brand privacy.
This data is a brand’s intellectual property, similar to its logo or product marks. A mobile location company can’t just collect data on audiences at NBA games and then sell it as “NBA fans” audience data. The NBA hasn’t authorized the collection for an audience that it cultivated and from which the third party is trying to derive monetary value. If the mobile location company sold it as a “basketball interest audience,” that would create fewer issues. But packaged as the NBA’s audience, the NBA and its official partners are the only ones with the right to its value.
GDPR and measures like the IAB’s transparency framework are positive steps toward better brand privacy. Brands must also take steps to understand the terms of service of every company that has placed a pixel on their websites, which is an easy way to separate the trusted partners from the knock-off purveyors. There are also tools available to determine who else might be monetizing a brand’s data in the open marketplace.
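As a first pass at this kind of audit, a brand can simply inventory which third-party hosts are loading pixels and scripts on its own pages. Below is a minimal, illustrative sketch in Python using only the standard library; the domains are hypothetical, and a real audit would also need to inspect dynamically injected tags and the vendors’ terms of service.

```python
from html.parser import HTMLParser
from urllib.parse import urlparse


class PixelAuditor(HTMLParser):
    """Collects hosts of third-party scripts, images and iframes on a page."""

    def __init__(self, own_domain):
        super().__init__()
        self.own_domain = own_domain
        self.third_party_hosts = set()

    def handle_starttag(self, tag, attrs):
        # Tracking pixels are typically 1x1 <img> tags; vendors also load
        # <script> and <iframe> resources from their own domains.
        if tag in ("script", "img", "iframe"):
            src = dict(attrs).get("src") or ""
            host = urlparse(src).netloc
            if host and not host.endswith(self.own_domain):
                self.third_party_hosts.add(host)


# Hypothetical page source for a brand whose own domain is brand.example
page = """
<html><body>
  <img src="https://cdn.brand.example/logo.png">
  <img src="https://tracker.adnetwork.example/pixel.gif?uid=123" width="1" height="1">
  <script src="https://analytics.thirdparty.example/collect.js"></script>
</body></html>
"""

auditor = PixelAuditor("brand.example")
auditor.feed(page)
for host in sorted(auditor.third_party_hosts):
    print(host)
```

Each host this prints is a vendor whose data-collection terms the brand should be able to account for; any unfamiliar name is a candidate knock-off purveyor.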
Any fix to the current digital ad system must hinge on control over where data comes from and where it goes, as well as control over the direct relationship between brands and audiences. All parties stand to benefit from this greater control over what they own (data), creating a more sustainable and mutually advantageous future for the industry.
It would be wise for brands to review their audience data buying practices to determine whether they are buying official or knock-off audience data. Or maybe the brand is the one being knocked off.
Fostering an environment based on real audience data, direct from the source with permission, respects the privacy of both brands and consumers and keeps everyone’s data a whole lot safer.