Data Ethics: Permission Is Not The Problem

“Data-Driven Thinking” is written by members of the media community and contains fresh ideas on the digital revolution in media.

Today’s column is written by Tom Weiss, chief technology officer and chief data scientist at Dativa.

The recent uproar caused by the Facebook-Cambridge Analytica scandal has made the ethical use of data one of the big news stories of 2018. It has also cast something of a shadow over targeted advertising.

The value of targeted advertising is clear to both consumers and the industry, and we should promote positive examples to remind people of its benefits. When I shared on Facebook that I was drinking wine to celebrate my daughter’s latest achievement, Facebook targeted me with ads for wine merchants. I wasn’t surprised; I was intrigued. When I tried one product, it was great. If you want to target me with ads for high-quality products that interest me, count me in.

That has to be better than seeing and hearing ads for products and services that don’t interest me. But if these same techniques are used to sell shoddy products, trick vulnerable people or spread political misinformation, then we should all be opposed.

The real problem with targeted advertising is not that it is based on sensitive data, but that only the people targeted by an ad can see it, which makes it extremely hard to hold advertisers to account. Imagine apartment-rental ads aimed exclusively at white people. Such ads are illegal, but how can we enforce the law if nobody knows a crime has been committed? A white person who sees the ad has no way of knowing that the advertiser is deliberately excluding black people.

So how do we regulate advertising when it’s no longer carried by a single mass-market medium? Most data legislation focuses on consent, as we saw with the arrival of the General Data Protection Regulation (GDPR) last month, but permission is not the only issue. People will click yes to the terms and conditions of their new phones and TVs. Clicking alone isn’t accountability, and while GDPR has raised the profile of the problem, it doesn’t address the core issue of responsibility. What matters is the content of the messages and the groups advertisers target, not merely that data is being gathered and used to target people.

The focus on consent, although important, is ultimately misguided. The problem is the messages themselves. I believe the sell side must provide easy access to the database of creative used by advertisers in targeted ads on its platform. Facebook has created a searchable archive of political ads, but critically, it does not provide information on the people being targeted by those ads. The archive might address the issue of political influence by foreign actors, but it doesn’t solve the broader problem of discriminatory targeting.

Any such database of creative needs to be open so consumers can see not only the messages that they and others are shown but also who is being targeted. For example, advertisers might disclose that a back-to-work campaign targets people identified by their data as unemployed. If the same people are also being targeted with payday loans, we know there is a problem. Such an approach would provide at least a start toward transparency on both the buy and sell sides.
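The kind of open archive proposed here could be as simple as a queryable table pairing each creative with its targeting criteria. A minimal sketch in Python, purely illustrative: every name, record and field below is a hypothetical example, not any platform’s actual schema or data.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class AdRecord:
    advertiser: str            # who ran the ad
    creative: str              # the message shown
    segments: frozenset       # audience attributes the ad was targeted at

# Hypothetical open archive: every targeted ad alongside its audience criteria.
archive = [
    AdRecord("JobsBoard", "Back-to-work training, free enrollment",
             frozenset({"unemployed"})),
    AdRecord("QuickCash", "Payday loans approved in minutes",
             frozenset({"unemployed"})),
    AdRecord("WineShop", "Celebrate with our reserve selection",
             frozenset({"wine_interest"})),
]

def ads_targeting(segment):
    """Let anyone query which messages are aimed at a given group."""
    return [ad for ad in archive if segment in ad.segments]

# A regulator or watchdog can now see that the same vulnerable segment
# receives both a back-to-work campaign and payday-loan offers.
for ad in ads_targeting("unemployed"):
    print(ad.advertiser, "->", ad.creative)
```

The point of the sketch is that once targeting criteria are recorded next to the creative, the discriminatory patterns described in this column become a simple query rather than something only the targeted individuals could ever witness.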

This transparency would open bad actors to immediate criticism. In Europe, pricing car insurance based on gender is illegal because it’s considered gender discrimination, even though women have fewer accidents. Yet insurers can run online campaigns offering women discount codes that are never shown to men, because no one knows they are doing it. Merely seeing that discounts are offered on Facebook – or other platforms – is not enough. We need to know whom they are targeted at.

A central database of targeted advertising messages would not only provide full transparency across the industry but also create a new revenue stream for the platforms, since most brands would be willing to pay for competitive information. At the moment, legislation is going down the route of shutting down data sharing, and this has the potential of damaging both the industry and the consumer experience.

It’s time for advertisers and platforms to fight back with a new level of transparency. It took time to build this same consumer trust in mass-market media, and we need to ensure that targeted advertising is not legislated out of existence before it reaches maturity. The sooner the shoddy practices of some are exposed by a transparent system everybody can trust, the better for the entire industry.

Follow Dativa (@Dativa4Data) and AdExchanger (@adexchanger) on Twitter.
