“Data-Driven Thinking” is written by members of the media community and contains fresh ideas on the digital revolution in media.
Today’s column is written by Sara M. Watson, a technology critic and fellow at the Berkman Center for Internet and Society at Harvard University.
I have owned a truck and am interested in buying another, according to my Acxiom aboutthedata.com profile.
Neither “fact,” however, is true. Acxiom doesn’t share where this data comes from. I can only guess that Acxiom draws this conclusion because DMV records show my father’s Ford truck registered in the 1990s and my profile mistakenly lists my parents’ home address.
Would I fall into the “Truckin’ & Stylin’” Personicx consumer segment developed to describe me to potential advertisers? My aboutthedata.com profile shows me what Acxiom knows about me, but not what it means or what effect it has on the advertisements I see.
The FTC data broker report released at the end of May argues that data brokers need to do more to make data sources, data profiles and data uses transparent to consumers. The report acknowledges Acxiom’s efforts in leading the industry by building the first interface for consumers to review and revise their data. More in the industry are likely to follow, whether to get ahead of future legislation or because they are mandated to later on.
But this doesn’t go far enough.
Sensitive Information
The FTC’s definition of “sensitive information” is far too vague. So is the definition that industry leaders use.
“Data regarding personal information that pertains to employment or insurability decisions, or that relates to sensitive health-related issues or confidential matters deserves much different treatment than data that would indicate that I am a sports fan,” Scott Howe of Acxiom writes.
Some information is intuitively sensitive. But is driving a truck sensitive? I can imagine my proclivity for trucks being used both to target advertising to me and to factor into risk-based calculations or inferred political leanings. Big data methods promise to surface novel correlations by matching up disparate data. That means innocuous-seeming data, like sports fandom, quickly becomes sensitive when used as a proxy toward unintuitive ends. It is unfair to expect consumers to understand the far-reaching effects of such potential correlations.
Meaningfully Transparent
It is not enough to make the data transparent to users. For data to be meaningfully transparent, we need to be able to understand how it is used. Profiles filled with details don’t tell us what those details mean to advertisers or underwriters.
Right now I can opt out of data brokers using my interest in trucks for marketing, but I have no granular control over appropriate and inappropriate uses of that data point. Without a clear indication of the value of sharing that detail, or the risk of it being used toward sensitive applications, data brokers force risk-averse consumers to assume the worst and settle for the blunt binary of opting out.
The FTC report has been criticized within the industry for not doing a good job of outlining the harms to consumers. I agree, but I suspect that it is because most of the time, these harms are invisible to consumers.
“If a consumer is denied the ability to conclude a transaction based on an error in a risk mitigation product, the consumer can be harmed without knowing why,” the report says.
Even when data is exposed to consumers, it’s nearly impossible to interpret its potential uses. The use of data remains a black box.
I argue that for data brokers and the FTC to build something meaningfully transparent to consumers, data needs to be explained in terms of its uses, not just its sources, collection and static representation.
Accountability For Data Uses
There is also a sense in the industry that the FTC perhaps misplaced its focus on data brokers, rather than on the customers of those data brokers. However, data brokers are where the FTC, as a regulator, potentially has the most concentrated impact in protecting against the misuses and harms that arise from inappropriate uses of data. Ideally, future models for regulation would hold both data brokers and their customers accountable for their uses of data.
Holding data brokers accountable for the practices of their customers is difficult, but some data brokers are starting to scrutinize the uses of the data they peddle. The FTC report briefly discusses best practices in which data brokers “seed” their data to catch reselling and misuse in secondary and tertiary markets. Marketing data should not be used for insurance and credit underwriting, for example, Howe argues.
Self-regulation is a good first step, especially in shaping what model regulation should look like. But self-regulation does not stop a few bad actors from taking advantage of vulnerable consumer populations, as we have seen with predatory lending. We need to put forth consumer protection levers like those developed under the Fair Credit Reporting Act.
The FTC report breaks out its recommendations for data brokers based on the variety of products they sell, from “marketing products” to “risk mitigation” and “people search products.” This organizing structure points to the importance of addressing each potential use of data in turn. It works as long as brokers self-identify their products and their customers, but it needs greater explanation and transparency to hold all parties accountable for the uses of data.
Involve Consumers
The FTC report sets up potential future legislation to address how data can and should be used, and the industry is jumping ahead in an effort to self-regulate. But nowhere in this conversation is a discussion with consumers about how to make these changes meaningful to individuals. If we miss that, the protections demanded by the FTC might leave consumers in the dark with a spreadsheet of meaningless data points and no indication of what the values add up to. The concessions data brokers make to reveal but not contextualize data will just confuse, rather than empower, consumers.
Industry and FTC efforts are making the data economy less obscure. Still, we need to start having conversations not just about the legality of data uses, but about their ethics. Those conversations need to involve consumers in order to reflect normative judgments about the appropriateness of data uses in context.
This goes further than asking how data can be used. Instead, this asks how data should be used. We’re in an exploratory phase about the potential of data right now. We have to start interrogating not just what is possible, but what is right. This is true not just for the data broker and advertising industries, but for all big data applications.
Follow Sara M. Watson (@smwat) and AdExchanger (@adexchanger) on Twitter.