The Federal Trade Commission (FTC) wants to know what marketers are doing with segmentation profiles like “urban scrambler” and “ethnic second city struggler.”
The potential for advertising segmentation to exacerbate inequality was a central topic at Monday’s FTC workshop, “Big Data: A Tool for Inclusion or Exclusion?” The Washington, D.C., event included representatives from advocacy groups like the American Civil Liberties Union (ACLU) on the consumer side and the National Retail Federation (NRF) on the marketers’ side, as well as data companies like Epsilon.
While the FTC has proposed legislation to address issues in data marketing, the workshop also served as a forum for discussing how the industry, especially data brokers, should regulate itself.
“There needs to be more accountability throughout the ecosystem,” FTC Commissioner Julie Brill told AdExchanger at the event. The FTC has already made legislative recommendations regarding data brokers and data security. Some are outlined in an FTC report, “Data Brokers: A Call for Transparency and Accountability,” released in May.
Some so-called “data brokers” (the definition of the term varies depending on whom you ask, though the FTC named nine in its report) have already heeded Brill’s call to allow consumers to see their files. Acxiom, for instance, has a consumer data look-up site called AboutTheData.com.
Brill, however, said she would like to see a common portal maintained by numerous data marketing companies, since most of the major players, like Acxiom or Epsilon, aren’t household names.
Brill asked for more transparency in the industry.
“I’m calling on data brokers themselves to say what their clients are doing with the profiles they’re creating,” she told AdExchanger. “How is ‘ethnic second city struggler’ being used? How is ‘urban scrambler’ being used? That’s where the rubber hits the road. We do need a lot more transparency to figure that out.”
But for many marketing data providers, additional transparency into how data brokers collect and use data is a no-go because it would amount to “giving up trade secrets,” said Epsilon’s chief privacy officer and general counsel, Jeanette Fitzgerald, at the panel.
Christopher Calabrese, legislative counsel for the ACLU, disagreed.
“I think we’re woefully inadequate regarding transparency right now,” Calabrese told Fitzgerald. He acknowledged that data collection practices are the industry’s “secret sauce,” but felt “individual consumers should be able to know what assumptions are being made about them.”
For instance, while Acxiom says it collects 3,000 data points about consumers, its AboutTheData website doesn’t reveal all of that information. “There are nowhere near that number of data points about me, or anything about how they’re being used,” Calabrese said. “Am I being grouped as an ‘urban scrambler?’ Am I vulnerable?”
Federal Regulation: Arguments For And Against
Marketing representatives bristled at the suggestion of more legislation. After all, consumer protection laws addressing some of these concerns, such as anti-discrimination laws and the Health Insurance Portability and Accountability Act, already exist. The Fair Credit Reporting Act (FCRA) gives consumers the right to know and correct information related to their credit score, and to check if they were denied a loan based on that information.
Additionally, the Civil Rights Act and the Americans with Disabilities Act provide protection from discrimination based on characteristics such as race, ethnicity and disability, noted Carol Miaskoff, an attorney for the Equal Employment Opportunity Commission.
Some in the crowd were baffled why legal protections needed to be extended into the marketing world.
“The FTC seems to be trying to extend traditional discrimination law to marketing. That’s crazy,” said Berin Szoka of the think tank TechFreedom, who attended the workshop. “Laws governing insurance, lending, employment, housing and so on focus on real harms, and the FTC should certainly enforce those … but imposing those laws on marketing would actually harm consumers.”
Others favored more regulation.
“From a statistical perspective if you have a 2-3% error rate [using some of these big data practices] – which is good, and people get tenure for – that means you’re wrong 6 million times [in the United States alone]. That’s a lot of people that your automated decision-making could be harming,” said Jeremy Gillula, staff technologist at the Electronic Frontier Foundation.
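Gillula’s figure is straightforward arithmetic: even a small error rate, applied across the entire US population, produces millions of misclassified people. A quick back-of-the-envelope check (the roughly 318 million population base is an assumption; he did not state which figure he used):

```python
# Rough check of the EFF's error-rate arithmetic.
# Assumed base: ~318 million (approximate 2014 US population).
US_POPULATION = 318_000_000

for error_rate in (0.02, 0.03):  # the quoted 2-3% range
    wrong = US_POPULATION * error_rate
    print(f"{error_rate:.0%} error rate -> {wrong / 1e6:.1f} million people misclassified")
```

At 2%, that works out to about 6.4 million people, in line with the “6 million” figure cited at the workshop.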
But Mallory Duncan, SVP and general counsel for the NRF, argued the stakes around marketing decisions are lower than the stakes around financial decisions: “Access to credit is a fundamental right,” he said. “Access to a high-end men’s fashion catalog is not.”
While retail catalogs are unlikely to be regulated by the FTC, other products, like credit card offers, fall into a murky area. Current laws about credit only apply to existing customers, not customer acquisition. Some panelists argued it could cause harm if people sorted into low-income segments only receive offers for high-interest, low-quality credit cards while those in wealthier segments receive choice offers.
In February, the FTC put the FCRA into action when it targeted Instant Checkmate, a type of data broker that had advertised itself to employers and landlords. The FTC argued this activity made it a consumer reporting agency despite its disclaimers otherwise. Instant Checkmate settled the case in April, paying a fine and agreeing to additional oversight.
Although the settlement did not address any targeted advertising done by Instant Checkmate, its ad choices did not go unnoticed by Latanya Sweeney, the FTC’s chief technologist, who referenced a study she conducted at Harvard University in 2013 before joining the FTC.
She showed screenshots that suggested that the company had changed the copy in its AdWords ads when the search terms included black names, like hers and others like “Jermaine” and “Ebony.” The copy would dynamically insert the Googled person’s name, and add “[name] – Arrested?” Sweeney argued this practice could potentially harm an individual’s employment prospects.
It’s unclear whether the targeting was intentional, or perhaps resulted from algorithms optimizing performance based on clicks.
Which brought up another point made by the opening speaker, Solon Barocas, a postdoctoral research associate at the Princeton University Center for Information Technology Policy. “Unintentional discrimination is far more likely to be occurring,” he said. “Data mining can inherit past prejudice and can also reflect current prejudice.”
The ACLU’s Calabrese echoed Barocas’ thoughts: “Data is not bad or good. Data is. It reflects existing disparities in our society.”
Will Self-Regulation Work?
If consumer segmentation practices unintentionally reinforce social inequality, marketers need to address this through best practices, Barocas and others argued.
But what self-regulation entails isn’t yet clear-cut.
“Companies need clarity on what risks they should be actively mitigating. Unless they have that, it’s hard to coalesce around a set of best practices,” said Michael Spadea, director of the advisory group Promontory Financial Group, on the panel.
Furthermore, data marketing providers think they’ve done enough.
“I think there is already a lot of self-regulation,” said Epsilon’s Fitzgerald. “There is the DMA [Direct Marketing Association], the IAB [Interactive Advertising Bureau]. The DMA will enforce those guidelines among members, or get non-members to act in an ethical manner – and then if there is still a problem, they will turn it over to the FTC.”
When the FTC observes demonstrable inequality, it will act in accordance with existing laws.
“As technologically advanced as we get, there are principles which exist in our law, like discrimination, and targeting vulnerable communities in ways that can harm them, that are fundamental,” Brill said. “We are going to be applying those principles even though technology is moving rapidly.”
For other areas where the FTC has highlighted concerns, “I’d like to see the industry take steps to address these issues before Congress acts or before we get involved,” Brill said.