Today, apps that want to request detailed user info go through a review process with Facebook in which developers are required to justify what they want to collect and why.
After the lid blew off the Cambridge Analytica story, Facebook hired a forensic auditor in the UK to examine Cambridge Analytica’s servers in London. The auditor was asked to leave the premises on Monday by Britain’s Information Commissioner’s Office, which is pursuing its own warrant to investigate the firm’s systems.
When AdExchanger asked Facebook if it has plans to audit other third parties it had previously told to delete data to make sure they actually did, a company rep pointed to a blog post by Paul Grewal, Facebook’s VP and deputy general counsel, which said it has a “variety of manual and automated checks” to ensure compliance with its policies, including random audits of existing apps and “regular and proactive monitoring of the fastest-growing apps.”
But one ad executive called it “enforcement theater.” When this person’s company was asked to delete data, the request came orally, rather than in writing, and no one from Facebook requested a look inside the company’s database. The company says it did destroy the data, but there was no follow-up and Facebook never asked for proof.
“We deleted all of it, but there was no audit beyond that,” the exec told AdExchanger. “We could easily have just not deleted it.”
In Kogan’s case, he had permission to collect Facebook data, just not to resell or share it. But former Cambridge Analytica contractor Christopher Wylie told The Observer that when Kogan triggered Facebook’s security protocols by pulling a large amount of data in a short period of time – millions of profiles over just a few weeks – “apparently Kogan told them it was for academic use, so they were like, ‘Fine.’”
But, as clearly happened with Cambridge Analytica, Facebook data does make its way into the commercial sphere.
Bryant Garvin, director of YouTube, search and display advertising at Purple, has also been on the receiving end of shady emails from obscure companies with claims of some sort of fancy, proprietary data collection technique.
“It happens every couple of months,” Garvin said. “Someone sends an email from a company I’ve never heard of that purports to have personalized targeting options, and they’re never clear on the science behind it or how they’re getting the data. It’s always a major red flag for me.”
And a CEO of a small agency told AdExchanger that it’s common to get emails from people, sometimes with ties to academia, offering Facebook data or device IDs for sale.
But rather than a thriving black market for Facebook data fueled by malevolent intent, the more likely issue is willful ignorance: a case of “data suppliers promising lots of deep data without being forthcoming about the source, and data buyers determined to not look that closely,” said Beth Morgan, COO at mobile data company Twine.
“The terms of service say that publishers can’t share the data they get through Facebook,” Morgan said. “So, the problem lies in a) ignorance and b) difficulty in auditing/checking. Basically, the data industry operates largely on trust, because it’s relatively hard to track data flows and see where it’s going.”
And this isn’t Facebook’s problem alone. Tracking the provenance of data and where it goes is a major frustration for anyone with proprietary data operating in the digital ecosystem.
“If you integrate with most data vendors, they commingle the data,” said Keith Petri, chief strategy officer at Screen6. “And, especially if you have direct-to-publisher relationships with access to proprietary data, those publishers don’t want their users to be commingled, mixed and profiled by other platforms.”