2018 was the year Facebook’s skeletons really came skittering out of the closet, and the drip, drip, drip of bad news is seemingly endless.
On Tuesday, the New York Times reported that Facebook shared more data with more partners than previously disclosed, including contact information, private messages and friend lists.
Facebook’s response to the Times investigation was to clarify how third parties access its API. A partner like Spotify would integrate Facebook’s messaging capabilities into its product so people could message their friends, although that would only work if they chose to use the Facebook Login. Facebook called the experiences it enabled “common in our industry” and akin to “being able to have Alexa read your email aloud.”
But Facebook is fighting an unwinnable PR battle. Explanatory blog posts are poor weapons against outrage, and it’s becoming clear that Facebook has some demons to exorcise at the mother ship level. And the scandals that rocked Facebook’s world in 2018 appear to be mainly of Facebook’s own making.
“Everybody came to the realization this year that many of Facebook’s problems aren’t necessarily external,” said Debra Aho Williamson, a principal analyst at eMarketer. “It’s not just people meddling or outsiders taking data they shouldn’t have taken. There are some internal issues here.”
Ghost of Facebook’s past
Leaky data controls are a prime example.
When the Cambridge Analytica scandal broke in March, Facebook quickly insisted that what had happened did not constitute a data breach.
Technically, Facebook wasn’t wrong. Prior to 2014, Facebook’s API gave third-party developers access to friend data. Aleksandr Kogan, the academic researcher at the heart of the Cambridge Analytica affair, had permission to collect Facebook data; he just wasn’t supposed to pass it on to partners like the now-defunct Cambridge Analytica, which he did in violation of Facebook’s terms of service.
Here’s the issue: Facebook knew way back in December 2015, when The Guardian broke the news, that Cambridge Analytica possessed the illicit user data, but all Facebook did at the time was get signed documents from Kogan and Cambridge Analytica attesting that the data had been deleted. No one at Facebook actually checked, and the news cycle marched on to something else.
But a time bomb was ticking inside Facebook, which eventually shut down developer access to friend data in 2015, yet never ran audits to ensure its developer partners weren’t playing fast and loose with user data.
The seeds of the Cambridge Analytica debacle were planted long before they sprouted in 2018, and that’s something Facebook could have more systematically addressed before this year.
“Prior to 2018, Facebook suffered privately – and now suffers publicly – from loose company policies about its users’ data,” said Jessica Liu, a senior analyst at Forrester. “The Cambridge Analytica problem was the watershed moment in 2018 that exposed Facebook’s haphazard handling of its user data.”
And once the unpleasant revelations started rolling in – security bugs, privacy failures, data hacks – Facebook spent the majority of 2018 in a defensive crouch, telling anyone who would listen (journalists, developers, Congress, shareholders, the crowd at Advertising Week in NYC) that it was sorry for the scandal of the day.
But at this late stage, it’s about action and demonstrating a change in behavior more than mea culpas.
In mid-December, for example, Facebook revealed a security bug that exposed the private photos of up to 6.8 million users to 1,500 developers. Facebook had known about the bug since Sept. 25, but didn’t tell the Irish Data Protection Authority until nearly two months later. Under the General Data Protection Regulation, companies are required to disclose data breaches within 72 hours.
“Where did Facebook go wrong in 2018? If I had to pick one thing, it would be not coming fully clean about what it knows and what it did in all of the scenarios we hear about,” Williamson said.
But there are rumblings in the not-so-middle distance and uncertainties on the horizon that could change Facebook and the technology economy forever, including outraged regulators in Europe, fed-up lawmakers in the United States, the potential for federal privacy regulation and an ongoing Federal Trade Commission probe into whether what happened with Cambridge Analytica was a violation of Facebook’s 2011 consent decree. The FTC is also seeking more power over data security and privacy issues with an eye on big tech.
And not all buyers are feeling well-disposed toward Facebook, either. In November, Rishad Tobaccowala, chief growth officer at Publicis Groupe, told the Times that Facebook has “absolutely no morals” and that as a business the company seems “to have lost their compass.” Mat Baxter, the global CEO of Initiative, an IPG-owned agency, recently posted on LinkedIn that he’ll be advising clients to stay off the platform completely. “It’s about time we take a collective stand against the egregious behavior of Facebook,” he wrote.
Then again, maybe Facebook survives this dumpster fire wounded but relatively intact.
“Every major technology company seems to go through some period of intense negative attention and questioning, and this could just be the time for Facebook,” Williamson said. “What we don’t know is whether this will be another Microsoft, which turned itself around to become an amazingly successful company again.”
But one could argue that Facebook is a platform of a very different color living in a very different world. Following the Cambridge Analytica scandal, Facebook “became a lightning rod for simmering existential, big picture concerns,” Williamson said.
“Facebook is at the center of macro scrutiny of the role of social media, how political bad actors can use digital media, how fake news gets distributed – and on and on,” Williamson said. “I don’t recall anyone questioning whether Microsoft was bad for society or seeing headlines like, ‘Are we using Windows too much and is it making us into bad people?’”