Research: Why Google, Facebook And Instagram Are Being Hypocritical When They Bust Supposed Bad Actors For Messing With Their Algorithms

When Facebook, Google or Instagram approve of an engagement-boosting tactic, they usually describe it with positive words, such as “organic” or “authentic.”

Tactics they forbid, however – generally unilaterally and sometimes without warning – are characterized as “unnatural” and whoever uses them is a “schemer” or an “offender.”

Most news articles covering these guideline changes unquestioningly adopt the same terminology.

Yet an academic paper released in November examining the moralizing language employed by the platforms found that the line between what they consider good and bad is “incredibly blurry and constantly changing,” said Caitlin Petre, an assistant professor of journalism and media studies at Rutgers University and a co-author of the study.

“The platforms are portrayed as neutral, benevolent actors working to maintain a meritocracy while users are seen as the deviants,” said Petre, who conducted the research with Brooke Erin Duffy, an associate professor of communications at Cornell, and Emily Hund, a researcher at the University of Pennsylvania.

The big problem with that? “It obscures the fact that these platforms are also private, profit-driven companies that mainly act to safeguard and protect their own financial interests.”

AdExchanger spoke with Petre.

AdExchanger: You use the term “platform paternalism” in your study. How do you define it?

CAITLIN PETRE: In punishing the so-called bad people, the platforms elevate themselves as the rightful arbiters of the online universe, as benevolent figures with the authority to wield power neutrally and in everyone’s best interests. That is platform paternalism. 

What’s an example of something that was once seen as legit and is now considered algorithmic manipulation?

There are many examples, but a good one would be in May 2017 when Facebook said it would demote clickbait headlines in the news feed. Facebook defined clickbait headlines as a form of algorithm gaming, as an illegitimate use of the platform. Facebook started telling publishers not to use so-called clickbait headlines and instead to use a call to action in posts in order to make them compelling.

Flash forward to December, not even a full year later, and Facebook announced that it would start cracking down on “engagement bait,” a term Facebook coined, as far as I know, which is defined as essentially goading users into interacting with a Facebook post by, among other things, including a call to action – the exact tactic Facebook explicitly encouraged publishers to use in May.

What reasoning seems to go into platform guidelines that govern these use cases?

We would all like to know that. These platforms operate as black boxes, so it’s usually hard to know exactly what drives a particular change. But if you take a step back and look at the big picture, you see that the platforms have two different models for achieving visibility and distribution: One is free use and the other is paid.

The definition of what is thought of as gaming the system vs. not gaming the system pushes publishers and creators, or “cultural producers” as we call them in the paper, toward the paid promotion of content as opposed to achieving visibility through free use of the platform.

But doesn’t organic reach eventually decline on social platforms?

There are definitely varying degrees of awareness of the fact that at some point there’s no more free lunch and that you’re not going to be able to achieve the same visibility as before without paying, but there’s also a sense of resignation. For some creators, like freelance photographers on Instagram, for example, there is simply no scaled alternative for them to use.

Is there a worst offender among the big platforms for having the most inscrutable or constantly changing guidelines?

The answer depends on the community of users you’re looking at. My co-authors focus on influencers, so they’d probably say Instagram. I study journalists and what is happening to the media business, and so, for me, it’s absolutely Facebook. Facebook has professed again and again that news is important to it, but we’ve seen how that’s played out in the journalism industry, sometimes with terrible effects. Look at the pivot-to-video crisis, which illustrates that if you’re a publisher and Facebook says, “Jump,” you usually say, “How high?”

What’s the solution?

That’s the million – or many billion – dollar question, and it’s a difficult one. But as the so-called tech lash has grown over the last few years, there’s been greater awareness that these platforms are driven by financial and material interests above any other high-minded goal.

This interview has been edited and condensed.
