The reason why TikTok will likely never face a full ban is also the strongest argument for why it should: It’s simply too effective at capturing attention.
Beyond issues of geopolitical surveillance and data privacy lies a deeper, less-discussed concern: the quality of the attention platforms like TikTok capture and the methods used to capture it.
Short-form video, infinite scrolling and hyper-targeted algorithms aren’t neutral mediums. They shape, and often compromise, the attention they harvest, creating compulsive habits that warrant serious reflection – and which deserve consideration as a dimension of brand safety, quality and consumer well-being.
Hacking the system
For most of the history of media, attention has been earned through value exchange. Something informative, entertaining or useful was offered, and attention was given in return. Early social platforms adhered to this logic. Users connected with friends, discovered events, shared milestones.
But as digital media matured, formats emerged that bypassed this value exchange entirely. They don’t just attract or invite attention. They capture it and make it harder to withdraw. The experience becomes less of a choice and more of a reflex.
Design patterns like infinite scroll, autoplay video, push notifications, algorithmic recommendations and short-form loops are optimized to reduce friction and extend engagement. They are engineered with direct reference to behavioral science and persuasive computing, tuned to exploit reward loops and cognitive vulnerabilities. That engineering works by hacking our neurochemistry.
Controlled substances
Research has consistently linked heavy social media use, especially on these high-potency platforms, with negative mental health outcomes, from sleep disruption to anxiety, depression and suicidality.
The American Psychological Association has drawn parallels between “problematic social media use” and substance abuse. Both are defined by continuing to use despite wanting to stop, going to excessive or deceptive lengths to maintain access, experiencing strong cravings and using more than intended.
Content plays a role in algorithmic potency, with the most provocative and polarizing posts rising to the top. But the engagement format itself is the issue.
There’s a growing consensus among policymakers that this engagement format deserves to be treated the way we treat controlled substances like alcohol, nicotine, cannabis or even gambling (e.g., with age restrictions and other controls).
In 2023, the US Surgeon General issued an advisory on social media use by minors. Utah has passed legislation imposing restrictions on algorithmic targeting for minors and sued Meta and TikTok to “Remove features causing excessive use: autoplay, perpetual scrolling and push notifications.” Similar laws are being considered in other states and internationally. While most of these measures focus on minors, the same concerns apply to adults.
The lived experience of many users supports this movement. Ask around: How many people have deleted an app from their phone to take a break? How many describe their usage in terms of control, compulsion, abuse or regret? I know I have.
Or perhaps one only needs to look at our shared language for these experiences, with the mainstreaming of terms like “doom-scrolling,” “binge-watching,” “dopamine-farming” and “brain rot,” just to name a few. We know it’s bad for us, but we can’t stop.
Breaking the cycle
Advertisers can’t stop, either. Platforms have effectively captured consumers’ attention, creating a powerful economic incentive (even an imperative) for brands to spend there.
The platforms promise and deliver immediate, measurable outcomes, all the while conveniently grading their own homework. Every like, share, view, comment, site visit or purchase offers marketers immediate gratification, creating their own cycle of dependence.
With billions being spent to buy attention and ensure safe and suitable environments, there’s more than enough reason to question whether compulsive use patterns are good for business. Not all attention is created equal. Media environments that blur the line between consumption and compulsion may not be delivering the kind of attention that advertisers think they’re buying.
A dimension of safety and quality
Mental health is already making its way into the brand safety conversation. Some brand safety providers offer filters that steer advertisers away from content related to mental health, trauma or other emotionally loaded topics. That’s a start. But those tools still rely on a paradigm of brand safety predicated on content adjacency alone.
What’s missing from the conversation is the experience in which the ad is delivered. Not just what the user sees but how they got there. What state of mind they are in. How much control they have over what they’re consuming or how they consume it. Whether they feel trapped. Whether they’re even in a position to register a brand message with the cognitive clarity that effective advertising requires.
These questions should be part of how we define quality, suitability, safety and responsibility in media. Improving brand safety with deeper contextual and semantic understanding is important, but the context isn’t just on the page. There’s the social and human context, shaped by the medium itself, that should be considered as a dimension of safety for brands and audiences alike.
Currently, there’s no established framework for qualifying “healthy” versus “unhealthy” media experiences within targeted advertising, but such a framework would be relatively easy to build. Platforms could cap advertising shown to users exhibiting compulsive scrolling behaviors or flag users engaging at unusual hours. Perhaps certain age groups could be restricted entirely from receiving targeted advertising through addictive media formats.
None of these checks would pose a technological challenge; they simply require developing pragmatic, measurable standards for what constitutes safe and responsible media consumption.
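To make the point concrete, here is a minimal sketch of what such a check could look like. The thresholds, field names and heuristics below are hypothetical illustrations, not established industry standards; a real framework would need to be calibrated against behavioral research and agreed-upon definitions of compulsive use.

```python
from dataclasses import dataclass

# Hypothetical thresholds -- illustrative only, not established standards.
MAX_SCROLLS_PER_MIN = 40      # sustained rapid scrolling
MAX_SESSION_MINUTES = 90      # unusually long single session
QUIET_HOURS = range(1, 5)     # 1:00-4:59 local time ("unusual hours")

@dataclass
class Session:
    scroll_events: int    # total scroll gestures observed
    duration_min: float   # session length in minutes
    start_hour: int       # 0-23, user's local time

def flag_session(s: Session) -> list[str]:
    """Return the reasons a session looks compulsive; empty means no flags."""
    reasons = []
    if s.duration_min > 0 and s.scroll_events / s.duration_min > MAX_SCROLLS_PER_MIN:
        reasons.append("rapid_scrolling")
    if s.duration_min > MAX_SESSION_MINUTES:
        reasons.append("marathon_session")
    if s.start_hour in QUIET_HOURS:
        reasons.append("unusual_hours")
    return reasons

def should_cap_ads(s: Session) -> bool:
    """Cap targeted ad delivery whenever any compulsive-use flag is raised."""
    return bool(flag_session(s))
```

The hard part is not the code but the standard-setting: deciding what counts as “rapid” or “unusual” and who audits the measurement.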
As with privacy, change around compulsive media experiences will either emerge proactively from the advertising industry or be imposed on it through regulation.
If the goal is to build lasting relationships with real people, then advertisers need to consider not just the message or the content, but the medium itself. The medium is the message, and it’s telling us something. The question is whether we are willing to listen.
“Data-Driven Thinking” is written by members of the media community and contains fresh ideas on the digital revolution in media.
Follow Broadsheet Communications and AdExchanger on LinkedIn.