Kids are being shaped by the technology around them. It’s happening almost imperceptibly and at all times.
Just try to find a teen in the US who doesn’t use the internet “almost constantly,” per recent numbers from the Pew Research Center.
According to Pew, 46% of US teens say they spend almost all their time online – up from 24% around 10 years ago – and 96% report using the internet every day. (The other 4% probably just had their phones taken away.)
But tech savviness isn’t the same thing as digital literacy. That’s something, at least, that Republicans and Democrats can agree on.
“America’s young people are a population often unaware of how they’re being shaped – and we might even say catechized – by an incredibly sophisticated information ecosystem unlike anything our parents or grandparents ever confronted,” said Republican FTC Commissioner Mark Meador, addressing a crowd of advertising lawyers at the National Advertising Division’s annual conference in Washington, DC on Tuesday.
(And Meador would know. He mentioned at one point that he has six children, with a seventh due later this week.)
Chat check
Keeping American kids safe in what Meador called “an increasingly complex and fast-paced technological environment” is a top priority for this Federal Trade Commission, and that includes everything from safeguarding data to protecting kids against deepfakes, dark patterns and parasocial relationships with AI chatbots.
Last week, the FTC launched an inquiry seeking detailed information from Alphabet, Instagram, Meta Platforms, OpenAI, Snap, X and Character Technologies about how they design, test and monitor their AI chatbots. The agency wants to know how these companies measure safety, manage age restrictions, handle data and protect children and teens from potential dangers.
There have been more than a few really terrible reports of AI chatbots causing harm to young users, including allegations of encouraging self-harm and exposing minors to inappropriate and often sexualized content.
“In the last few weeks, we’ve heard the stories of chatbots advising children on the best methods of committing suicide, or sometimes the chatbots encourage them to go through with it,” Meador said. “These are novel consumer protection issues the commission can’t ignore.”
‘We have laws like COPPA for a reason’
But there are also more bread-and-butter issues that are top of mind for the FTC, like enforcing against good old unlawful data collection.
In early September, the FTC announced a $10 million settlement with Disney over allegations that it improperly collected data from kids under 13 after failing to label some of the videos it uploaded to YouTube as “made for kids” in violation of the Children’s Online Privacy Protection Act (COPPA) Rule.
Disney was able to use the unlawfully collected data to more effectively target advertising to kids without notifying parents or getting their consent.
“We have laws like COPPA for a reason,” Meador said, “because Congress knows, just like parents do, that children are not sophisticated consumers capable of making informed judgments about the advertising messages they receive.”
The trust gap
Most of the current FTC’s work sticks closely to existing law and avoids broad new rules, which is in stark contrast to the previous Democrat-led commission’s more expansive approach.
This shift follows President Trump’s controversial removal of the two Democratic commissioners – Alvaro Bedoya and Rebecca Slaughter – earlier this year, which left an all-Republican panel with no Democratic commissioners.
But issues related to children’s privacy and safety are a clear exception, with the present FTC actively advancing updated regulations to address new digital risks and tighten protections for minors online.
For example, thanks to work initiated under the Biden administration, this FTC was able to finalize sweeping updates to the COPPA Rule in April – the first major overhaul of the rule since 2013.
But data protection “is just the tip of the iceberg,” Meador said. “The commission has a lot of work ahead of us.”
And underpinning all of that work is the fundamental need for trust, without which the entire digital marketplace, especially for kids, can’t function effectively or fairly.
To underscore that point, Meador quoted … the Bible.
“As far back as the Book of Leviticus, squirreled away in one of those lengthy compilations of Israelite law, we find the following command: ‘You shall not cheat in measuring length, weight or quantity; you shall have honest balances [and] honest weights,’” Meador said. “No human society can long survive without consumer trust – without ordinary people living in the confidence that when they buy and sell and labor, they’re not being misled.”