During a virtual event Wednesday, Federal Trade Commission Chairwoman Lina Khan warned social media companies and other digital advertisers that the agency is considering new rules to protect American children from “stealth” digital advertising that is “designed to exploit [their] insecurities for commercial gain.”
Khan said that “privacy, psychological, physical, economic and other harms…can arise from predatory advertising directed at children,” and the event convened experts to discuss what new rules on advertising found on social media, online games and virtual reality could mitigate those harms.
The event, titled “Protecting Kids from Stealth Advertising in Digital Media,” featured several panels of experts on childhood development, modern digital advertising practices and potential regulatory solutions.
Girard Kelly, director of the privacy program at Common Sense Media, a nonprofit children’s media recommendation organization, said during a panel that 8- to 12-year-olds consume an average of 5.5 hours of screen time per day, while 13- to 18-year-olds spend 8.5 hours per day on internet-connected devices.
“That’s a lot of screen time, and as expected [Alphabet’s] YouTube, TikTok and [Meta’s] Instagram are all the most popular apps that teens just cannot live without,” Kelly said.
Social media platforms like these can feature so-called “stealth” or “blurred” advertisements, Kelly said, that are difficult for young kids and teens to understand as marketing, including “unboxing” videos of toys, funny memes that promote certain brands and content produced by influencers who are paid to feature products in their social media posts.
“In the current marketing ecosystem that children are immersed in, it’s designed to make it harder for consumers to cognitively process marketing,” said Josh Golin, executive director of the nonprofit Fairplay, which advocates for stricter regulation of marketing to kids. “For developing children who are already at a disadvantage, it’s even harder.”
Golin said these new marketing techniques leveraged so-called “parasocial relationships” wherein children develop a one-sided relationship with an influencer, whether they be a real human or an animated character.
“When marketing comes through a parasocial relationship, children are less likely to understand what’s going on and be able to defend themselves against it,” Golin added.
Online games are another venue where companies are able to take advantage of children by encouraging them to play longer than is healthy or by inducing in-app purchases of digital products.
Jenny Radesky, a developmental pediatrician at the University of Michigan, presented research findings that the majority of mobile gaming apps targeted at children aged 3 to 5 had “manipulative design features” used to drive revenue.
“If you’re a child trying to make a decision about whether to purchase something or whether to keep playing, you’re faced with design features that might be navigation constraints, parasocial relationships with characters you love and multiple confusing forms of digital currency,” she said.
The FTC has long relied on a self-regulatory organization, the National Advertising Division of BBB National Programs, to help enforce rules on children’s advertising, but with the explosion of new forms of digital entertainment, the agency is considering taking a more active oversight role.
The agency is currently considering whether to update rules implementing the 1998 Children’s Online Privacy Protection Act, which imposes restrictions on online advertising to children under the age of 13, and Congress is debating bipartisan legislation that could expand those protections to kids 16 and younger.
“The last time we revised that rule was 2013,” Khan noted. “And a lot has changed since then.”