Across the nation, a new surge of lawsuits has targeted social media giants, placing their harms to young users center stage, and TikTok is currently locked at the center. On October 8, 2024, 13 states and Washington, D.C., joined forces to file separate lawsuits alleging that the platform’s profit-driven design choices disproportionately harm youth. This coalition of state attorneys general represents a critical step forward in the fight to confront the social media industry’s cyclical negligence and the perennial mental health harms it inflicts.
These lawsuits echo earlier legal actions, like the 33-state lawsuit against Meta and the recent cases from the attorneys general of Texas, New Mexico, and Arkansas, who filed complaints against TikTok, Snap, and YouTube. Together, the states describe, often in gruesome detail, the platforms’ failure to protect minors from sextortion, social media addiction, and sweeping privacy violations.
The most recent cases against TikTok, however, bring to light an even broader range of harms, alleging that the platform not only perpetuates addiction through its content-recommendation algorithms but also encourages minors’ participation in dangerous challenges. In one example, on February 20, 2023, a 15-year-old Manhattan boy, Zackery Nazario, died after being struck by a beam while “surfing” on top of a Brooklyn-bound J train crossing the Williamsburg Bridge. After Zackery’s death, his mother found his social media accounts filled with videos encouraging him to take part in the subway surfing challenge. The states further allege that TikTok violates federal law by collecting data from users under the age of 13 without parental consent — an allegation separately supported by a lawsuit filed by the Department of Justice on August 2, 2024.
Although 14 complaints were filed against TikTok, four stand out as the most egregious. Filings from California, New York, Kentucky, and Washington, D.C. pull back the curtain on the platform’s extensive damage to kids online. For instance, New York’s lawsuit details how the app’s algorithms expose children to increasingly harmful content in pursuit of maximum engagement. California alleges that TikTok deceives users about its moderation practices by insisting that it removes content that violates its community guidelines; in reality, the company often only removes that content from the “For You” feed, leaving it accessible and searchable by users. Additionally, Washington, D.C.’s complaint accuses TikTok of running a “virtual strip club” through its livestreaming feature, where children have been sexually exploited in exchange for virtual currency. Finally, Kentucky’s improperly redacted lawsuit, leaked by Kentucky Public Radio, revealed that the company intentionally designed its algorithms to addict users even as it publicly downplayed the risks.
Addictive by Design
In addition to exposing TikTok’s deliberate indifference and pattern of negligence toward its users, all 14 complaints allege that key design features like the “For You” page, the recommendation algorithm, TikTok “Live,” and more were intentionally created to addict youth online. The following synthesizes the most egregious findings:
“For You” Page:
TikTok’s core design feature, the “For You” page, delivers engaging content to users based on how long they view and interact with the videos in their feed. According to Kentucky’s complaint, the company’s internal documents revealed that it knew an average user could become addicted to the app in under 30 minutes and had precisely determined that it takes only 260 videos for a user to form an addiction to the platform. In spite of this, TikTok’s own internal research attests to its knowledge that “compulsive usage correlates with a slew of negative mental health effects like loss of analytical skills, memory formation, contextual thinking, conversational depth, empathy, and increased anxiety.” In response, the company implemented a screen time tool that would purportedly limit kids’ usage to 60 minutes. However, employees noted that, in reality, the tool had very little impact, reducing usage by only about 1.5 minutes. One employee observed that the tool’s success was directly antithetical to the company’s mission, stating that “Our goal is not to reduce the time spent” but rather to “contribute to… [the] retention” of users.
Recommendation Engine:
The power behind TikTok’s “For You” page is its recommendation engine, which upranks often divisive and harmful content in order to increase the time users spend on the platform. Kentucky’s complaint reveals that, in an internal presentation from 2021, the platform viewed itself as being in an “arms race for attention,” stating that “the younger the user, the better the performance.” TikTok in fact knew that its algorithms frequently pushed users into harmful filter bubbles, with one internal 2020 presentation warning that the app “can serve potentially harmful content expeditiously.” One TikTok employee noted that it took them just 20 minutes to drop into a “negative” filter bubble, commenting that “the intensive density of negative content makes me lower down mood and increase my sadness feelings.” In one particular instance, an employee, Charles Bahr, told his superiors that “the algorithm was sending Generation Z users endless streams of depressing and suicide-glorifying videos.” According to Washington, D.C.’s complaint, TikTok fired him a few months after he raised the problem. Indeed, the platform was aware of internal concerns about its algorithm yet continued to send children into downward spirals of depressive content since at least 2020 — and undoubtedly well before.
TikTok Live and Currency:
Washington, D.C.’s complaint highlights TikTok’s “Live” feature, which allows users to stream videos and receive virtual gifts that promise monetary rewards. Though the section remains heavily redacted, the prosecuting team drew from an April 2022 Forbes investigation entitled “How TikTok Live Became A Strip Club Filled With 15-Year-Olds.” The article followed one 14-year-old user, “MJ,” who broadcast to 2,000 strangers posting messages like “$35 for a flash,” “I’m 68 and you owe me one,” and other sexually suggestive comments. Other livestreamers received comments like “IF U DO THE BLACK PART [the user’s bra] IM GOING TO SEND TIKTOK LIVE 35.000 TIKTOK COINS (400$),” “now the shorts,” “more midrift yo,” and “keep going baby.” Forbes also reported on the app’s underground language that pedophiles used to solicit views of minors’ bodies, like “outfit check,” “pedicure check,” “there’s a spider on your wall,” and “put your arms up.” According to Kentucky’s complaint against the platform, the company’s own internal investigation discovered that “a significant” number of adults were directly messaging underage TikTokers about stripping live on the platform. Further, in just one month, 1 million “gifts” were sent to kids engaged in “transactional” behavior, with a TikTok researcher acknowledging that the “content that gets the highest engagement may not be the content we want on our platform.”
Content Moderation:
Several of the lawsuits expose TikTok’s cyclical failure to uphold its community standards and moderate content on its platform. Kentucky’s unredacted lawsuit cites internal studies that found self-harm videos with more than 75,000 views before TikTok identified and removed them. Yet, rather than removing violative content, Washington, D.C.’s complaint reveals, the company often allows videos to remain up and searchable by users, merely removing them from users’ “For You” feeds or giving them the lowest priority in the algorithm. The company internally acknowledged that its content moderation had substantial leakage rates, including: 35.71% for “Normalization of Pedophilia;” 33.33% for “Minor Sexual Solicitation;” 39.13% for “Minor Physical Abuse;” 30.36% for “leading minors off platform;” 50% for “Glorification of Minor Sexual Assault;” and 100% for “Fetishizing Minors.” Furthermore, the platform has attempted to obfuscate its content moderation failures by publicizing misleading metrics in the name of faux transparency, like the “proactive removal rate,” which only captures how quickly the company removes the content it manages to catch, not what share of violative content it catches overall.
Beauty Filters:
A key design feature of TikTok is the ability for users to create and apply “filters” that change the appearance of the wearer, often modifying their face into an “ideal” version of themselves. One filter, “Bold Glamour,” uses artificial intelligence to create the effect of high cheekbones and a strong jawline. Kentucky’s lawsuit revealed that the platform knew the feature caused harm to young users. A group of employees suggested that TikTok “provide users with educational resources about image disorders,” create a campaign to “raise awareness on issues with low self esteem (caused by excessive filter use and other issues),” and create “an awareness statement about filters and the importance of positive body image/mental health.” In spite of TikTok’s knowledge that filters harmed youth, Kentucky investigators discovered that the app deliberately retooled its algorithms to amplify users the company viewed as beautiful. The complaint states that, “by changing the TikTok algorithm to show fewer ‘not attractive subjects’ in the ‘For You’ feed, [TikTok] took active steps to promote a narrow beauty norm even though it could negatively impact their Young Users.”
Age Verification:
Several of the submitted complaints allege that TikTok, despite publicly claiming that the app is not directed to children, still collects data from users that it knows are under the age of 13. In fact, according to a 2022 Common Sense Media survey conducted in the United States, 47% of respondents aged 11 to 12 were using TikTok. Likewise, in July 2020, TikTok classified more than a third of its 49 million daily users in the United States as being 14 years old or younger. One former employee reported that TikTok had actual knowledge of even younger children on the platform, based on videos they posted, yet failed to promptly take down those videos or close those accounts. The state lawsuits insist that TikTok knows young users are on its platform not only because of internal documents but also because of the company’s posture toward content. Indeed, thousands of accounts on the app feature material from well-known pre-teen brands such as My Little Pony, Pokémon, Pre-teens Tonight show, LOL Surprise, Cartoon Network, Bluey, and Kidz Bop. California’s complaint also cites Britain’s data protection authority, which in 2023 fined the platform $14.9 million for failing to abide by data protection rules intended to safeguard children online, after allowing up to 1.4 million children under the age of 13 to use the service.
Note: For comprehensive lawsuits focused on TikTok’s age verification failures, see Department of Justice v. TikTok (2024) and Federal Trade Commission v. TikTok (2024).
Tik Tok on the Clock
The lawsuits filed by the attorneys general of 13 states and the District of Columbia against TikTok expose a stark reality: a widening divide in the legal protection of young users online, underscored by a pattern of corporate neglect and a cycle of broken promises. While the recent proliferation of lawsuits is a positive step forward, these efforts nonetheless reveal the limitations of a fragmented, state-by-state approach. In the absence of robust regulatory oversight, harms against youth online have only worsened, as generations of children navigate social media at younger and younger ages.
To bridge this divide, comprehensive legislation like the Kids Online Safety Act (KOSA) is critical. KOSA aims to establish consistent, enforceable standards to protect children from harmful content, data exploitation, and manipulative algorithms. It offers a clear, bipartisan path forward, setting a standard for platforms to combat issues like child sexual abuse material (CSAM), bullying, harassment, and addiction. A poll released by Issue One and Fairplay revealed nearly universal support across the political spectrum for legislation requiring social media platforms to protect kids online: 86% of U.S. voters polled support KOSA, and 76% say they would be more likely to vote for a member of Congress who supports the legislation.
Still, TikTok and other social media platforms have invested millions to kill crucial legislation that would protect kids online. Per an Issue One analysis of lobbying disclosures, ByteDance, the parent company of TikTok, spent a record $6 million on lobbying during the first half of 2024 — a 65% increase over what it spent during the first half of 2023 and the most the company has spent in any first half of the year since it first hired federal lobbyists in 2019. In the last year alone, the company employed 49 lobbyists — one for every 11 members of Congress — and supported multiple trade associations, like TechNet and NetChoice, that directly lobbied against bills like KOSA, stifled lawsuits, and killed online safety legislation at the state level.
Mounting lawsuits against social media companies prove that corporate self-regulation has failed, and TikTok is certainly no exception. As children remain exposed to online threats, TikTok continues to deploy addictive design features and ineffective safety measures. Legislation like KOSA would set a national baseline for online safety, compelling platforms to prioritize children over profits and practice responsible innovation to prevent harm. If children aren’t trusted to grade their own homework, how can the U.S. trust social media companies, driven by profit margins, to regulate themselves? Without regulations in place, tech companies will continue to disregard the safety of young users, making stronger federal legislation essential to close the loopholes that allow these platforms to profit from neglect.