TikTok executives and employees were well aware that its features encouraged compulsive use of the app, as well as the associated negative mental health effects, according to NPR. The broadcaster reviewed unredacted documents from the lawsuit filed by the Kentucky Attorney General’s Office, as published by Kentucky Public Radio. More than a dozen states sued TikTok a few days ago, accusing it of falsely claiming that the app is “safe for young people.” Kentucky Attorney General Russell Coleman said the app was “specifically designed to be an addiction machine, targeting children who are still in the process of developing appropriate self-control.”
Most of the documents filed in the lawsuits contained redacted information, but Kentucky’s filing included faulty redactions. TikTok’s own research apparently found that “compulsive use is associated with a number of negative mental health effects, including loss of analytical skills, memory formation, contextual thinking, conversational depth, empathy, and increased anxiety.” TikTok executives also knew that compulsive use could interfere with sleep, work and school commitments, and even “connecting with loved ones.”
They also reportedly knew that the app’s time management tool did little to keep younger users off the app. Even though the tool set a default limit of 60 minutes of app use per day, teens still spent 107 minutes on the app with the tool turned on. That’s just 1.5 minutes less than the average of 108.5 minutes per day before the tool launched. Based on the internal documents, TikTok measured the tool’s success by how it “improv[ed] public trust in the TikTok platform through media coverage.” The company knew the tool would not be effective, with one filing stating that “[m]inors do not have the executive function to control their screen time, while young adults do.” Another document states that “on most engagement metrics, the younger the user, the better the performance.”
In addition, TikTok is aware that “filter bubbles” exist and understands that they can be potentially dangerous. According to the documents, employees conducted internal studies in which they found themselves pulled into negative filter bubbles shortly after following certain accounts, such as those focusing on painful (“painhub”) and sad (“sad notes”) content. Employees were also aware of content and accounts promoting “thinspiration,” which is associated with disordered eating. Due to the way TikTok’s algorithm works, its researchers found that users were placed in filter bubbles after 30 minutes of use in one sitting.
TikTok also struggles with moderation, according to the documents. An internal investigation found that underage girls on the app were receiving “gifts” and “coins” in exchange for stripping live on camera. Higher-ups at the company also reportedly instructed moderators not to remove users reported to be under 13 unless their accounts actually stated that they are under 13. NPR says TikTok also acknowledged that a significant amount of content violating its rules slips past its moderation, including videos that normalize pedophilia and glorify the sexual assault of minors and physical abuse.
TikTok spokesperson Alex Haurek defended the company, telling the organization that the Kentucky AG’s complaint “cherry-picks misleading quotes and takes outdated documents out of context to misrepresent our commitment to community safety.” He also said that TikTok has “robust safeguards, which include proactively removing suspected underage users,” and that it has voluntarily launched safety features such as default screen time limits, Family Pairing, and privacy by default for minors under 16.