Impact of TikTok Content on Adolescent Mental Health, as Revealed in Recent Investigations
In the digital age, social media platforms like TikTok have become a significant part of young people's lives. Over the past year, however, questions have been raised about the impact of these platforms on adolescent users, particularly girls, and about whether they contribute to negative mental health outcomes.
Recent investigations by state and federal agencies, as well as reports from the Center for Countering Digital Hate (CCDH), suggest that TikTok's safety measures may not be enough to protect young users from harmful content related to eating disorders and suicide.
TikTok has introduced features such as maturity scores, filters for age-appropriate content, time limits, and screen breaks. The platform has also banned content that could encourage self-harm or suicide, and searches for harmful words or phrases return no results, instead directing viewers toward local support resources.
However, evidence shows that these efforts fall short. Internal documents and investigations reveal that TikTok continues to promote harmful content, such as "thinspiration" videos and extreme diet challenges, which often evade detection. The platform's algorithmic recommendation system rapidly serves self-harm and suicide-related videos to vulnerable users, creating what some describe as a "digital death trap" for emotionally distressed children.
Furthermore, safety features like Family Pairing and Content Levels are often ineffective: age-verification prompts can be easily bypassed by children, and parental locks do not always hold. Research and lawsuits suggest that the platform's design elements, such as infinite scroll, autoplay, and personalized feeds, foster addictive use among young users, exacerbating mental health issues.
The CCDH believes that further legislation is required to safeguard children from inappropriate material on TikTok. Its recent findings suggest that a new teen account can be served such content in under eight minutes, heightening mental health risks and fueling ongoing legal actions that allege a failure to adequately protect minors.
These findings highlight the need for greater scrutiny of what content appears on young users' screens. As the digital landscape continues to evolve, it is crucial that platforms like TikTok take more significant steps to protect the mental health and wellbeing of their young users.
Despite TikTok's safety measures, investigations suggest that the platform's algorithm frequently promotes harmful content associated with eating disorders, self-harm, and suicide, damaging the mental health of young users. Design choices such as infinite scroll, autoplay, and personalized feeds are suspected of fostering addictive use among teenagers, worsening their mental health and overall wellbeing, while the safety features meant to shield them are often easily bypassed.