Tech
French Families Sue TikTok Over Harmful Content Exposure
Seven French families are suing TikTok, accusing the popular social media platform of exposing their children to harmful content that contributed to severe consequences, including two teenagers taking their own lives. The case, filed in the Créteil judicial court, claims TikTok’s algorithm promoted content related to self-harm, eating disorders, and suicide, according to Laure Boutron-Marmion, the families’ lawyer.
Boutron-Marmion described the lawsuit as the first of its kind in Europe and stated that the families aim to hold TikTok legally accountable for the content its platform failed to moderate. “This is a commercial company offering a product to consumers who are, in addition, minors. They must, therefore, answer for the product’s shortcomings,” she explained to French media.
One of the tragic cases involves Marie, a 15-year-old who took her own life in 2021. Her mother has argued that Marie’s exposure to dangerous video content on TikTok was a significant factor leading to her death. The criminal complaint filed last year by Marie’s parents remains ongoing and separate from the current group lawsuit.
Another teenager involved in the case also died by suicide. Four of the remaining five young women reportedly attempted suicide, with at least one developing an eating disorder after engaging with content on the platform.
TikTok has responded, stating that it has not received any notifications of legal proceedings related to these allegations. The company emphasized its community guidelines, which prohibit content promoting self-harm or suicide, and highlighted its use of technology and human moderators to enforce these standards.
Despite TikTok’s claims of rigorous safety measures, the social media giant has faced mounting scrutiny over its content moderation practices, amid broader concerns about the impact of social networks on vulnerable young users.