US school district sues Facebook, Instagram, Snapchat, TikTok over harm to kids

Seattle Public Schools filed a lawsuit on Friday against the parent companies of the biggest social networks on the internet, alleging that social media is to blame for “a youth mental health crisis” and that these companies have purposefully designed, refined, and operated their platforms in a way that “exploit[s] the neurophysiology” of children’s and youths’ brains.

The defendants are Meta (Facebook and Instagram), Snap (Snapchat), ByteDance (TikTok), and Alphabet (YouTube).

In a brief about this case, Seattle Public Schools said:

“Students in the Seattle Public Schools, like students around the country, are struggling with anxiety, depression, thoughts of self-harm, and suicidal ideation, which led King County to join the US Surgeon General last year in recognizing the youth mental health crisis in this community. According to the Surgeon General, one in five children aged 13 to 17 now suffer from a mental health disorder.”

“More than 90% of youth today use social media. Most youth primarily use five platforms: YouTube, TikTok, Snapchat, Instagram, and Facebook, on which they spend many hours a day. Research tells us that excessive and problematic use of social media is harmful to the mental, behavioral, and emotional health of youth and is associated with increased rates of depression, anxiety, low self-esteem, eating disorders, and suicide.”

Cyberbullying

The school district also pins the blame for online bullying on social networks. “The more time an individual, especially males, spend on social media, the more likely they are to commit acts of cyberbullying,” the complaint alleges, citing 2021 research on cyberbullies. The forms of bullying youths experience online include name-calling, being subjected to false rumors, receiving unsolicited explicit media or threats of bodily harm, online stalking, and revenge porn.

Multiple studies have shown that cyberbullying has mental, emotional, and behavioral effects on everyone involved, including anxiety, depression, and sleep deprivation. The lawsuit notes that students experiencing these mental health issues “perform worse in school, are less likely to attend school, more likely to engage in substance use, and to act out.” All of this, the district argues, “affects Seattle Public Schools’ ability to fulfil its educational mission”.

Immune to liability, per Section 230?

When it comes to potential counterarguments, Seattle Public Schools appears to have foreseen that the companies will hide behind Section 230 of the Communications Decency Act, which expressly states: “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” However, the district isn’t arguing that the companies should be liable for their users’ posts, but that Section 230 shouldn’t shield them from the consequences of their own conduct.

“Plaintiff [school district] is not alleging Defendants [social media companies] are liable for what third-parties have said on Defendants’ platforms but, rather, for Defendants’ own conduct,” the complaint says. “Defendants affirmatively recommend and promote harmful content to youth, such as pro-anorexia and eating disorder content. Recommendation and promotion of damaging material is not a traditional editorial function and seeking to hold Defendants liable for these actions is not seeking to hold them liable as a publisher or speaker of third-party content.”

Seattle Public Schools further alleges the companies are liable for “their own affirmative conduct in recommending and promoting harmful content to youth”, “their own actions designing and marketing their social media platforms in a way that causes harm”, “the content they create that causes harm”, and “for distributing, delivering, and/or transmitting material that they know or have reason to know is harmful, unlawful, and/or tortious”.

When Ars Technica reached out to Meta for comment, the company said it has developed more than 30 tools, including supervision and age verification tools, to support teens and families.

“We automatically set teens’ accounts to private when they join Instagram, and we send notifications encouraging them to take regular breaks,” a spokesperson went on to say. “We don’t allow content that promotes suicide, self-harm, or eating disorders, and of the content we remove or take action on, we identify over 99 percent of it before it’s reported to us. We’ll continue to work closely with experts, policymakers, and parents on these important issues.”

