We use social media apps for myriad reasons: to stay in contact with friends and family, make new friends, network professionally, read and share news articles, shop for products, and pass time. Although social media can be addictive for users of any age, the risks it poses to teens are greater than those it poses to adults.
The first social media platform launched in 1997.1 Other platforms, such as Myspace, Facebook, and Instagram, followed. Kids and teens enjoy social media platforms like Instagram, YouTube, TikTok, and Snapchat for social interaction, which is a normal part of development. Use of these and other platforms has, however, been increasingly linked to mental health problems in teens, including anxiety, depression, eating disorders, body dysmorphia, low self-esteem, and suicidal ideation.2
Social media platforms' algorithms target teens by delivering stories, images, and videos tailored to them. Meta, the parent company of Facebook and Instagram, spends millions of dollars to attract and retain its teen audience.3 Algorithms amplify posts that collect the most likes, shares, and follows. This amplification can leave timeline feeds dominated by unrealistic beauty standards, over-the-top opinions, extreme stunts, and social media challenges.
Instagram, which is popular among teens, built its platform in a way that encourages people to compare themselves to others. Critics argue that this design creates unrealistic expectations for teens, and that feeds dominated by messages of self-harm, eating disorders, or other self-image-related content can lead to serious mental health issues for children and teens.4
Parents are filing social media harm lawsuits alleging that social media companies knew their platforms could cause these issues but failed to warn users. Lawsuits against Meta allege the company created unreasonably dangerous products that hook children and teens.5 School districts are also suing the parent companies of Facebook, Instagram, Snapchat, TikTok, YouTube, and other social media platforms, claiming the platforms “exploited the vulnerable brains of youth.”6 The school districts are seeking monetary damages to pay for teacher training and for screening students for mental health issues.7
On October 24, 2023, the attorneys general of 41 states filed a federal complaint against Meta (Case 4:23-cv-05448), alleging that the company engaged in a “scheme to exploit young users for profit” by misleading them about safety features and the prevalence of harmful content, harvesting their data, and violating federal laws on children’s privacy.8 State officials claim the company knowingly deployed changes to keep children on the site to the detriment of their well-being, violating consumer protection laws.9 Meta could face civil penalties of $1,000 to $50,000 for each violation of various state laws, an amount that could add up quickly given the millions of young children and teenagers who use Instagram.10 Praedicat, a science-based data analytics company, estimates with its liability catastrophe model that the economy-wide loss (indemnity plus defense) for a mass litigation event centered on addictive software design has an expected value of $15 billion and a 5% chance of exceeding $70 billion.11
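To illustrate how quickly per-violation penalties compound, the sketch below multiplies the cited $1,000-to-$50,000 statutory range by a hypothetical violation count. The user count and one-violation-per-user assumption are purely illustrative and not drawn from the complaint:

```python
def penalty_exposure(users, violations_per_user, penalty_low, penalty_high):
    """Return the (low, high) range of aggregate civil penalties."""
    violations = users * violations_per_user
    return (violations * penalty_low, violations * penalty_high)

# Hypothetical figures: 1 million affected users, one violation each,
# at the $1,000-$50,000 per-violation range cited for state laws.
low, high = penalty_exposure(1_000_000, 1, 1_000, 50_000)
print(f"${low:,} to ${high:,}")  # $1,000,000,000 to $50,000,000,000
```

Even under these conservative assumptions, exposure reaches the billions, which is consistent with the article's observation that penalties "could add up quickly."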
According to an article in The Federal Lawyer (Nov/Dec 2022), plaintiffs will struggle to prove causation because social media addiction is not currently recognized as a mental health disorder.12 Research on the effect of social media on adolescent development has yielded mixed results: some researchers studying excessive social media use among adolescents observed a correlation with depression, while others observed significant benefits to adolescent well-being. Because the impact of excessive social media use on mental health remains unsettled, plaintiffs claiming social media addiction will struggle to establish a causal link between excessive use and harm to mental health.
In addition, Section 230 of the Communications Decency Act (CDA) of 1996 shields social media platforms from liability and serves as their strongest defense in lawsuits alleging teen social media addiction. CDA Section 230(c)(1) reads in part: “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”13 Plaintiffs can overcome the CDA by proving either that (1) social media platforms do not act in a publishing capacity when employing personalized algorithms to recommend content to users or that (2) the platforms act as information content providers when they design their systems to addict users.14 In the spring of 2023, the Supreme Court passed up an opportunity to narrow the scope of Section 230.15 Consequently, all eyes will turn back to Congress to see whether the law is ultimately reformed.
On June 28, 2022, a social media litigation firm filed a lawsuit (Smith v. TikTok Inc.) against TikTok and its parent company, ByteDance Ltd., in Los Angeles County, CA, on behalf of the families of two children who died while attempting a viral TikTok challenge known as the “blackout challenge,” in which participants use household objects to strangle themselves to the point of losing consciousness.16 The lawsuit asserts claims of strict product liability, failure to warn, and negligence for allegedly contributing to the two deaths, and seeks relief for economic losses, pain, and suffering; exemplary or punitive damages; and medical expenses for the victims’ families. Although, pursuant to CDA Section 230, social media platforms cannot be held liable for content that others post on their platforms, this lawsuit is based on TikTok’s actions, not the actions of its users.17
As of October 2023, there had been no court-approved settlements or jury verdicts in social media lawsuits, and 429 such lawsuits were pending in Multidistrict Litigation No. 3047 in the Northern District of California.18 Plaintiffs allege that the platforms of YouTube, TikTok, Instagram, and Snap, among others, are defective because they are designed to maximize screen time.19 They further allege that this conduct results in emotional and physical harms, including death.20
Given the increase in the number of social media addiction claims being filed, insurance companies can expect to see more of these claims under various policies, including general liability (GL), excess, and auto policies.