Trial Magazine
Theme Article
Breaking the Habit
Social media apps are at the center of a youth mental health crisis, and the kids and families impacted allege these products are addictive and defectively designed. Learn how plaintiffs are seeking justice.
December 2024

Beginning in September 2021, The Wall Street Journal published a series of articles based on damaging internal documents leaked by former Meta employee Frances Haugen.1 According to those internal studies, 41% of teen users began feeling “unattractive” while using Instagram.2 Thirty-two percent of teenage girls said that when they felt bad about their bodies, Instagram made them feel worse.3 And 13.5% of teenage girl users said Instagram exacerbated their thoughts of suicide and self-injury.4
32% of teenage girls said that when they felt bad about their bodies, Instagram made them feel worse.
Haugen’s revelations did not come as a surprise to Janet and Ian Russell. One morning in November 2017, Janet found the body of her 14-year-old daughter Molly hanging in her bedroom.5 Ian recalled looking at his daughter’s web browsing history soon after her suicide and discovering “the bleakest of worlds.”6 He said, “It’s a world I don’t recognize . . . once you fall into it, the algorithm means you can’t escape it and it keeps recommending more content. You can’t escape it.”7
A coroner’s inquest held to determine the cause of Molly’s death8 revealed that she had been pushed thousands of graphic photos and videos relating to suicide, depression, and self-harm—which certain platforms continued to feed her account even after her death.9 In a landmark ruling, the coroner concluded that Molly had died from “an act of self-harm while suffering from depression and the negative effects of online content.”10
Haugen’s revelations and the Molly Russell inquest have prompted a multinational reckoning with social media—our children’s heavy reliance on Instagram, Snapchat, TikTok, and YouTube; these platforms’ known tendency to create addictive behaviors and exacerbate mental health issues; and their detrimental impact on the educational environments in middle and high schools. In several of his State of the Union addresses, President Biden spoke about the harms social media causes, stating that “we must hold social media platforms accountable for the national experiment they’re conducting on our children for profit.”11
Surgeon General Vivek Murthy has issued an advisory on the effects of social media on youth mental health, calling for warning labels to be presented on these platforms and noting that “while social media may have benefits for some children and adolescents, there are ample indicators that social media can also have a profound risk of harm to the mental health and well-being of children and adolescents.”12 Even those who have worked for these companies recognize the risks. Former Facebook employee Chamath Palihapitiya, who was a vice president for user growth, drew attention to the issue: “The short-term, dopamine-driven feedback loops that we have created are destroying how society works.”13
Meanwhile, Congress has ramped up efforts to pass comprehensive legislation protecting children online, such as the Kids Online Safety Act, and the Federal Trade Commission has issued a notice for proposed rulemaking to further limit companies’ ability to monetize children’s data.14
Despite this flurry of activity and attention, social media platforms have continued to insist that they enjoy broad immunity from civil liability—in other words, that they cannot be held accountable for their role in triggering a nationwide youth mental health crisis that their own research warned of for years.15
That argument has been put to the test in In re Social Media Adolescent Addiction/Personal Injury Products Liability Litigation (MDL 3047)—and has so far failed. With this litigation, children and parents can finally seek recourse against social media companies through direct legal action.
Personal Injury Cases
The first cases that would later form the MDL against Instagram, Snapchat, TikTok, and YouTube were filed in early 2022 by parents and children who were determined to force these companies to take responsibility for their negligence and defective products.16 When the Judicial Panel on Multidistrict Litigation (JPML) heard arguments in September 2022 to centralize those cases for pretrial purposes, 28 related actions had been filed in 17 districts.17 At the time of publication, close to 500 lawsuits have been filed in MDL 3047,18 which is pending in the Northern District of California.19
The personal injury plaintiffs bringing suit in this action allege Meta (Instagram and Facebook), Snap,20 TikTok, and Google (YouTube) designed their products to be addictive and that the addictive nature of these products caused a wide range of mental and physical harms, ranging from clinical depression to suicide.21
MDL 3047 teed up several novel legal issues: Can social media platforms even be considered “products” so that products liability principles apply to them? Can social media platforms be held liable for the design of their platforms, notwithstanding 47 U.S.C. §230, a statute some have interpreted as a blanket immunity shield?22 How do tort claims asserted against social media companies intersect with the First Amendment, and does the right to free speech shield not just the content posted online but also the corporate conduct of online platforms? What duty, if any, does a social media platform owe to child users and their parents?
These consequential questions became a focus when the defendants in the MDL moved to dismiss two product design defect claims, two failure-to-warn claims, and a negligence per se claim based on violations of the Children’s Online Privacy Protection Act (COPPA). In granting in part and denying in part these motions, the court undertook a careful, fact-specific, conduct-oriented analysis for each of the threshold legal questions to determine which product defects could be subject to plaintiffs’ claims.23 The court followed the same approach in October 2024 when, via a separate ruling, it addressed plaintiffs’ consumer protection, deception, and misrepresentation claims.24
The court began by noting the text of §230: “No provider . . . of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” The court rejected defendants’ argument that §230 functions as a catchall liability shield, instead insisting that the “application of §230 is more nuanced.”25
Specifically, the court observed that the plaintiffs “make myriad allegations that do not implicate publishing or monitoring of third-party content”26—the cornerstone of §230 immunity. These allegations include design defect claims alleging that defendants failed to use robust age verification, implement adequate parental controls and notifications, provide options to users to self-restrict their time on a platform, remove “barriers to the deactivation and deletion of accounts,” label images and videos that have been edited through appearance-altering filters, and implement protocols to allow users to report child sexual abuse material.27 The court allowed those products liability claims to proceed.
The court’s First Amendment analysis was similarly conduct-oriented. The court noted that “much of the conduct alleged by plaintiffs does not constitute speech or expression, or publication of same,” and thus was not protected by the First Amendment, just as it was not immunized by §230.
Finally, as to the question of whether the defendants’ platforms are products, the court once again examined each feature of each platform on its own terms, finding that each of the above-mentioned defects does implicate products and their design. For example, the court held that defendants’ failure to assist users in limiting their app usage is a “product” defect—analogizing those defects to “physical timers and alarms, which have long been in use.”
Importantly, the court has allowed the plaintiffs’ failure-to-warn claims to move forward in their entirety, with respect to all allegedly defective features of the defendants’ platforms.28 That includes the defendants’ alleged failure to warn the public that the recommendation algorithms their platforms use are designed to maximize user engagement (and company profits), regardless of the collateral damage to children. Because of this aspect of the ruling, liability discovery is now open against all defendants with respect to all allegedly defective aspects of their social media platforms.
State and Local Government Actions
As noted above, hundreds of personal injury lawsuits have been consolidated into MDL 3047. However, these families are not alone in their efforts to hold social media platforms accountable. Public institutions have also joined the fight. A coalition of 34 state attorneys general has filed suit directly into the MDL, alleging that Meta designed Instagram to be addictive, that it concealed health and safety risks from the public, and that it violated the states’ consumer protection laws.29 The court has largely allowed the attorneys general’s claims to proceed, subject to many of the same limitations placed on the personal injury plaintiffs.
An additional nine states, including New Mexico and Tennessee, have filed suits in their respective state courts raising similar claims. Meanwhile, school districts around the country have filed suit into the MDL, claiming Meta, Snap, TikTok, and YouTube have created a public nuisance by targeting their addictive products to kids.30
These districts claim school administrators, counselors, and teachers have been forced to manage the problems caused by social media addiction,31 implementing smartphone bans,32 hiring counseling staff, and dealing with social media-fueled misbehavior in classrooms.33 In October, the court largely denied defendants’ motions to dismiss the school districts’ negligence claims.34 As of the time of publication, no ruling has been issued on the school districts’ public nuisance claims.
How to File a Case in the MDL
The mechanics of filing a personal injury case in MDL 3047 will be familiar to anyone who has filed a case in any other multidistrict litigation.35 Because plaintiffs’ leadership has already filed a comprehensive master complaint, an attorney filing a new case needs to file only a “short-form complaint” identifying the plaintiff, defendants, claims, injuries, and other basic details. Short-form complaints should be filed in the Northern District of California and then uploaded to MDL Centrality here: https://www.mdlcentrality.com/. You need to simultaneously submit a form through MDL Centrality identifying your client’s social media accounts (known as a “plaintiffs’ preservation form”).
You have 105 days after filing the short-form complaint to submit a plaintiff’s fact sheet. The fact sheet should articulate in detail your client’s social media usage, the nature of their injuries, and their medical history and background. Before completing it, the most important information to gather includes which platforms contributed to the harm, details on your client’s usage of those platforms, relevant medical diagnoses and attendant harms, and a list of the plaintiff’s treatment providers.
If you are having clients complete the fact sheet on their own initially, it’s imperative to thoroughly review the responses and ask clarifying questions where necessary. Remind your clients that these are discovery responses and that it is OK to leave questions blank where they don’t know or can’t access the answer, rather than to speculate. If counsel for any plaintiff has questions or concerns regarding the contents or verbiage of the fact sheet, MDL leadership and plaintiffs’ steering committee firms are knowledgeable resources to lean on.
You may find that serving these clients can be challenging. Many young people whose addiction to social media has led to serious injury have also faced confounding trauma, including sexual violence and other forms of abuse. Working with a young, victimized client base requires heightened sensitivity and a trauma-informed approach.36
But the weight of this harm makes it all the more important that the thousands—if not millions—of kids and families across the country who are struggling due to the addictive power of social media and its tendency to lead young people down dangerous rabbit holes have the opportunity to pursue justice.37 MDL 3047 presents a powerful vehicle to help these individuals and hold some of America’s biggest and most profitable companies accountable for their role in perpetuating a national youth mental health crisis.
Previn Warren is a trial lawyer at Motley Rice in Washington, D.C., and can be reached at pwarren@motleyrice.com.
Notes
- Georgia Wells, Jeff Horwitz & Deepa Seetharaman, Facebook Knows Instagram is Toxic for Teen Girls, Company Documents Show, Wall St. J. (Sept. 14, 2021).
- Id.
- Id.
- Id.
- See Molly Russell Inquest: Online Life Was ‘The Bleakest of Worlds’, BBC (Sept. 21, 2022), https://www.bbc.com/news/uk-england-london-62981964.
- Id.
- Dan Milmo, “The Bleakest of Worlds”: How Molly Russell Fell Into a Vortex of Despair on Social Media, The Guardian (Sept. 30, 2022), https://www.theguardian.com/technology/2022/sep/30/how-molly-russell-fell-into-a-vortex-of-despair-on-social-media.
- An inquest is a legal procedure in the United Kingdom where the coroner investigates the cause of death. See Inquests – A Factsheet for Families, The Coroners’ Courts Support Service, https://coronerscourtssupportservice.org.uk/wp-content/uploads/2018/11/CCSS-EL_Inquest_Factsheet_Final29317221_3.pdf.
- Id.; Molly Russell Inquest: Online Life Was ‘The Bleakest of Worlds’, BBC (Sept. 21, 2022), https://www.bbc.com/news/uk-england-london-62981964.
- Molly Russell Inquest: Schoolgirl, 14, Died by Self-harm While Suffering ‘Negative Effects of Online Content’, Coroner Finds, Sky News (Sept. 30, 2022), https://news.sky.com/story/molly-russell-inquest-teenager-died-by-self-harm-while-suffering-negative-effects-of-online-content-coroner-finds-12707322.
- See Natasha Singer, Biden Wants Congress to Reduce the Risks of Social Media for Children, NY Times (Mar. 7, 2024), https://www.nytimes.com/2024/03/07/us/politics/congress-state-union.html.
- Social Media and Youth Mental Health: The U.S. Surgeon General’s Advisory, Department of Health and Human Services, at 13 (May 23, 2023), https://www.hhs.gov/sites/default/files/sg-youth-mental-health-social-media-advisory.pdf; Dr. Vivek H. Murthy, Surgeon General: Why I’m Calling for a Warning Label on Social Media Platforms, NY Times (June 17, 2024), https://www.nytimes.com/2024/06/17/opinion/social-media-health-warning.html.
- See Amy B. Wang, Former Facebook VP Says Social Media is Destroying Society With ‘Dopamine-driven Feedback Loops,’ Washington Post (Dec. 12, 2017), https://www.washingtonpost.com/news/the-switch/wp/2017/12/12/former-facebook-vp-says-social-media-is-destroying-society-with-dopamine-driven-feedback-loops/.
- See FTC Proposes Strengthening Children’s Privacy Rule to Further Limit Companies’ Ability to Monetize Children’s Data, Federal Trade Commission (Dec. 20, 2023), https://www.ftc.gov/news-events/news/press-releases/2023/12/ftc-proposes-strengthening-childrens-privacy-rule-further-limit-companies-ability-monetize-childrens.
- Social media companies often rely on 47 U.S.C. § 230, commonly known as Section 230, to assert that they are not liable for any harms that result from use of their platforms.
- The first case filed alleged that Defendants were liable for the wrongful death of an 11-year-old, “caused by [her] addictive use of and exposure to Defendants’ unreasonably dangerous and defective social media product.” Rodriguez v. Meta Platforms, 22-cv-00401, ECF No. 1 (N.D. Cal. Jan. 20, 2022).
- See Transfer Order at 1, In Re: Social Media Adolescent Addiction/Personal Injury Products Liability Litigation, No. 22-md-03047 (N.D. Cal. Oct. 6, 2022), ECF No. 119.
- In re: Social Media Adolescent Addiction/Personal Injury Products Liability Litigation (MDL No. 3047), U.S. Dist. Ct. Northern Dist. Cal., https://www.cand.uscourts.gov/in-re-social-media-adolescent-addiction-personal-injury-products-liability-litigation-mdl-no-3047/.
- Lawsuit Updates & History, Social Media Victims Law Center (Aug. 7, 2024), https://socialmediavictims.org/social-media-lawsuits/past-updates/.
- Snap, Inc. was formerly known as Snapchat and is the technology company that offers the social media platform Snapchat.
- See Plaintiffs’ Second Amended Master Complaint (Personal Injury), In re Social Media Adolescent Addiction/Personal Injury Products Liability Litigation, 22-MD-003047-YGR (N.D. Cal.), ECF No. 494.
- See, e.g., Eric Goldman, Why Section 230 is Better Than the First Amendment, 95 Notre Dame L. Rev. 33, 36–39 (2019) (arguing that the §230 defense immunizes social media companies against defamation lawsuits, completely protects commercial speech, and is not overcome even when a plaintiff proves that a company had knowledge about the tortious or criminal content in question).
- See Order on Motion to Dismiss, In Re: Social Media Adolescent Addiction/Personal Injury Products Liability Litigation, No. 22-md-03047 (N.D. Cal. Nov. 14, 2023), ECF No. 430.
- See Order on Motion to Dismiss, In Re: Social Media Adolescent Addiction/Personal Injury Products Liability Litigation, No. 22-md-03047 (N.D. Cal. Oct. 15, 2024), ECF No. 1214.
- Id. at 14.
- Id. at 14–15.
- Id. at 51–52.
- See Order on Motion to Dismiss, In Re: Social Media Adolescent Addiction/Personal Injury Products Liability Litigation, No. 22-md-03047 (N.D. Cal. Oct. 15, 2024), ECF No. 1214; Transcript of Proceedings at 25–28, In Re: Social Media Adolescent Addiction/Personal Injury Products Liability Litigation, No. 22-md-03047 (N.D. Cal. Nov. 16, 2023), ECF No. 457; Transcript of Proceedings at 36, In Re: Social Media Adolescent Addiction/Personal Injury Products Liability Litigation, No. 22-md-03047 (N.D. Cal. Mar. 21, 2024), ECF No. 714.
- See Mot. to Dismiss Multistate Attorneys General Compl. and Fl. Attorney General Compl. ECF No. 517 at 26, In re Social Media Addiction, 22-md-3047.
- The school districts’ public nuisance theory is similar in concept to that alleged in In Re: JUUL Labs, Inc., Marketing, Sales Practices, & Products Liability Litigation, 497 F. Supp. 3d 552 (N.D. Cal. 2020). In the same way that JUUL targeted children with their marketing, social media companies targeted children with carefully engineered algorithms, notifications, and other addictive product features.
- See Viral ‘Devious Licks’ TikTok Challenge Encourages Kids to Steal From School, PBS (Oct. 25, 2021), https://www.pbs.org/newshour/show/viral-devious-licks-tiktok-challenge-encourages-kids-to-steal-from-school (discussing a social media challenge that “encouraged students to record themselves stealing or vandalizing school property, then posting the video online”).
- See Meg Oliver & Analisa Novak, Schools Across U.S. Join Growing No-phone Movement to Boost Focus, Mental Health, CBS News (Oct. 20, 2023), https://www.cbsnews.com/news/schools-no-phone-movement-focus-mental-health/ (noting that special “Yondr” pouches used to store students’ phones during the day costs “between $25 and $30” per student).
- See David Ingram, A Teachers Union Says It’s Fed Up With Social Media’s Impact on Students, NBC News (July 20, 2023), https://www.nbcnews.com/tech/social-media/teachers-union-says-s-fed-social-medias-impact-students-rcna95376 (“The nation’s second-largest teachers union said . . . it was losing patience with social media apps that it says are contributing to mental health problems and misbehavior in classrooms nationwide.”).
- See Order on Motion to Dismiss, In Re Social Media Adolescent Addiction/Personal Injury Prods. Liab. Litig., No. 22-md-03047 (N.D. Cal. Oct. 24, 2024), ECF No. 1267.
- The template for a short-form complaint can be found here: https://motleyrice.com/cocounsel/social-media-litigation.
- See, e.g., Rebecca Howlett & Cynthia Sharp, Strategies for a Trauma-Informed Law Practice, American Bar Association (Oct. 26, 2021), https://www.americanbar.org/groups/gpsolo/resources/ereport/archive/strategies-trauma-informed-law-practice/.
- Recent estimates indicate that over one third of 13- to 17-year-olds use TikTok, YouTube, Snapchat, and/or Instagram either “almost constantly” or “several times a day.” See Monica Anderson, Michelle Faverio & Jeffrey Gottfried, Teens, Social Media and Technology 2023, Pew Research Center (Dec. 11, 2023), https://www.pewresearch.org/internet/2023/12/11/teens-social-media-and-technology-2023/. Meanwhile, “one in three high school students and half of female students reported persistent feelings of sadness or hopelessness, an overall increase of 40% from 2009.” Protecting Youth Mental Health: The U.S. Surgeon General’s Advisory, Department of Health and Human Services, at 3, https://www.hhs.gov/sites/default/files/sg-youth-mental-health-social-media-advisory.pdf. These statistics are anything but unrelated.