The $6 Million Verdict That Could Cost Big Tech Billions: Inside the Landmark Social Media Addiction Ruling
On March 25, 2026, a Los Angeles jury delivered a verdict that the technology industry had spent years trying to prevent. After more than 40 hours of deliberation across nine days, the jury found Meta and Google negligent in the design of Instagram and YouTube, concluding that both platforms were deliberately engineered to be addictive to children [1]. The jury awarded plaintiff K.G.M. — a now-20-year-old California woman — $3 million in compensatory damages and $3 million in punitive damages [2].
The dollar amount itself is modest relative to the companies' revenues. But the legal reasoning behind it — that social media platforms can be treated as defective products under product liability law — has the potential to reshape the entire industry. More than 2,400 similar cases are consolidated in a federal multidistrict litigation (MDL 3047), and legal experts say this bellwether verdict will serve as a template for thousands more [3].
The Plaintiff and the Damages
K.G.M., identified during trial as Kaley, began using YouTube at age 6 and Instagram at age 9 [1]. By the time she finished elementary school, she had posted 284 videos on YouTube [4]. Her attorneys argued that the platforms' design features contributed to depression, anxiety, and suicidal ideation.
The jury apportioned liability at 70% for Meta and 30% for Google. In compensatory damages, Meta owes $2.1 million and Google $900,000. The punitive damages followed the same split: $2.1 million from Meta and $900,000 from Google [2].
For context, Meta reported full-year 2025 revenue of $201 billion, while Alphabet (Google's parent) reported $403 billion [5][6]. The $6 million total verdict represents roughly 0.001% of the companies' combined annual revenue — a rounding error in financial terms. But the verdict's significance lies not in this individual award but in the legal pathway it opens. If similar damages were applied across even a fraction of the 2,400+ pending cases, the cumulative exposure would be measured in billions.
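The apportionment and the revenue comparison above are simple arithmetic, and they check out. A minimal sketch, using the rounded figures reported in the article (illustrative only, not financial data from the filings):

```python
# Sanity-check the damages split and revenue comparison reported above.
# Figures are the article's rounded numbers, not exact filing data.

compensatory = 3_000_000   # total compensatory damages, USD
punitive = 3_000_000       # total punitive damages, USD
split = {"Meta": 0.70, "Google": 0.30}

# Each damages pool is divided 70/30 between the two defendants.
shares = {co: (compensatory * f, punitive * f) for co, f in split.items()}
assert shares["Meta"] == (2_100_000.0, 2_100_000.0)
assert shares["Google"] == (900_000.0, 900_000.0)

# Total verdict as a fraction of combined FY2025 revenue (Meta + Alphabet).
total_verdict = compensatory + punitive        # $6M
combined_revenue = 201e9 + 403e9               # $604B
pct = 100 * total_verdict / combined_revenue
print(f"{pct:.4f}%")                           # ≈ 0.001%, per the article
```

Running it confirms the verdict amounts to roughly 0.001% of the companies' combined annual revenue, as stated.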
One day before the Los Angeles verdict, a separate New Mexico jury ordered Meta to pay $375 million in civil penalties for violating state consumer protection law by failing to protect children from predators on Facebook and Instagram [7]. That case, brought by New Mexico Attorney General Raúl Torrez, marked the first time a state prevailed at trial against a tech company for harming young people [8].
The Features on Trial
The legal team representing K.G.M. pursued a strategy that distinguished this case from prior failed attempts to hold platforms accountable. Rather than targeting the content users encounter on social media — which is shielded by Section 230 of the Communications Decency Act — the attorneys focused on platform design itself [9].
The specific features the jury found defective included:
- Infinite scroll: The bottomless feed that eliminates natural stopping points, which plaintiff's attorneys compared to a "digital casino" [1]
- Autoplay: Videos that begin playing automatically, reducing the friction required to consume more content [1]
- Push notifications: Persistent alerts designed to pull users back to the app [4]
- Beauty filters: Features that plaintiff's attorneys argued amplify body dysmorphia, particularly among young girls [10]
- Algorithmic recommendations: Systems that learn user preferences and serve increasingly targeted content to maximize engagement [9]
The judge instructed the jury that the method of content delivery constitutes a separate legal question from the content itself [9]. This distinction was critical: it meant Meta and Google could not invoke Section 230 to shield their design choices. No appellate court has yet ruled on this question, making it a near-certain focus of the companies' planned appeals [11].
What the Companies Knew
The trial surfaced internal documents that became central to the jury's finding of "malice, oppression or fraud" — the standard required for punitive damages under California law [10].
Among the most damaging exhibits was a Meta memo stating: "If we wanna win big with teens, we must bring them in as tweens" [1]. Other documents showed that Meta's own data indicated 11-year-olds were four times more likely to return to Instagram compared with competing apps — despite the platform's minimum age requirement of 13 [4]. Internal research also revealed that Meta knew approximately 30% of 10- to 12-year-olds in the United States were using Instagram [4].
Meta CEO Mark Zuckerberg testified during the six-week trial, defending the company's record. "If people feel like they're not having a good experience, why would they keep using the product?" he said [12]. Defense attorneys argued that K.G.M.'s mental health difficulties stemmed from her home environment and the COVID-19 pandemic, not social media use. They noted that K.G.M.'s own therapist had never documented social media as a contributing factor to her condition [1].
Therapist Victoria Burke, who did testify for the plaintiff, stated that for K.G.M., social media use and self-image "were closely related" and that platform activity could "make or break her mood" [10].
The Legal Framework: Product Liability Meets Big Tech
The K.G.M. case used a product liability framework — the same legal theory applied to defective cars, dangerous pharmaceuticals, and faulty consumer products. Jurors were asked to determine whether the platforms had "design defects which means they are addictive" and whether Meta and YouTube "knew the design or operation of their platforms was dangerous or was likely to be dangerous when used by a minor" [9][3].
The jury answered yes on both counts, finding the companies negligent and additionally liable for failure to warn [10].
This approach has drawn inevitable comparisons to two prior waves of mass litigation: the 1990s tobacco cases and the 2010s–2020s opioid lawsuits.
The parallels are substantive, not just rhetorical. In tobacco litigation, plaintiffs proved that cigarette manufacturers knew their products were addictive, deliberately targeted young users, and publicly denied the risks. Attorneys representing social media plaintiffs have argued the same pattern applies. "The manufacturers, the distributors, the pharmacies, they knew about the risks, they downplayed them, they oversupplied," said one attorney who previously worked on opioid cases. "Here, it is very much the same thing. These companies knew about the risks, they have disregarded the risks, they doubled down to get profits from advertisers over the safety of kids" [13].
But significant differences remain. The tobacco Master Settlement Agreement (MSA) of 1998 involved 46 state attorneys general and resulted in a $206 billion industry-wide payout over 25 years — a figure that reflected decades of documented causation between smoking and disease. The evidentiary link between social media use and specific mental health outcomes is contested. Zuckerberg testified that "scientific studies have not proved the link between social media and mental health harms" [14], and some independent researchers have echoed concerns that the causal evidence is weaker than advocates suggest.
"There's a long way to go before you see something akin to the master settlement that this is often analogized to in the tobacco and opioid litigation," one legal expert noted after the verdict [13].
The Cascade: Who's Next?
TikTok and Snapchat were originally named as co-defendants in the K.G.M. case. Both settled confidentially before trial — Snap approximately one week before proceedings began on January 22, 2026, and TikTok on January 27, the day jury selection was scheduled to start [3]. Neither settlement constitutes an admission of liability.
The broader MDL 3047 litigation encompasses more than 10,000 individual personal injury cases, nearly 800 school district lawsuits, and actions by attorneys general from more than 41 U.S. states [3]. The K.G.M. trial was the first of three bellwether cases scheduled in Los Angeles, with additional bellwether trials planned for the summer in federal court [16].
Clay Calvert, a nonresident senior fellow in technology policy studies at the American Enterprise Institute, told CBS News that the verdict "definitely could open the floodgates of litigation" and "will certainly trigger more" families to take legal action [10].
Every major social media platform uses some combination of the features found defective in K.G.M. — infinite scroll, autoplay, algorithmic recommendations, and notification systems. Google's defense attempted to distinguish YouTube as "a responsibly built streaming platform, not a social media site" [17], but the jury rejected that argument. Whether platforms like X (formerly Twitter), Pinterest, or Discord could mount more successful distinctions remains untested.
Design Changes and the Engagement Dilemma
The K.G.M. verdict does not by itself mandate specific design changes — it is a damages verdict, not an injunction. However, the legal risk it establishes creates strong incentives for platforms to modify the features the jury found defective, particularly for minor users.
Some changes are already underway independent of this litigation. California passed a law in September 2024 making it illegal for minors' social media accounts to include "addictive feeds" without parental consent [14]. New York enacted a law in December 2025 requiring social media platforms to display mental health warning labels [14]. Australia banned social media for children under 16 in late 2024.
The fundamental business question is whether platforms can remove or modify these features without destroying the engagement metrics that drive their advertising revenue. Meta generated approximately $195 billion of its $201 billion in 2025 revenue from advertising [5]. Alphabet's advertising revenue similarly constitutes the vast majority of its $403 billion total [6]. Features like infinite scroll and algorithmic recommendations exist because they maximize time on platform, which maximizes ad impressions. Removing them for users under 18 — or for all users — would almost certainly reduce engagement.
Meta has pointed to voluntary measures it has already implemented, including time-limit reminders and default notification restrictions for teen accounts. Critics argue these measures constitute what one researcher termed "compliance theater" — changes that generate positive headlines without materially altering the addictive dynamics of the product.
The Evidence Gap and the Question of Causation
The effectiveness of design modifications in reducing mental health harms is an open empirical question. The U.S. Surgeon General issued an advisory in 2023 stating that "we do not yet have enough evidence to determine if social media is sufficiently safe for children and adolescents" [15]. Research points to elevated risks for adolescent girls and for adolescents already experiencing poor mental health, particularly regarding cyberbullying-related depression, body image issues, disordered eating, and poor sleep quality [15].
But research also shows that the relationship between social media and mental health is not uniform. Some studies find modest effect sizes; others find none. The causal direction is debated: do vulnerable adolescents gravitate toward social media, or does social media make adolescents vulnerable?
Age verification has emerged as an alternative regulatory approach, but it carries its own complications. Social media companies have argued that effective age verification requires collecting biometric or identity data that undermines the privacy of all users, including adults [14]. In a Common Sense Media survey, 29% of respondents said their top concern is that age verification systems are easy for children to bypass [18].
The Broader Liability Question
The verdict raises a more fundamental question about where liability should sit in the digital ecosystem. Social media platforms designed the features the jury found defective, but they operate within a broader environment that includes device manufacturers who enable 24/7 access, parents who provide devices to children, schools that have been slow to develop digital literacy curricula, and a regulatory framework that has largely left platforms to self-govern.
Meta's defense in the K.G.M. case explicitly raised this point, arguing that the plaintiff's difficulties were "the result of her fractious home life" rather than platform design [17]. The defense position — that mental health is "profoundly complex and cannot be linked to a single app" [11] — resonates with some researchers who caution against monocausal explanations for the adolescent mental health crisis.
The counterargument, which the jury appears to have accepted, is that platform designers have the most direct control over the features that enable compulsive use. A car manufacturer cannot escape liability for defective brakes by arguing that the driver should have been more careful. By finding Instagram and YouTube defective as designed, the jury applied that same logic to social media.
Whether appellate courts uphold this reasoning will determine whether the K.G.M. verdict becomes a footnote or a turning point. The Section 230 question — whether platform design choices fall outside the statute's protections — has not been resolved at the appellate level [11]. Both Meta and Google have announced they will appeal [11].
What Comes Next
The K.G.M. verdict is one data point in a litigation campaign that will take years to fully resolve. Two more bellwether trials are scheduled in Los Angeles, with federal bellwether trials to follow [16]. The New Mexico verdict adds a separate $375 million judgment to Meta's legal exposure [7]. State attorneys general from more than 41 states have their own pending actions [3].
Rob Nicholls, a senior research associate in media and communications at the University of Sydney, has described the verdict as establishing "a legal template potentially enabling thousands of additional cases through class actions and individual lawsuits globally" [9].
For the technology industry, the most immediate concern may not be the damages themselves but the discovery process. Each new trial brings the potential for additional internal documents to surface — documents that reveal what executives knew, when they knew it, and what they chose to do about it. In the tobacco litigation, it was the industry's own internal research that proved most devastating. The social media industry may be learning the same lesson.
The jury foreman in the K.G.M. case offered a simple summary of the nine-day deliberation: "We stuck to following the law and how it was presented to us" [1]. The question now is whether other juries, in other courtrooms, will reach the same conclusion.
Sources (18)
- [1] Jury finds Meta and Google negligent in social media harms trial (npr.org)
A Los Angeles jury found Meta and Google liable in a closely watched trial accusing social media platforms of designing their products to get young users addicted, awarding plaintiff K.G.M. $6 million in damages.
- [2] Jury finds Meta, Google liable in landmark social media addiction trial, awards more than $6M in damages (foxbusiness.com)
Meta was ordered to pay 70% of compensatory damages ($2.1M), while Google is responsible for 30% ($900K), plus additional punitive damages totaling $3 million.
- [3] Social Media Addiction Lawsuits (2026): KGM Trial, MDL 3047, and TikTok & Snapchat Settlements Explained (spencer-law.com)
As of early 2026, the litigation includes more than 10,000 individual personal injury cases, nearly 800 school district lawsuits, and actions by attorneys general from more than 41 U.S. states. TikTok and Snapchat settled the KGM case before trial.
- [4] Jury in Los Angeles finds Meta, YouTube negligent in social media addiction trial (cnbc.com)
Internal documents showed 11-year-olds were four times more likely to return to Instagram versus competitors. Meta knew approximately 30% of 10- to 12-year-olds in the U.S. were using Instagram despite the 13+ age requirement.
- [5] Meta Reports Fourth Quarter and Full Year 2025 Results (investor.atmeta.com)
Meta's full-year 2025 revenue was $200.97 billion, a 22% year-over-year increase, with approximately $195 billion from advertising.
- [6] Alphabet (GOOGL) Q4 2025 earnings (cnbc.com)
Alphabet's annual revenue for fiscal year 2025 was $402.8 billion, a 15% year-over-year increase, exceeding $400 billion for the first time.
- [7] Meta must pay $375 million for violating New Mexico law in child exploitation case, jury rules (cnbc.com)
A New Mexico jury ordered Meta to pay $375 million in civil penalties — $5,000 per violation — for failing to safeguard children from predators on Facebook and Instagram.
- [8] New Mexico Department of Justice Wins Landmark Verdict Against Meta (nmdoj.gov)
New Mexico becomes the first state in the nation to prevail at trial against a major tech company for harming young people.
- [9] Meta and Google just lost a landmark social media addiction case. A tech law expert explains the fallout (theconversation.com)
The judge instructed jurors that content delivery mechanisms constitute a separate consideration from the content itself, significantly limiting platforms' ability to rely on Section 230 protections.
- [10] Meta and YouTube found liable on all charges in landmark social media addiction trial (cbsnews.com)
Jurors found defendants liable on negligence and failure-to-warn claims, determining the companies acted with 'malice, oppression or fraud.' Expert Clay Calvert warned the verdict 'could open the floodgates of litigation.'
- [11] US Jury Verdicts Against Meta, Google Tee Up Fight Over Tech Liability Shield (insurancejournal.com)
Several lower courts have ruled that companies' platform design choices are not protected by Section 230, but no appellate court has weighed in. The appeals are expected to center on this legal shield.
- [12] Zuckerberg grilled about Meta's strategy to target 'teens' and 'tweens' (npr.org)
Zuckerberg defended Meta during trial testimony, stating: 'If people feel like they're not having a good experience, why would they keep using the product?'
- [13] Meta, YouTube found negligent in landmark social media addiction lawsuit (finance.yahoo.com)
Lawyers who worked on opioid cases drew parallels: 'These companies knew about the risks, they have disregarded the risks, they doubled down to get profits from advertisers over the safety of kids.'
- [14] Social media companies are fighting the 'age verification trap' (fortune.com)
California passed a law in 2024 banning addictive feeds for minors without parental consent. New York enacted mental health warning label requirements in December 2025. Age verification raises privacy concerns.
- [15] Social Media and Youth Mental Health - U.S. Surgeon General's Advisory (hhs.gov)
The Surgeon General stated 'we do not yet have enough evidence to determine if social media is sufficiently safe for children and adolescents,' noting elevated risks for adolescent girls.
- [16] 1 down, 1000s to go: A landmark verdict could reshape social media (cnn.com)
The Los Angeles jury verdict is the first of three bellwether trials, with more bellwether trials to follow in summer in the federal case.
- [17] Big Tech just lost a 'social media addiction' case. It may not be the last. (poynter.org)
Google asserted 'This case misunderstands YouTube, which is a responsibly built streaming platform, not a social media site.' Defense argued plaintiff's issues stemmed from 'fractious home life.'
- [18] Age-verification requirements for social media spark new privacy concerns (marketplace.org)
29% of respondents in a Common Sense Media survey said they are most concerned that age verification systems are easy for children to bypass.