Jury Finds Meta Liable for Harming Children in New Mexico Case
TL;DR
A New Mexico jury found Meta liable on all counts of violating the state's Unfair Practices Act, ordering the company to pay $375 million for concealing known risks of child sexual exploitation and mental health harms on Facebook and Instagram. The landmark verdict—the first jury finding of its kind against a social media company—signals a potential inflection point in litigation that legal experts compare to the reckoning faced by tobacco and opioid manufacturers, with more than 40 state attorneys general pursuing similar cases.
On March 24, 2026, a Santa Fe jury delivered what Meta had spent years and millions of dollars trying to prevent: a finding that the company knowingly harmed children and lied about it. The jury found Meta liable on all counts of violating New Mexico's Unfair Practices Act, ordering the company to pay $375 million in civil penalties. The verdict marks the first time a jury has held a social media company accountable for child safety failures—a legal milestone that could reshape how the technology industry operates.
The decision came after a seven-week trial in which New Mexico Attorney General Raúl Torrez presented evidence that Meta concealed what it knew about child sexual exploitation on Facebook and Instagram, failed to enforce its own age restrictions, and used algorithms that connected predators with minors. Meta has said it will appeal.
What the Jury Found
The jury determined that Meta willfully engaged in both "unfair and deceptive" and "unconscionable" trade practices under New Mexico's consumer protection law. Specifically, jurors agreed that Meta:
- Made false or misleading statements about the safety of its platforms
- Engaged in unconscionable practices that exploited the vulnerabilities and inexperience of children
- Failed to enforce its own prohibition on users under age 13
- Used algorithms that prioritized sensational and harmful content over user safety
Jurors identified thousands of separate violations. Under the state's Unfair Practices Act, each violation carries a maximum penalty of $5,000, and the cumulative total reached $375 million. New Mexico prosecutors had originally sought approximately $1.9 billion—roughly five times the amount awarded.
The Evidence: Undercover Stings and Whistleblower Testimony
The state's case rested on two pillars. The first was an undercover investigation in which New Mexico agents created social media accounts posing as children to document how quickly predators could find and contact them. The fake accounts were contacted and solicited for sex by three New Mexico men, two of whom were arrested at a motel where they believed they would meet a 12-year-old girl.
The second was testimony from former Meta employees. Arturo Bejar, a former Meta engineering director turned whistleblower, testified about his efforts to warn company executives after his own 14-year-old daughter received sexual solicitations on Instagram. Bejar described how Meta's recommendation algorithms, designed to match users with content aligned to their interests, also served predators: "The product is very good at connecting people with interests, and if your interest is little girls, it will be really good at connecting you with little girls."
Prosecutors also introduced internal Meta communications showing that employees discussed how CEO Mark Zuckerberg's 2019 decision to make Facebook Messenger end-to-end encrypted by default would eliminate the company's ability to report approximately 7.5 million instances of child sexual abuse material (CSAM) to law enforcement. This evidence suggested Meta leadership weighed the loss of CSAM detection capability against the business benefits of encryption and chose to proceed.
Beyond the New Mexico trial, six whistleblowers with a combined 20 years of experience at Meta filed disclosures with Congress, the SEC, and the FTC alleging that Meta deleted or doctored internal safety research showing children—some as young as 10—were exposed to grooming, sexual harassment, and violence on its platforms. According to these whistleblowers, Meta's legal team "manipulated research methods and buried negative data," instructing researchers to avoid asking teen survey participants questions that might reveal harms.
How New Mexico Overcame Section 230
For nearly three decades, Section 230 of the Communications Decency Act has shielded internet platforms from liability for content posted by users. Meta argued it was protected by both Section 230 and the First Amendment's free speech guarantees.
New Mexico prosecutors used a legal strategy that has been tested across multiple state cases: rather than suing over specific user-generated content, they targeted Meta's own conduct—its design decisions, algorithmic amplification, and deceptive marketing. The court denied Meta's motion to dismiss based on Section 230 immunity, ruling that the state's claims addressed the company's business practices, not its role as a passive host of third-party speech.
This distinction—between what users post and what a platform's algorithms do with those posts—has become the central legal theory in social media litigation nationwide. As PBS reported, prosecutors contend that Meta "remains responsible for algorithmic amplification of harmful content," which is a fundamentally different act than simply hosting user speech.
$375 Million in Context
The $375 million penalty is significant as a legal precedent but modest relative to Meta's finances. In the fourth quarter of 2025, Meta reported revenue of $59.89 billion and net income of $22.77 billion. The penalty amounts to roughly 0.6% of a single quarter's revenue, or about 1.6% of quarterly profit.
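As a quick sanity check, the proportions above can be reproduced directly from the reported figures (a back-of-envelope sketch, not financial analysis):

```python
# Back-of-envelope check of the figures cited in this article (all USD).
penalty = 375_000_000            # New Mexico jury award
q4_revenue = 59_890_000_000      # Meta's reported Q4 2025 revenue
q4_net_income = 22_770_000_000   # Meta's reported Q4 2025 net income
max_per_violation = 5_000        # statutory cap under the Unfair Practices Act

print(f"Share of quarterly revenue: {penalty / q4_revenue:.1%}")    # ~0.6%
print(f"Share of quarterly profit:  {penalty / q4_net_income:.1%}")  # ~1.6%

# The total also implies at least this many violations, since each
# violation could contribute no more than the $5,000 statutory maximum:
print(f"Minimum implied violations: {penalty // max_per_violation:,}")  # 75,000
```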
The financial threat to Meta lies not in this single verdict but in what it enables. More than 40 state attorneys general have filed lawsuits against Meta making similar claims about deliberately designing addictive features that harm young users. If other states achieve comparable outcomes using similar legal theories, the cumulative liability could reach into the tens of billions.
Additionally, the New Mexico case has a second phase. Beginning May 4, a judge—not a jury—will determine whether Meta's platforms created a public nuisance and whether the company should fund public programs to address the documented harms. That phase could add further financial obligations and, more importantly, impose operational requirements on how Meta runs its platforms.
Meta also faces a separate bellwether trial in federal court in Los Angeles, where a California jury has been deliberating on whether Meta and YouTube are liable for child harms in the consolidated multidistrict litigation (MDL 3047). That case could shape the trajectory of thousands of individual and school district lawsuits.
Meta's Defense
Meta mounted a defense centered on three arguments. First, the company pointed to its safety investments. Attorney Kevin Huff told the jury that Meta employs 40,000 people working on platform safety and has built automated tools to detect and remove harmful content. Huff argued the company has been transparent that "some bad actors and inappropriate content can slip through its safety filters."
Second, Meta has rolled out safety features for minors in recent years, including parental controls, restrictions on who can contact teen accounts, and time-use limits. CEO Mark Zuckerberg acknowledged the difficulty of enforcing the under-13 ban, noting that "a meaningful number of people lie about their age to use our services."
Third, Meta raised constitutional concerns. The company argued that its content moderation decisions constitute protected speech under the First Amendment—that just as individuals' speech is protected from government censorship, a platform's editorial decisions about content deserve similar protection. Meta's attorneys accused prosecutors of cherry-picking evidence and conducting what they called a flawed investigation.
A Meta spokesperson said after the verdict: "We work hard to keep people safe on our platforms and are clear about the challenges of identifying and removing bad actors or harmful content. We will continue to defend ourselves vigorously, and we remain confident in our record of protecting teens online."
The Tobacco and Opioid Playbook
Legal observers have drawn direct comparisons between the social media litigation wave and the cases that brought accountability to the tobacco and pharmaceutical industries.
Attorney Jayne Conroy, who served on plaintiffs' teams in opioid litigation and is now involved in social media cases, identified the structural parallel: "The cornerstone of both cases is the same: addiction." She outlined the opioid template—"manufacturers and distributors knew about the risks, they downplayed them, they oversupplied, and people died"—and argued the social media version is identical: "These companies knew about the risks, they have disregarded the risks, they doubled down to get profits from advertisers over the safety of kids."
The legal strategy mirrors the tobacco cases in another respect. Tobacco litigation succeeded in part because state attorneys general, rather than individual plaintiffs, brought the claims—aggregating harm across entire populations rather than trying to prove causation for individual smokers. New Mexico's case follows the same model, suing on behalf of all state residents rather than specific children.
If social media companies lose the pending bellwether cases, they will face pressure to reach industry-wide settlements and restructure how minors interact with their platforms—a scenario that could echo the Master Settlement Agreement that reshaped the tobacco industry in 1998.
What Comes Next
Meta has confirmed it will appeal the verdict. Appeals in New Mexico state courts typically take one to two years, and Meta will likely argue that the trial court erred in denying Section 230 immunity and that the evidence was insufficient to support the jury's findings.
Before the appeal, the second phase of the New Mexico trial begins May 4, when a judge will consider the public nuisance claims and potential injunctive relief. If the judge orders Meta to fund remedial programs or change its platform operations in New Mexico, those requirements could become a template for other states.
The federal MDL bellwether trial in Los Angeles remains pending as of late March 2026. Its outcome will directly affect the trajectory of hundreds of consolidated lawsuits from families and school districts.
Meanwhile, legislative action continues to accelerate. Multiple states have passed or are considering laws restricting minors' access to social media, and federal legislation remains under discussion. The combination of jury verdicts, legislative action, and regulatory pressure creates what insurance industry analysts have described as a convergence of legal risk unlike anything the technology sector has previously faced.
For Meta, the $375 million penalty is manageable. The precedent is not. A company that generated $201 billion in revenue in 2025 can absorb a single nine-figure judgment. What it cannot easily absorb is a legal framework, now validated by a jury, that holds platforms responsible not just for what users post but for the design decisions that determine what users see—and who finds whom.
Limitations of Available Evidence
Several aspects of this case remain unclear from public reporting. The exact number of New Mexico children affected by exploitation on Meta's platforms has not been specified in available coverage. The full scope of internal documents presented at trial is not publicly available, as some evidence was introduced under seal. Meta's detailed safety spending figures—beyond the 40,000-employee claim—have not been independently verified. The second phase of the trial, beginning in May, may produce additional findings that alter the picture presented here.
Sources (13)
- [1] Meta must pay $375 million for violating New Mexico law in child exploitation case, jury rules (cnbc.com)
A jury found Meta violated New Mexico law, ordering $375 million in damages for failing to warn users and protect children from sexual predators on its platforms.
- [2] Jury finds Meta liable in case over child sexual exploitation on its platforms (cnn.com)
The jury found Meta liable on all counts after a seven-week trial, with evidence including undercover investigations and internal communications about encryption's impact on CSAM reporting.
- [3] New Mexico jury says Meta harms children's mental health and safety, violating state law (npr.org)
The jury found thousands of violations at $5,000 each, totaling $375 million. Prosecutors had sought approximately $1.9 billion.
- [4] Santa Fe jury awards New Mexico $375M in Meta child exploitation case (sourcenm.com)
New Mexico Attorney General Raúl Torrez's 2023 lawsuit alleged Meta violated consumer protection laws and misled the public on risks for teen users.
- [5] New Mexico jury finds Meta violated consumer protection law over child exploitation claims (cbsnews.com)
Meta acknowledged difficulty enforcing age restrictions, with Zuckerberg noting a 'meaningful number of people lie about their age.'
- [6] Meta's Unsealed Internal Documents Prove Years of Deliberate Harm and Inaction to Protect Minors (techoversight.org)
Six whistleblowers filed disclosures revealing Meta deleted or doctored internal safety research showing children exposed to grooming and exploitation.
- [7] Jury finds Meta's platforms are harmful to children in 1st wave of social media addiction lawsuits (pbs.org)
Over 40 state attorneys general have filed suits against Meta. Prosecutors argue algorithmic amplification creates separate liability from passive content hosting.
- [8] New Mexico Department of Justice Wins as Court Denies Meta's Motion to Dismiss (nmdoj.gov)
The court denied Meta's claim of immunity under Section 230, ruling the state's claims targeted Meta's business practices rather than its role hosting user content.
- [9] Meta Reports Fourth Quarter and Full Year 2025 Results (investor.atmeta.com)
Meta reported Q4 2025 revenue of $59.89 billion (up 24% YoY) and full-year 2025 revenue of $200.97 billion, with net income of $22.77 billion in Q4.
- [10] Meta ordered to pay $375 million in New Mexico trial over child exploitation, user safety claims (nbcnews.com)
The verdict marks the first time Meta has been held accountable in a jury trial for child safety failures on its platforms.
- [11] Social Media Addiction Lawsuits (2026): KGM Trial, MDL 3047, and Settlements Explained (spencer-law.com)
Multiple bellwether trials are underway in 2026, with the KGM v. Meta case in Los Angeles proceeding alongside the New Mexico state trial.
- [12] Jury begins deliberations in landmark trial over children's safety risks on Meta (theindianalawyer.com)
Meta attorney Kevin Huff argued the company employs 40,000 people on safety and has built automated tools, while acknowledging some harmful content evades filters.
- [13] Big tech's tobacco or opioid moment? 'Reckoning' seen in swirl of social media addiction trials (fortune.com)
Attorney Jayne Conroy drew direct parallels between opioid and social media litigation: 'The cornerstone of both cases is the same: addiction.'