Back-to-back jury verdicts hit Meta with $375 million, find Meta and YouTube negligent
In the span of two days, two U.S. juries delivered verdicts that could reshape how courts treat social media’s impact on children. On March 24, 2026, a Santa Fe jury ordered Meta to pay $375 million for violating New Mexico’s consumer protection laws. The next day, a Los Angeles jury found Meta and Google’s YouTube negligent in the design of their platforms, awarding almost $6 million in damages to a single plaintiff.
The figures drew headlines, even as investors appeared unfazed. A $375 million penalty is less than 2% of Meta’s $22.8 billion net income in 2025, and the company—valued at roughly $1.5 trillion—saw its shares rise 5% on the day of the New Mexico verdict. The rulings themselves do not automatically change how the platforms operate; a financial penalty does not rewrite code or alter algorithms.
Meta and Google have signaled they will appeal, with First Amendment challenges to the product-design theory likely to be a central battleground. The companies’ lawyers are expected to argue that the science linking platform design to mental health harm remains contested and that safety measures are already in place.
In the meantime, Instagram, Facebook and YouTube will continue to operate as they did before the verdicts.

Beyond the headlines, the New Mexico case stands out for how it sought to sidestep Section 230 of the Communications Decency Act, which for three decades has broadly shielded internet platforms from liability for content posted by users.
Rather than suing over user-generated content, New Mexico Attorney General Raúl Torrez accused Meta of deceiving consumers about the safety of its products, advancing a consumer-protection claim under the state’s Unfair Practices Act. Filed in December 2023, the complaint posed a straightforward question: Did Meta knowingly lie to New Mexico consumers about platform safety?
The jury answered yes on all counts, finding liability on three distinct legal theories under the Unfair Practices Act.
First, jurors found straightforward deception. Public statements, including CEO Mark Zuckerberg's congressional testimony that research on the platforms' addictiveness was inconclusive, and parental guidance materials that omitted known risks of grooming and sexual exploitation, amounted to representations made in connection with a commercial transaction. New Mexico argued that users "pay" for Meta's platforms with their data, which is converted into advertising revenue, and that misrepresentations in this data-for-services exchange are actionable regardless of Section 230.

Second, jurors found an unfair practice: conduct offensive to public policy even if not technically deceptive.
Evidence at trial centered on what Meta's own engineers and executives knew and then ignored. Internal documents showed repeated warnings, including alarms about child sexual abuse material proliferating on the platform.

The twin verdicts could open the door to hundreds or even thousands of similar cases.
For now, the legal landscape is shifting faster than the products themselves. The ultimate test of whether these rulings will change how platforms are designed and marketed to young users is likely to unfold over years of appeals and follow-on litigation.
