
Social Media Rulings in 2026: California and New Mexico
April 12, 2026
Recent rulings in both New Mexico and California signal a meaningful shift in how courts are approaching the responsibility of social media companies. Rather than treating platforms as passive hosts of user-generated content, these cases focus on how corporate decisions—particularly around design, safety, and transparency—may contribute to real-world harm, especially for minors.
In New Mexico, a 2026 jury verdict against Meta Platforms Inc. marked one of the first times a social media company was held liable under state law for harms linked to child safety and mental health. The court found that Meta had knowingly exposed minors to risks, including exploitation, while also misleading users about the effectiveness of its safety measures (Associated Press, 2026). The jury concluded that these actions constituted deceptive trade practices, emphasizing that the company failed to adequately disclose known dangers associated with its platforms (Associated Press, 2026). As a result, Meta was ordered to pay approximately $375 million in penalties (Le Monde, 2026). The ruling is widely viewed as significant because it shifts the legal focus toward product design and corporate knowledge, rather than toward user-generated content alone.
Meta responded by denying wrongdoing and asserting that it has invested substantially in tools and policies designed to protect younger users. The company stated that it disagrees with the verdict and intends to appeal, maintaining that the decision does not accurately reflect its efforts to improve user safety (Associated Press, 2026). This response aligns with a broader pattern among technology companies, which often emphasize ongoing safety investments while challenging legal findings of liability.
A separate but related case in California, K.G.M. v. Meta & YouTube, further expands the legal scrutiny facing social media platforms. In this case, a Los Angeles jury found both Meta Platforms Inc. and Google LLC (through its YouTube platform) liable for negligence tied to the design of their services (The Guardian, 2026; Novak Jones, 2026). The court determined that features such as infinite scrolling, autoplay, and algorithmic recommendation systems were intentionally designed to maximize user engagement, but also contributed to harmful outcomes for young users, including anxiety, depression, and body image issues (The Guardian, 2026). Importantly, the jury found that the companies failed to provide adequate warnings about these risks, and that their design choices were a substantial factor in causing harm (The Times of India, 2026).
The damages awarded in the California case totaled approximately $6 million, with Meta assigned the majority of the liability and Google responsible for the remainder (The Guardian, 2026). While smaller in financial scale than the New Mexico ruling, the case is considered legally groundbreaking because it targets platform design rather than content moderation, potentially challenging long-standing legal protections such as Section 230 of the Communications Decency Act (Novak Jones, 2026).
Both companies rejected the California verdict. Meta argued that mental health outcomes are influenced by a wide range of factors and should not be attributed to any single platform. Similarly, Google defended YouTube as a responsibly designed service and disputed claims that its features directly caused harm. Both companies indicated that they plan to appeal the decision (New York Post, 2026).
Taken together, the New Mexico and California rulings illustrate an emerging legal trend: courts are increasingly willing to treat social media platforms as products whose design can create foreseeable risks. The decisions highlight common themes, including failure to warn users, prioritization of engagement over safety, and internal awareness of potential harms. At the same time, the consistent response from companies—denial of liability, emphasis on safety efforts, and plans to appeal—suggests that the legal battle over platform responsibility is far from settled. If these rulings are upheld, they could have far-reaching implications for how social media companies design their products and how regulators approach digital safety in the future.
References
Associated Press. (2026, March 24). New Mexico jury says Meta harms children’s mental health and safety, violating state law. https://www.wuft.org/2026-03-24/new-mexico-jury-says-meta-harms-childrens-mental-health-and-safety-violating-state-law
Le Monde. (2026, March 25). Meta found liable for endangering children in New Mexico in ‘historic’ verdict. https://www.lemonde.fr/en/pixels/article/2026/03/25/meta-found-liable-for-endangering-children-in-new-mexico-in-historic-verdict_6751796_13.html
New York Post. (2026, March 25). Historic social media addiction ruling against Meta and Google could open legal floodgates. https://nypost.com/2026/03/25/business/historic-social-medial-addiction-ruling-against-meta-google-could-open-legal-floodgates/
Novak Jones, D. (2026, March 26). US jury verdicts against Meta, Google tee up fight over tech liability shield. Reuters. https://www.reuters.com/sustainability/boards-policy-regulation/us-jury-verdicts-against-meta-google-tee-up-fight-over-tech-liability-shield-2026-03-26/
The Guardian. (2026, March 25). Meta and YouTube designed addictive products that harmed young people, jury finds. https://www.theguardian.com/media/2026/mar/25/jury-verdict-us-first-social-media-addiction-trial-meta-youtube
The Times of India. (2026, March). In a landmark ruling, jury tells Meta and Google that the companies failed to warn users about risks of using Instagram and YouTube. https://timesofindia.indiatimes.com/technology/tech-news/in-a-landmark-ruling-jury-tells-meta-and-google-that-the-two-companies-failed-to-warn-users-about-risks-of-using-instagram-and-youtube/articleshow/129810233.cms



