Meta Faces Jury Trial in New Mexico Over Child Exploitation and Safety Allegations

New Mexico prosecutors allege Meta knowingly enabled child exploitation, as a landmark jury trial begins in Santa Fe.
By MDL.

Executive Summary

  • The New Mexico Attorney General alleges Meta knowingly prioritized engagement over child safety.
  • Evidence includes “Operation MetaPhile,” where undercover agents posing as minors were solicited for sex.
  • Internal documents allegedly show CEO Mark Zuckerberg overruled safety staff regarding AI chatbot access for minors.
  • Meta defends its record, citing new safety tools and accusing the state of sensationalism.
  • The case bypassed Section 230 dismissal by focusing on product design rather than third-party content.

Meta faces a landmark jury trial in Santa Fe, New Mexico, where state prosecutors allege the social media giant knowingly facilitated child sexual exploitation and human trafficking on its platforms. The proceedings, which began with jury selection, pit the New Mexico Attorney General's office against the parent company of Facebook and Instagram in a case centered on corporate liability for user safety.

New Mexico Attorney General Raúl Torrez argues that Meta’s design choices prioritized user engagement and profit over child safety, effectively creating what he described as a “marketplace for predators.” The lawsuit contends that the company failed to implement effective safeguards, knowingly exposing minors to risks including sexual solicitation, sextortion, and unmoderated groups dedicated to commercial sex. Filings in the case cite internal documents suggesting that approximately 100,000 children on Meta’s platforms experience online sexual harassment daily.

Prosecutors intend to present evidence from “Operation MetaPhile,” an investigation in which undercover agents posed as minors on the platforms. According to the Attorney General’s office, these accounts were solicited for sex by adult users. The state alleges that despite a surge in predatory activity directed at these accounts, Meta failed to disable the offending profiles and instead sent automated communications regarding account monetization. Further allegations in court filings claim that CEO Mark Zuckerberg overruled internal safety recommendations, permitting minors to access AI chatbots despite warnings about potential sexually exploitative interactions.

Meta has firmly denied the accusations, characterizing the state’s arguments as sensationalist and based on cherry-picked documents. A company spokesperson stated that Meta has a longstanding commitment to supporting young people, citing the implementation of “Teen Accounts” with built-in protections and tools for parental supervision. The company maintains that it has worked with law enforcement and experts for over a decade to safeguard its platforms.

Judicial Precedent and Regulatory Impact

This trial serves as a critical test for legal strategies that target a technology company’s product design rather than third-party content, a distinction that allowed the case to proceed despite the liability shields typically provided by Section 230 of the Communications Decency Act. The verdict could establish significant legal precedents regarding the duty of care social media platforms owe to minor users and the extent of their liability for platform architecture. While the trial focuses on civil liability for Meta, the evidence references separate criminal actions against specific users; it is important to note that all individuals charged with crimes are presumed innocent until proven guilty in a court of law.
