
A landmark courtroom fight is testing whether Big Tech can profit from kids’ exposure to predators while telling parents “your children are safe.”
Story Snapshot
- New Mexico’s attorney general sued Meta over claims Facebook and Instagram misled families about children’s safety risks.
- The case uses New Mexico’s Unfair Trade Practices Act, a strategy designed to get around federal liability shields that often protect platforms.
- State investigators say undercover accounts posing as minors drew sexual solicitations, highlighting weak age verification and easy access to children.
- Meta denies the core allegations, argues the state “cherry-picked” evidence, and points to teen protections and parental tools.
What the New Mexico jury is being asked to decide
New Mexico Attorney General Raúl Torrez brought the case in state court, alleging Meta’s Facebook and Instagram misled users about the risks children face on the platforms. The lawsuit argues the company’s design choices prioritized engagement and growth over child safety safeguards. Jury selection began in Santa Fe in early February 2026, and the roughly seven-week trial moved to closing arguments beginning March 23, 2026.
The claims center on whether Meta’s public assurances and product choices amounted to unfair or deceptive practices under state consumer-protection law. A judge previously rejected Meta’s attempt to throw out the case, allowing it to move forward while dropping the claims against CEO Mark Zuckerberg personally. The jury is now weighing competing narratives: the state’s claim that risks were downplayed versus Meta’s insistence that it disclosed risks and invested in protections.
Undercover evidence, internal estimates, and the dispute over scale
State investigators say undercover accounts posed as children and documented sexual solicitations, which New Mexico argues demonstrates how easily predators can locate and contact minors. Reporting on the case also references internal estimates of a high volume of harmful interactions involving minors, with cited figures ranging from roughly 100,000 children facing sexual harassment each day to an internal warning about far larger daily volumes of exploitation-related activity.
Meta disputes the state’s framing and challenges how the investigation was conducted, arguing the allegations rely on selective examples and questionable methods. That disagreement matters because it affects how jurors interpret “knowledge” and “reasonable safeguards.” The public record described in coverage indicates the state will lean heavily on internal documents and whistleblower-era scrutiny to argue Meta had clear warnings but delayed or limited safety changes.
The legal strategy: using state consumer law to pierce Big Tech immunity
The case is being watched nationally because it is described as the first state attorney general lawsuit over child sexual exploitation claims against Meta to reach a jury. Instead of relying on theories that often collide with federal protections for online platforms, New Mexico is using its Unfair Trade Practices Act. That approach is meant to focus the trial on consumer deception—what families were told—rather than only on third-party content.
For conservatives wary of Washington’s habit of responding to real harms with sweeping speech controls, this case highlights a different path: targeted accountability through existing fraud and consumer-protection tools. The U.S. still lacks a comprehensive modern federal child online safety framework beyond COPPA, and efforts like the Kids Online Safety Act have stalled. As a result, states are trying to fill the gap, creating a patchwork that could pressure companies to change faster.
What Meta says it’s doing now—and why critics say it’s not enough
Meta’s defense emphasizes youth-safety investments and newer features such as “Teen Accounts” and parental controls, arguing it has built tools to reduce risk and collaborated with experts. In court coverage, Meta also argues the attorney general’s presentation overstates conclusions and “cherry-picks” evidence. The company has acknowledged the case carries financial risk, underscoring why the verdict could shape how aggressively Meta redesigns its products.
Critics, including expert testimony cited in reporting, argue that platform incentives still reward attention and rapid interaction, which can increase exposure to grooming and other harms. Other coverage points to weak age verification—often reliant on self-declared birthdates—as a persistent vulnerability. The broader question for jurors is whether these weaknesses are merely hard problems in a massive network, or foreseeable risks inadequately addressed while growth remained the priority.
Why this trial matters to families, policy, and the limits of government power
The most immediate stakes are for children and parents: whether platforms must adopt stronger default protections and verification to reduce predatory contact. But the political stakes extend beyond Meta. With federal lawmaking stalled, a jury verdict could accelerate state legislation and encourage more aggressive lawsuits. At the same time, any regulatory momentum will raise constitutional questions conservatives care about—how to protect kids without empowering bureaucrats to police lawful speech or surveil Americans.
Landmark trial in New Mexico to decide whether Meta misled users about children's safety risks @WashTimes https://t.co/EmBh9IRt2N
— Washington Times Local (@WashTimesLocal) March 23, 2026
Coverage also notes parallel, high-profile litigation in other states focused on addiction-style product design, suggesting juries may become de facto referees for policy debates Congress avoided. It remains unclear precisely which internal documents and metrics jurors will credit most, and no verdict had been reported as of this writing. What is clear is the direction: states are increasingly using courtroom pressure to force changes Big Tech resisted making voluntarily.
Sources:
Child exploitation, grooming, and social media addiction claims put Meta on trial
New Mexico trial to decide whether Meta misled users on children’s safety