Meta would rather leave New Mexico than rebuild its apps for kids

A bench trial in Santa Fe could force algorithm changes, age verification, and a $3.7bn mental health fund. Meta has threatened to pull Facebook and Instagram from the state instead.

In March, a New Mexico jury reached a verdict that no American jury had reached before. Meta, the company once known as Facebook, had violated the state’s consumer protection law by misrepresenting the safety of Facebook and Instagram for young users. The penalty was $375 million, the first time a state had won at trial against a major US technology company for endangering children.

That was the easy part.

On Monday, the second phase of the same case opens before Judge Bryan Biedscheid in Santa Fe. There is no jury this time. Over an estimated three weeks, the judge will hear what New Mexico Attorney General Raúl Torrez wants Meta to do about the harm a jury has already found it caused, and he will decide.

Reuters, in its 2 May curtain-raiser, framed the proceeding plainly: this is the trial that could force changes to Facebook, Instagram, and other Meta platforms in ways the company has been resisting for nearly a decade.

Meta has answered with a threat that suggests it is, finally, taking the prospect seriously. If the orders are intolerable, Meta has indicated, it will pull Facebook and Instagram from New Mexico altogether.

What the state wants

The remedies on Torrez’s list are not symbolic. According to court filings reviewed by Reuters and the Boston Globe, New Mexico is asking the court to order Meta to verify users’ ages, redesign its recommendation algorithm so it does not optimise for engagement among minors, end autoplay and infinite scroll for users under 18, suspend push notifications during school hours and overnight, and cap children’s monthly time on its platforms at 90 hours.

The state is also asking for $3.7 billion to fund teen mental health services across New Mexico, on top of the $375 million already awarded.

Each of those measures has been studied, lobbied for, and adopted in pieces by Meta itself, often pre-emptively, often in markets the company is more afraid of than New Mexico. None has been imposed by court order in the United States.

Were Judge Biedscheid to grant even a meaningful subset, it would be the first time a state court had actively rewritten the product specification of a global social media platform.

How the case got here

The lawsuit is older than the verdict. Torrez filed it in late 2023, citing an undercover operation by his office that involved creating a fake Instagram profile of a 13-year-old girl. The account, he later told CNBC, was “simply inundated with images and targeted solicitations” from users seeking to abuse children. The state’s case, in essence, was that this was not an accident of scale but a feature of the platform’s recommendation system.

During the first phase of the trial, prosecutors entered into evidence internal Meta communications discussing the consequences of Mark Zuckerberg’s 2019 decision to make Facebook Messenger end-to-end encrypted by default.

According to those filings, employees calculated that the change would impair Meta’s ability to disclose to law enforcement what one document put at roughly 7.5 million reports of child sexual abuse material per year.

The jury, according to NBC News, treated those communications as central to its finding that Meta knowingly harmed children. The encryption decision, ostensibly framed as a privacy upgrade, became one of the most damaging exhibits at trial.

Since then, the European Commission has formally accused Meta of failing to keep underage users off its platforms under the Digital Services Act, the first such charge against a mainstream social platform.

Meta’s response, articulated in pre-trial filings and a public letter cited by The Washington Post and Source New Mexico, has been extraordinary. The company has argued that some of the remedies New Mexico is seeking are technically infeasible, would compromise its ability to operate consistently across markets, and would force it, in the limit, to withdraw Facebook and Instagram from the state.

Torrez said the threat amounted to Meta "showing the world how little it cares about child safety," in a remark widely reported on 30 April.

Whether Meta would actually follow through is harder to assess. New Mexico has a population of about 2.1 million, a fraction of the company’s global user base. The threat is, in part, a negotiation tactic, intended to make the judge consider the spillover effects of any aggressive order. It is also, however, an argument that platform-level remedies in any single jurisdiction set a precedent for the next one.

More than 40 state attorneys general have filed similar suits against Meta, with bellwether trials scheduled across 2026. New Mexico, in that sense, is being treated as a test.

Meta is not arriving at the second phase having ignored the topic. Over the past several years, it has rolled out a thicket of teen-safety features: AI-driven systems that detect adults messaging minors who do not follow them, "take a break" prompts for excessive use, default-private accounts for users under 16, parental supervision tools, and limits on the kinds of advertising teens can be served. Several of these were announced under regulatory pressure from the EU, where the bloc's age verification framework is now active.

What Meta has not done, and what New Mexico is asking the judge to order, is to restructure the underlying recommendation engine. The company’s algorithm, as both internal documents and external research have repeatedly shown, is calibrated for time spent on the platform. The state argues that, for minors, calibration is itself the harm.

And there is a cost dimension that increasingly matters. Meta is in the midst of a roughly $145 billion AI capex programme, an investment of historic scale by any measure. Meta’s mounting child-safety legal exposure could, eventually, cost more than the AI cluster bill. The New Mexico phase-two trial is the first time that comparison stops being theoretical.

Judge Biedscheid is being asked, in effect, to translate a finding of corporate harm into a product roadmap. He could rule narrowly, ordering Meta to do little more than what California and the UK already require under their respective age-appropriate design laws.

He could rule broadly, accepting most of Torrez’s list, in which case Meta will appeal, fight a stay, and decide in real time whether the threat to leave New Mexico is a bluff. He could also split the difference, ordering algorithmic changes for minors but stopping short of the 90-hour cap. He is not expected to rule from the bench; the trial is scheduled to run roughly three weeks, with written orders following.

A separate Los Angeles jury found Meta and YouTube liable last year in an addiction case, and Indonesia became the first Southeast Asian country to ban under-16s from major social platforms in late 2025. The legal weather around minors and social media has changed.

New Mexico has, until this trial, mostly been a state where Meta did business unobstructed. Whether it remains one in three months will depend less on what the judge writes than on what Meta decides to do about it. For a company that has spent two decades insisting it could fix its harms voluntarily, that is, finally, a different conversation.
