Meta told to pay $375m for misleading users over child safety
03/25/2026

A court in New Mexico has ordered Meta to pay $375m (£279m) after a jury found the company misled users about the safety of its platforms for children.

The verdict held Meta, the owner of Facebook, Instagram and WhatsApp, liable for endangering children and exposing them to sexually explicit material and contact with sexual predators. The jury found the company violated New Mexico's Unfair Practices Act by misleading the public about how safe its platforms were for young users.

New Mexico Attorney General Raúl Torrez called the decision historic, saying it marked the first time a US state had successfully sued Meta over child safety issues.

Evidence presented at trial

The case was brought by New Mexico in 2022. State prosecutors argued that Meta had steered young users toward sexually explicit content, child sexual abuse material, and solicitation for such material or sex trafficking through its recommendation algorithms.

During the seven-week trial, jurors were shown internal Meta documents and heard testimony from former employees about the company's awareness of child predators using its platforms. Arturo Béjar, a former Meta engineering leader who left the company in 2021 and later became a whistleblower, testified about experiments he conducted on Instagram that he said showed underage users were served sexualized content.

Béjar also told the court that his own young daughter had been propositioned for sex by a stranger on Instagram. Prosecutors further cited internal Meta research which, at one point, found that 16% of Instagram users reported seeing unwanted nudity or sexual activity in a single week.

Meta plans appeal

Meta said it disagreed with the verdict and intended to appeal. A spokeswoman said the company works hard to keep people safe on its platforms and has been transparent about the challenges of identifying and removing bad actors and harmful content.

The company argued during the trial that it had spent years trying to combat harmful users and improve protections for minors. It pointed to steps including Instagram Teen Accounts, introduced in 2024 to give young users more control over their experience, and a more recent feature designed to alert parents if their children are searching for self-harm content.

The $375m penalty was calculated after the jury found there had been thousands of violations of the state law, each carrying a maximum civil penalty of $5,000.

Wider legal pressure

Torrez said Meta executives knew their products harmed children, ignored warnings from their own employees, and lied to the public about what they knew. He said the jury had joined families, educators and child safety experts in declaring that "enough is enough."

Meta also faces separate legal pressure elsewhere in the US. The company is involved in a trial in Los Angeles in which a young woman alleges she became addicted to platforms including Instagram and YouTube as a child because of how they were designed. Thousands of similar lawsuits are also moving through US courts.