Meta fined $375m in New Mexico child safety case
The verdict marks the first time a jury has ruled on such claims against Meta, as the company faces a wave of lawsuits over how its platforms affect young people’s mental health.
A New Mexico jury yesterday (24 March) found Meta Platforms violated state law in a lawsuit brought by the state attorney general, who accused the company of misleading users about the safety of Facebook, Instagram and WhatsApp and of enabling child sexual exploitation on those platforms.
After deliberating for less than a day, the jury found that Meta violated New Mexico’s consumer protection law and ordered the company to pay $375 million in civil penalties.
“We respectfully disagree with the verdict and will appeal,” a Meta spokesperson said in a statement. “We work hard to keep people safe on our platforms and are clear about the challenges of identifying and removing bad actors or harmful content.”
In a statement, New Mexico Attorney General Raúl Torrez, a Democrat, called the verdict “a historic victory for every child and family who has paid the price for Meta’s choice to put profits over kids’ safety.”
“The substantial damages the jury ordered Meta to pay should send a clear message to big tech executives that no company is beyond the reach of the law,” he said.
In a second phase of the trial in May, Torrez said his office will ask the court to order Meta to make changes to its platforms to protect children and to impose additional financial penalties.
Meta shares were up 0.8% in after-hours trade following the verdict. The state had asked the jury to award more than $2 billion in damages.
META FACES A BROAD CHALLENGE RELATED TO YOUTH MENTAL HEALTH
The jury’s decision was made in Santa Fe. Torrez had accused the company of allowing predators unfettered access to underage users and connecting them with victims, often leading to real-world abuse and human trafficking.
“Over the course of a decade, Meta has failed over and over again to act honestly and transparently,” Linda Singer, an attorney for the state, told the jury during closing arguments on Monday. “It’s failed to act to protect young people in this state.”
“What the evidence shows is Meta’s robust disclosures and tireless efforts to prevent harmful content. And these disclosures mean that Meta did not knowingly and intentionally lie to the public,” Kevin Huff, an attorney for Meta, told the jury on Monday.
Reuters viewed the trial on Courtroom View Network.
Meta has come under increasing scrutiny in recent years over its handling of child and teen safety, spurred in part by a whistleblower in 2021 who alleged the company knew its products could be harmful but refused to act.
Separately, Meta is facing thousands of lawsuits accusing it and other social media companies of intentionally designing their products to be addictive to young people, leading to a nationwide mental health crisis.
Some of the lawsuits, which have been filed in both state and federal courts, seek damages in the tens of billions of dollars, according to Meta’s filings with financial regulators.
A state court jury in Los Angeles is currently hearing the first trial over the addiction claims.
Meta has argued that the company is shielded in both the addiction and the New Mexico lawsuits by the free-speech protections of the US Constitution’s First Amendment and Section 230 of the Communications Decency Act, which generally bars lawsuits against websites over user-generated content.
The company has said the state’s allegations of harm cannot be separated from the content on the platforms, because its algorithms and design features serve to publish content.
The judge in New Mexico rejected Meta’s arguments on Section 230, allowing the case to go to trial.
NEW MEXICO’S INVESTIGATION
The New Mexico lawsuit grew out of an undercover operation that Torrez, a former prosecutor, and his office ran in 2023. As part of the case, investigators created accounts on Facebook and Instagram posing as users younger than 14.
The accounts received sexually explicit material and were contacted by adults seeking similar content, leading to criminal charges against multiple individuals, according to Torrez’s office.
The state claims Meta told the public that Instagram, Facebook and WhatsApp are safe for New Mexico teens and children, while hiding the truth about how much dangerous and harmful content the company hosts.
According to the state, internal company documents acknowledged problems with sexual exploitation and mental health harm. Yet the company, the state says, did not institute basic safety tools such as age verification and insisted it was safe.
The state also accused Meta of designing its platforms to maximise engagement despite evidence that they were harming children’s mental health.
Features such as infinite scroll and auto-play videos keep kids on the site, fostering addictive behaviour that can lead to depression, anxiety and self-harm, the lawsuit claims.
On Tuesday, the jury found that Meta had violated the state’s consumer protection law by knowingly engaging in an unfair or deceptive trade practice.
The jury also found that the company’s actions were unconscionable, meaning Meta knowingly took advantage of New Mexico residents’ lack of knowledge. The jury found 75,000 violations and awarded $5,000 per violation.
In May, Judge Bryan Biedscheid is slated to hold a trial without a jury on the state’s claims that Meta created a public nuisance that harmed state residents’ health and safety.
The state will ask Biedscheid to direct Meta to make changes to its platforms, including adding effective age verification and removing predators, the attorney general’s office said Tuesday.