New Mexico Jury Orders Meta Platforms to Pay $375 Million Over Child Safety Violations


A jury in New Mexico on Tuesday found Meta Platforms liable for violating state law in a case brought by New Mexico's attorney general, who alleged that the company deceived users about the safety of its platforms, Facebook, Instagram and WhatsApp, and facilitated child sex trafficking.

After deliberating for less than a day, the jury found that California-based Meta had violated New Mexico's consumer protection law and ordered the company to pay $375 million in civil penalties.

The verdict is the first of its kind by a jury against the company, which already faces a wave of lawsuits over its platforms' effects on young people's mental health.

In a statement, a Meta spokesperson said: "We respectfully disagree with the verdict and will appeal. We work hard to keep people safe on our platforms and are clear about the challenges of identifying and removing bad actors or harmful content."

New Mexico Attorney General Raul Torrez said in a statement that the verdict was a historic victory for the children and families harmed by Meta's decision to put profits ahead of kids' safety.

"The substantial damages the jury ordered Meta to pay should send a clear message to big tech executives that no company is beyond the reach of the law," he added.

A second phase of the trial is scheduled for May. Torrez said his office will ask the court to require Meta to change its platforms to protect children and to impose additional financial penalties.

Meta shares rose 0.8% in after-hours trading following the verdict. The state had asked the jury to award damages of more than $2 billion.

The verdict came after a six-week trial in Santa Fe, during which the state argued that Meta allowed predators to target and communicate with underage users, in some cases abusing them and trafficking them in real life.

In closing arguments, state counsel Linda Singer told jurors that the company had repeatedly failed over a 10-year period to act transparently or safeguard young users.

Meta denied the claims, saying it has put in place extensive protections designed to keep younger users safe on its platforms.

The case is part of broader scrutiny the company has faced in recent years over child and teen safety. That scrutiny intensified after whistleblowers testified before the U.S. Congress in 2021, alleging that the company knew its products could harm users but did not do enough to stop it.


Beyond the New Mexico case, Meta faces thousands of lawsuits alleging that it deliberately designed its social media platforms to be addictive to minors. Those suits contend that such design decisions have fueled a nationwide mental health crisis. According to the company's regulatory filings, some of the lawsuits, filed in both state and federal courts, seek damages amounting to tens of billions of dollars.

A separate case involving addiction claims is currently being deliberated by a state court jury in Los Angeles.

Meta Defends Its Safety Measures

Meta has argued that it is shielded from liability in both the New Mexico case and the addiction lawsuits by the First Amendment of the U.S. Constitution and by Section 230 of the Communications Decency Act, a law that broadly immunizes online platforms from liability for user-generated content.

The company maintains that the alleged harms cannot be separated from the content posted on its sites, since its algorithms and design features are the means by which that content is delivered.

The judge in the New Mexico case, however, rejected Meta's Section 230 defense and allowed the trial to proceed.

The lawsuit grew out of a 2023 undercover investigation by Torrez's office. Investigators created Facebook and Instagram accounts posing as users under the age of 14. The attorney general's office said those accounts were served sexually explicit content and were approached by adults seeking such content, leading to criminal charges against several people.

The state claimed that Meta publicly marketed its platforms in New Mexico as safe for children and teenagers while concealing the prevalence of harmful and dangerous content. According to the lawsuit, the company had internal documentation of sexual exploitation and mental health harms on its platforms. Nonetheless, the state alleged, Meta failed to implement even basic safety measures, such as age verification, while continuing to claim its platforms were safe.

The lawsuit further claimed that Meta designed its sites to maximize user engagement, despite evidence of harm to children's mental health. Features such as endless scrolling and auto-playing videos were cited as drawing young users into spending ever more time on the platforms, potentially contributing to addiction, depression, anxiety and self-harm.

The jury concluded on Tuesday that Meta had knowingly engaged in unfair or deceptive trade practices in violation of the state's consumer protection law. Jurors also found the company's conduct unconscionable, concluding that it knowingly took advantage of New Mexico residents' lack of knowledge.

The jury found 75,000 violations and assessed $5,000 per violation, for a cumulative penalty of $375 million.

In May, Judge Bryan Biedscheid will hear a separate, non-jury portion of the case involving the state's claims that Meta created a public nuisance affecting residents' health and safety. The state has indicated that it will ask the court to compel the company to make changes, including adopting an effective age verification system and removing predators from its sites.
