Meta’s second major trial of 2026 over alleged harms to children begins on Monday.
The landmark jury trial in Santa Fe pits the New Mexico attorney general’s office against the social media giant. The state alleges that the company knowingly enabled predators to use Facebook and Instagram to exploit children.
The trial will introduce evidence that Raúl Torrez, the state’s attorney general, believes shows how Meta’s social networks create dangerous environments for children, exposing them to sexual exploitation, solicitation, sextortion and human trafficking.
The lawsuit states that Meta’s design choices and profit incentives prioritized engagement over child safety and that it failed to implement effective safeguards. The state accuses the company of allowing unmoderated groups devoted to commercial sex and of facilitating the buying, selling, and sharing of child sexual abuse material (CSAM).
“While the New Mexico attorney general makes sensationalist, irrelevant and distracting arguments by cherry-picking select documents, we’re focused on demonstrating our longstanding commitment to supporting young people,” a Meta spokesperson said.
“For over a decade, we’ve listened to parents, worked with experts and law enforcement, and conducted in-depth research to understand the issues that matter most. We use these insights to make meaningful changes – like introducing Teen Accounts with built-in protections and providing parents with tools to manage their teens’ experiences. We’re proud of the progress we’ve made, and we’re always working to do better.”
The lawsuit follows a two-year Guardian investigation, published in 2023, which revealed that the tech giant was struggling to prevent people from using its platforms to traffic children. The investigation is cited several times in the suit’s filings. In an interview with the Guardian in 2024, Torrez said he believes Meta is the “largest marketplace for predators and paedophiles globally”.
After a week of jury selection, opening statements are set to begin on 9 February, followed by the presentation of evidence. The proceedings are expected to last about seven weeks.
Social media companies have long maintained they are not responsible for crimes committed via their networks, citing section 230 of the Communications Decency Act, a US federal law that generally protects platforms from legal liability for content created by their users. Meta’s bid to have the case dismissed by invoking section 230 and the first amendment was rejected in a judge’s ruling in June 2024, because the lawsuit focuses on Meta’s product design and other non-speech issues, such as internal decisions about content and curation.
The New Mexico trial comes just a week after a high-profile case began in Los Angeles, in which hundreds of US families and schools allege that Meta, Snap, TikTok, and YouTube have harmed children. The plaintiffs in the LA lawsuit allege the platforms knowingly addicted young users, leading to depression, eating disorders, self-harm and other mental health problems. The proceedings involve about 1,600 plaintiffs, including more than 350 families and 250 school districts. Snap and TikTok reached settlements with the plaintiffs, while Meta and YouTube remain on trial. Each company has denied wrongdoing.
“The fact that these cases are going to trial proves the section 230 dam is breaking,” said Sacha Haworth, executive director of the Tech Oversight Project, arguing that social media platforms can no longer count on the law to shield them. “These are the trials of a generation; just as the world watched courtrooms hold big tobacco and big pharma accountable, we will for the first time see big tech CEOs take the stand.”
Key witnesses for the state in the New Mexico suit are expected to include educators and law enforcement officials, who will testify about the harms and crimes they say they have witnessed on Facebook, Instagram and WhatsApp, as well as whistleblowers, who may reveal internal company discussions. Teens and families who have experienced harm on the platforms are not anticipated to take the stand.
Torrez and his team have already taken Meta chief Mark Zuckerberg’s deposition, and may play portions of it in court if he does not attend. New Mexico’s jurisdiction limits its ability to compel out-of-state witnesses to testify in person.
On the lawsuit’s journey to the courtroom, the attorney general’s office has made several disclosures containing fresh allegations, including that Meta may have profited by placing advertisements from companies such as Walmart and Match Group alongside content that sexualized children, according to internal documents and emails.
Internal Meta documents obtained by the attorney general’s office show the company estimates that roughly 100,000 children on Facebook and Instagram experience online sexual harassment each day. Filings have revealed chat excerpts in which users allegedly discuss how to lure minors into engaging with them sexually. A former Instagram employee testified before Congress in 2023 that his own daughter had received unwanted sexual advances on Instagram. When he notified senior Meta leadership, he said, he was ignored.
Among the evidence expected to be presented in court are details of the 2024 arrests of three men charged with sexually preying on children through Meta’s platforms, part of an investigation the attorney general’s office dubbed “Operation MetaPhile”. Undercover agents posing as children were contacted by the suspects, who allegedly solicited them for sex after finding the accounts through design features on Facebook and Instagram.
The agents did not initiate any conversations about sexual activity, according to the state attorney general. After a surge of activity from adults on one of the undercover agents’ accounts, Meta did not shut the account down, and instead sent it information about how to monetize accounts and grow followings.
A filing last week revealed allegations that Zuckerberg approved allowing minors to access AI chatbot companions despite warnings from the company’s safety staff that the bots could engage in sexual interactions. Internal emails and messages cited in the filings reportedly show that “Meta, driven by Zuckerberg, rejected the recommendations of its integrity staff and declined to impose reasonable guardrails to prevent children from being subject to sexually exploitative conversations with its AI chatbots”.
The documents detail an internal chat in March 2024 about parental controls for minors using AI chatbots on Meta’s platforms; one employee asked whether parents could disable the chatbots for their children. Another employee responded that it was a “Mark-level decision that parents cannot turn it off”.
