Facebook parent sued by New Mexico alleging it has failed to shield children from predators
SANTA FE, N.M. — Facebook and Instagram fail to protect underage users from exposure to child sexual abuse material and let adults solicit pornographic imagery from minors, New Mexico Attorney General Raúl Torrez alleges in a lawsuit that follows an undercover online investigation.

Accounts created during the investigation also received recommendations to join unmoderated Facebook groups devoted to facilitating commercial sex, investigators said, adding that Meta also let its users find, share and sell "an enormous volume of child pornography."

"Mr. Zuckerberg and other Meta executives are aware of the serious harm their products can pose to young users, and yet they have failed to make sufficient changes to their platforms that would prevent the sexual exploitation of children," Torrez said, accusing Meta's executives of prioritizing "engagement and ad revenue over the safety of the most vulnerable members of our society."

Meta, which is based in Menlo Park, California, did not directly respond to the New Mexico lawsuit's allegations, but said it works hard to protect young users with a serious commitment of resources. "In one month alone, we disabled more than half a million accounts for violating our child safety policies," the company said.

Company spokesman Andy Stone pointed to a company report detailing the millions of tips Facebook and Instagram sent to the National Center for Missing & Exploited Children in the third quarter of 2023, including 48,000 involving inappropriate interactions that could include an adult soliciting child sexual abuse material directly from a minor or attempting to meet with one in person.

