Meta Seeks to Bar Mentions of Mental Health—and Zuckerberg’s Harvard Past—From Child Safety Trial | EUROtoday
As Meta heads to trial in the state of New Mexico for allegedly failing to protect minors from sexual exploitation, the company is making an aggressive push to have certain information excluded from the court proceedings.
The company has petitioned the judge to exclude certain research studies and articles about social media and youth mental health; any mention of a recent high-profile case involving teen suicide and social media content; and any references to Meta's financial resources, the personal activities of employees, and Mark Zuckerberg's time as a student at Harvard University.
Meta's requests to exclude information, known as motions in limine, are a standard part of pretrial proceedings, in which a party can ask a judge to determine in advance which evidence or arguments are permissible in court. This is meant to ensure that the jury is presented with facts rather than irrelevant or prejudicial information, and that the defendant is granted a fair trial.
Meta has emphasized in pretrial motions that the only questions the jury should be asked are whether Meta violated New Mexico's Unfair Practices Act because of how it has allegedly handled child safety and youth mental health, and that other information, such as Meta's alleged election interference, misinformation, or privacy violations, should not be factored in.
But some of the requests seem unusually aggressive, two legal scholars tell WIRED, including requests that the court not mention the company's AI chatbots, and the extensive reputation protection Meta is seeking. WIRED was able to review Meta's in limine requests via a public records request from the New Mexico courts.
These motions are part of a landmark case brought by New Mexico attorney general Raúl Torrez in late 2023. The state alleges that Meta failed to protect minors from online solicitation, human trafficking, and sexual abuse on its platforms. It claims the company proactively served pornographic content to minors on its apps and failed to enact certain child safety measures.
The state complaint details how its investigators were easily able to set up fake Facebook and Instagram accounts posing as underage girls, and how those accounts were quickly sent explicit messages and shown algorithmically amplified pornographic content. In another test case cited in the complaint, investigators created a fake account as a mother looking to traffic her young daughter. According to the complaint, Meta did not flag suggestive remarks that other users left on her posts, nor did it shut down some of the accounts that were reported to be in violation of Meta's policies.
Meta spokesperson Aaron Simpson told WIRED via email that the company has, for over a decade, listened to parents, experts, and law enforcement, and has conducted in-depth research, to “understand the issues that matter the most,” and to “use these insights to make meaningful changes—like introducing Teen Accounts with built-in protections and providing parents with tools to manage their teens’ experiences.”
“While New Mexico makes sensationalist, irrelevant and distracting arguments, we’re focused on demonstrating our longstanding commitment to supporting young people,” Simpson said. “We’re proud of the progress we’ve made, and we’re always working to do better.”
In its motions ahead of the New Mexico trial, Meta asked the court to exclude any references to a public advisory published by Vivek Murthy, the former US surgeon general, about social media and youth mental health. It also asked the court to exclude an op-ed article by Murthy, as well as Murthy's calls for social media to come with a warning label. Meta argues that the former surgeon general's statements treat social media companies as a monolith and are “irrelevant, inadmissible hearsay, and unduly prejudicial.”
https://www.wired.com/story/meta-child-safety-trial-ask-judge-bar-mental-health-harvard/