
Meta lost a lawsuit brought by the state of New Mexico last week, marking the first time a court has held a company accountable for endangering child safety. That was a groundbreaking decision in itself. But the next day, Meta lost another case, when a jury in Los Angeles found that the company knowingly designed its apps to be addictive for children and teenagers, putting the mental health of the plaintiff, a 20-year-old known as KGM, at risk.
This precedent opens the door to a wave of lawsuits over Meta's deliberate targeting of young users despite its knowledge that its apps could harm them psychologically. Thousands of cases like KGM's are pending, and 40 state attorneys general have filed lawsuits against Meta similar to New Mexico's.
Social media platforms are legally shielded from liability for what users post, but this time it is not the platforms' content that is on trial. It is the design features themselves, such as endless scrolling and around-the-clock notifications.
“They took the model that was used against the tobacco industry years ago, and instead of focusing on things like content, they focused on these addictive features, how the platform was designed, design issues that are different from content,” Allison Fitzpatrick, a digital media attorney and partner at Davis+Gilbert, told TechCrunch. “In at least these two cases, it turned out to be a winning argument.”
After a six-week trial, the jury in the New Mexico case found Meta liable for violating the state's unfair practices law and ordered the company to pay penalties of up to $5,000 per violation, totaling $375 million. In the Los Angeles case, which found Meta 70% liable and YouTube 30% liable for KGM's distress, the companies will pay a combined $6 million in damages. (Snap and TikTok settled before trial.)
“It’s nothing to the Metas of the world,” Fitzpatrick said. “But if you multiply that $6 million by all the cases they face, it becomes a huge number.”
“We respectfully disagree with this ruling and will appeal,” a Meta spokesperson told TechCrunch. “Reducing complex issues like teen mental health to a single cause risks leaving unaddressed the broader issues facing teens today, and overlooks the fact that many teens rely on digital communities to connect and find a sense of belonging.”
As part of the lawsuit, new internal Meta documents were released. The documents not only revealed a pattern of inaction in addressing the platform’s negative impact on minors, but also showed concerted efforts to increase the time teens spend on the app, even at school and through “finstas,” the “fake Instagram” accounts teens create to hide their activity from parents or teachers.
One document cites a report on a 2019 study in which Meta conducted 24 in-person, one-on-one interviews with people flagged as having problematic use of the product, a designation that applies to roughly 12.5% of users.
“The best external research suggests that Facebook’s impact on people’s well-being is negative,” the report said.
Several documents reference comments made by Meta CEO Mark Zuckerberg and Instagram head Adam Mosseri about prioritizing teen engagement. Zuckerberg even said that for Facebook Live to be successful with teens, “I think we’re going to have to be good at not alerting parents/teachers.”
In other documents, Meta employees spoke flippantly about the company’s goal of increasing teen user retention.
One employee wrote in an email to Meta CPO Chris Cox: “We learned that one of the things we had to optimize for was the quick look at your phone while studying chemistry.”
“Nobody wants to maximize the number of times they open Instagram that day,” Meta VP of Product Max Eulenstein wrote in an internal email in January 2021. “But that’s exactly what our product team is trying to do.”
A Meta spokesperson told TechCrunch that while many of the newly released documents are nearly a decade old, the company is hearing from parents, experts and law enforcement on how it can improve its platform.
“We are not optimizing for how much time teens spend today,” the spokesperson said, pointing to the Instagram teen accounts introduced in 2024. These accounts offer built-in safety features for teen users, including making accounts private by default and allowing only people a teen follows to tag them or comment on their posts. Instagram also sends time-limit notifications prompting teens to leave the app after 60 minutes, and teens under 16 can change these settings only with parental permission.
The revelations come as no surprise to Kelly Stonelake, a former Meta director of product marketing who worked at the company from 2009 to 2024. (Stonelake is currently suing Meta, alleging gender-based discrimination and harassment.)
“The mountain of unsealed evidence really shows what I experienced firsthand,” she told TechCrunch.
At Meta, Stonelake led the go-to-market strategy for the launch of the VR social app Horizon Worlds to teens. She raised concerns about the metaverse platform’s lack of effective content moderation tools, but says her objections were not taken seriously.
The U.S. government has taken a keen interest in the issue of children’s online safety, especially since Meta whistleblower Frances Haugen leaked damning internal documents in 2021.
Congress has proposed numerous bills to address children’s online safety, but many of these efforts would do more to spy on adults and censor their speech than to protect minors, some privacy advocates say.
“There is no world in which censorship under the guise of child safety or the passage of ‘age verification’ laws does not lead to mass online censorship of content and speech that Trump does not like,” Evan Greer, executive director of Fight for the Future, said in a statement.
Stonelake once lobbied on Capitol Hill for the Kids Online Safety Act (KOSA), the bill that has gained the most traction of any of these legislative efforts, garnering support from companies like Microsoft, Snap, X, and Apple. But as the bill evolved and changed, she became critical of it.
“I’m urging a ‘no’ vote on the current version,” she said, citing the bill’s preemption clause, which would override state regulations of tech companies. “The latest version has language that would shut out school districts, survivors, and state courts, and that’s a nonstarter.”
For example, that language could preempt cases like the one New Mexico brought against Meta.
“We need people to come to the table with solutions instead of what they’re doing now, which is just telling different stories to both sides of the aisle to make people angry and alarmed,” Stonelake said. “Real solutions are complex, nuanced, and require consideration of multiple priorities.”