Meta was finally held accountable for harming teens. Now what?


Meta lost a lawsuit brought by the state of New Mexico last week, marking the first time the company has been held accountable in court for endangering the safety of children. This was a historic decision in itself, but the very next day, Meta lost another case when a Los Angeles jury found that the company intentionally designed its apps to addict children and teens, harming the mental health of the plaintiff, a 20-year-old known as KGM.

These precedents open the floodgates to a wave of lawsuits over Meta’s deliberate targeting of teenage users, despite the company knowing that its apps could harm teens’ mental health. Thousands of cases similar to KGM’s are pending, while more than 40 state attorneys general have filed lawsuits against Meta along the lines of the New Mexico case.

While social media platforms are legally shielded from liability for what users post on their platforms, this time it was not the content on these platforms that was on trial. It was the design features themselves, like endless scrolling and around-the-clock notifications.

“They took the model that had been used against the tobacco industry for many years, and instead of focusing on things like content, they focused on these addictive features — how the platform was designed, and the issues with design, which is different from content, where you have this First Amendment argument,” Alison Fitzpatrick, a digital media attorney and partner at Davis+Gilbert, told TechCrunch. “It turned out to be, at least in these two cases, a winning argument.”

The jury in the New Mexico case, after a six-week trial, found Meta liable for violating the state’s unfair practices law and ordered the company to pay the maximum $5,000 per violation, for a total fine of $375 million. The Los Angeles case, which found Meta 70% liable and YouTube 30% liable for plaintiff KGM’s suffering, would fine the two companies a combined $6 million. (Snapchat and TikTok settled before trial.)

“This is nothing for Meta in the grand scheme of things,” Fitzpatrick said. “But when you take the $6 million and multiply it by all the cases against them, it becomes a huge number.”

“We respectfully disagree with these rulings and will appeal them,” a Meta spokesperson told TechCrunch. “Reducing something as complex as teen mental health to a single cause risks leaving unaddressed many of the broader issues teens face today and ignores the fact that many teens rely on digital communities to connect and find belonging.”


Over the course of the litigation, new internal Meta documents have been uncovered that display a pattern of inaction regarding the known negative impact of its platforms on minors, as well as a focused effort to boost the time teens spend on its apps, even during school or via “finstas,” fake Instagram accounts that teens create specifically to hide from parents or teachers.

One of the documents was a report on a 2019 study in which Meta conducted 24 one-on-one interviews with people whose use of the product was flagged as problematic, a classification that applied to an estimated 12.5% of users.

“The best external research suggests that Facebook’s impact on people’s well-being is negative,” the report says.

Numerous documents referenced statements from Meta CEO Mark Zuckerberg and Instagram chief Adam Mosseri about prioritizing teens’ time spent on the apps. Zuckerberg even commented that for Facebook Live to be successful with teens, he “thinks we’ll need to get pretty good at not notifying parents/teachers.”

In other documents, Meta employees spoke flippantly about the company’s goals of increasing the retention of teenage users.

“We’ve learned that one of the things we need to improve on is peeking at your phone during chemistry :)” one employee wrote in an email to Meta CPO Chris Cox.

“No one wakes up thinking they want to maximize the number of times they open Instagram that day,” Meta VP of product Max Eulenstein wrote in an internal email in January 2021. “But that’s exactly what our product teams are trying to do.”

A Meta spokesperson told TechCrunch that many of the newly released documents date back nearly 10 years, but the company is listening to parents, experts, and law enforcement about how to improve the platform.

The spokesperson said Meta is not aiming to maximize teens’ time on its apps today, citing Instagram’s teen accounts, introduced in 2024, which offer built-in safety features for teen users. These protections include making teen accounts private by default and allowing only people they follow to tag or mention them in posts. Instagram also sends time limit reminders telling teens to leave the app after 60 minutes, a setting that users under 16 can only change with parental permission.

For Kelly Stonelake, a former Meta director of product marketing who worked at the company from 2009 to 2024, these revelations come as no surprise. (Stonelake is currently suing Meta over gender-based discrimination and harassment.)

“The mountain of unsealed evidence really shows what I went through firsthand,” she told TechCrunch.

At Meta, Stonelake led go-to-market strategies for the social VR app Horizon Worlds during its launch to teens. She says she raised concerns about the lack of effective content moderation tools in the metaverse, but her objections were not taken seriously.

The US government has taken a keen interest in the issue of children’s online safety, especially after Meta whistleblower Frances Haugen leaked damning internal documents in 2021 showing that Meta knew Instagram was harming teenage girls.

While Congress has proposed several bills aimed at addressing children’s online safety, many of these efforts would do more to surveil adults and police speech than to protect minors, some privacy activists say.

“There is no world where a censorship or ‘age verification’ law passed under the guise of children’s safety does not lead to widespread online censorship of content and speech that Trump does not like,” Evan Greer, director of Fight for the Future, said in a statement.

Stonelake once lobbied on Capitol Hill for the Kids Online Safety Act, which had the most momentum of any of these legislative efforts, drawing support from companies like Microsoft, Snap, X, and Apple. But as the bill developed and changed, she became critical of it.

“I urge a ‘no’ vote on the current version,” she said, citing preemption provisions in the bill that would override state regulations on technology companies. “There is language in the latest version that would close the courthouse doors to school districts, to bereaved families, to states — and that’s bizarre.”

This language could, for example, preempt the case New Mexico brought against Meta.

“We need people to come to the table with solutions, instead of what they’re doing now, which is just telling a different story to both sides of the aisle to anger them and dismay them,” Stonelake said. “The actual solution must be complex, nuanced, and take into account multiple priorities.”
