Both Google and Meta deny the allegations in the complaint. “Providing a safer and healthier experience for young people has always been at the core of our work,” Jose Castañeda, a Google spokesperson, said in a statement. “Collaborating with youth, mental health and parenting experts, we have built services and policies to provide youth with age-appropriate experiences, and provide parents with strong controls.”
“For more than a decade, we have listened to parents, worked with experts and law enforcement, and conducted in-depth research to understand the issues that matter most,” Meta spokeswoman Stephanie Ottway said in a statement. “We’re using these insights to make meaningful changes — like offering teen accounts with built-in protections and providing parents with the tools to manage their teens’ experiences.”
KGM started watching YouTube at age six, had an Instagram account by 11, got Snapchat at 13, and TikTok a year later, with each app allegedly adding to “an anxiety and depression spiral fueled by low self-esteem and body dysmorphia,” according to her lawyer, Joseph VanZandt. She and her mother, Karen Glenn, filed a lawsuit against Meta, Google’s YouTube, Snap, and TikTok, claiming that features such as “autoplay” and “infinite scrolling” contributed to her social media addiction, and that her social media use fueled her anxiety and depression and left her deeply insecure about herself. (Snap and TikTok settled with KGM before trial; terms were not disclosed.)
Glenn said last year that she did not realize the harm these platforms could cause her daughter, and that she would not have given her a phone had she known. Bergman says KGM’s lawsuit was chosen as the “flagship” case because it “represents many other young women who have suffered serious mental health harm, illness, and emotional distress as a result of social media.”
“The goal of lawyers who bring these cases is not just to win and get compensation for their individual clients,” says Benjamin Zipursky, a law professor at Fordham University School of Law. “They are aiming for a series of victories in this sample of so-called bellwether trials. Then they will try to pressure the companies into a global settlement in which they pay billions of dollars and also agree to change their practices.”
KGM’s case is the first of 22 bellwether trials to be heard in Los Angeles Superior Court. A verdict for the plaintiff could give the remaining roughly 1,600 litigants significant leverage, and perhaps force tech companies to adopt new safeguards. The trial also promises to raise broader awareness of social media business models and practices. “If the public has a very negative reaction to what is shown, or what the jury finds, that could impact legislation at the state or federal level,” Zipursky adds.
Bergman, who has spent 25 years representing asbestos victims, says this trial feels like history repeating itself. “When Frances Haugen testified before Congress and revealed for the first time what social media companies knew their platforms were doing to target vulnerable young people, I knew this was asbestos all over again,” Bergman says.
Drawing parallels to product liability cases against Big Tobacco and the auto industry, the plaintiffs argue chiefly that the big tech companies designed their social media platforms negligently, meaning they failed to take reasonable steps to avoid causing harm. “Specifically, plaintiffs argue that design features such as infinite scrolling and autoplay caused certain injuries to minors, including eating disorders, self-harm, and suicide,” says Mary Anne Franks, a law professor at George Washington University.
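The mechanics at issue are simple enough to sketch. What follows is a minimal, illustrative TypeScript sketch of a generic infinite-scroll feed, not any platform’s actual code; the /api/feed endpoint, the FeedItem shape, and the element IDs are all hypothetical.

```ts
// Illustrative sketch only: a generic infinite-scroll loop, not any
// platform's actual implementation. The /api/feed endpoint, FeedItem
// shape, and element IDs here are hypothetical.
type FeedItem = { id: string; html: string };

const feed = document.getElementById("feed")!;
const sentinel = document.getElementById("sentinel")!;
let cursor: string | null = null;

async function loadMore(): Promise<void> {
  // Hypothetical cursor-paginated endpoint; every response points at the
  // next page, so the feed has no built-in end.
  const res = await fetch(`/api/feed${cursor ? `?cursor=${cursor}` : ""}`);
  const page: { items: FeedItem[]; nextCursor: string | null } =
    await res.json();
  for (const item of page.items) {
    const card = document.createElement("div");
    card.innerHTML = item.html;
    feed.appendChild(card);
  }
  cursor = page.nextCursor;
}

// When the sentinel element at the bottom of the list scrolls into view,
// fetch the next page: the feed refills itself before the user hits bottom.
new IntersectionObserver((entries) => {
  if (entries.some((entry) => entry.isIntersecting)) void loadMore();
}).observe(sentinel);
```

The design choice the plaintiffs single out is visible in the loop itself: the feed requests the next page before the user ever reaches the bottom, so the content never presents a natural stopping point.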
Technology companies, for their part, are likely to focus on causation and free speech defenses. “Defendants will argue that it was third-party content that caused plaintiffs’ injury, not the access to that content that the platforms provided,” Franks says. She says the companies are also likely to argue that “to the extent corporate decision-making about content moderation is involved, the decision-making process is protected by the First Amendment,” citing the U.S. Supreme Court’s 2024 ruling in Moody v. NetChoice.