Social media addiction knowingly inflicted on children, lawsuits say


By Colin Letcher, CalMatters

"A
People, school districts and states suing tech companies say their platform designs and marketing have hooked kids on social media. A child takes photos and videos to upload to Instagram during a community event in San Francisco on February 22, 2024. Photo by Laure Andrillon for CalMatters

This story was originally published by CalMatters. Sign up for their newsletters.

The Meta researcher’s tone was alarmed.

“Oh my god, IG is dope,” the user experience specialist allegedly wrote to a colleague, referring to social media platform Instagram. “We’re actually pushers… We’re causing reward deficit disorder because people binge on IG so much that they can no longer feel reward.”

The researcher concluded that consumer addiction is “biological and psychological” and that company management is keen to exploit the dynamic. “Top-down directives do everything to ensure that people keep coming back for more,” the researcher added.

The conversation was recently included as part of a long-simmering lawsuit in federal court in California. Condensing complaints from hundreds of school districts and state attorneys general, including California, the suit alleges that social media companies knew about the risks to children and teens but continued to market their products to them, putting profits over children’s mental health. The lawsuit seeks monetary damages and changes in the companies’ business practices.

The lawsuit and a similar one filed in Los Angeles Superior Court target Facebook, Instagram, YouTube, TikTok and Snap. The cases reveal embarrassing internal conversations and findings at the companies, particularly Facebook and Instagram owner Meta, further tarnishing their brands in the public eye. They are also testing a specific attack vector against the platforms that targets not so much worrisome content as design and marketing decisions that accelerate the damage. Some believe the result could be new forms of regulation, including at the federal level.

One document discussed at a hearing this week included a 2016 email from Mark Zuckerberg about Facebook’s live video feature. In the email, the Meta executive wrote, “we’re going to have to be very good about not notifying parents/teachers” about teen videos.

“If we tell the parents of the teenagers about their live videos, it will probably ruin the product from the start,” he wrote, according to the email.

In slides summarizing the tech company’s internal documents released this week as part of the lawsuit, an internal discussion at YouTube suggests that underage accounts in violation of YouTube’s rules have been active on the platform for years, creating content for an average of “938 days before detection — giving them plenty of time to create content and continue to put themselves and the platform at risk.”

A spokesperson for Meta did not immediately respond to requests for comment.

A YouTube spokesman, Jose Castaneda, described the slide released this week as a “selective view of a much broader safety framework” and said the company uses more than one tool to detect underage accounts, while taking action whenever it detects an underage account.

“If we tell the parents of teenagers about their live videos, it will probably ruin the product from the start.”

Mark Zuckerberg, CEO of Meta, in a 2016 email

In court, the companies argued that they were making editorial decisions permitted by the First Amendment. That trial is scheduled for June.

The state court case moved to jury selection this week, increasing pressure on social media companies.

While the state and federal cases differ slightly, the underlying argument is the same: that social media companies knowingly designed their products to hook young people, leading to disastrous but predictable consequences.

“It leads to mental health issues, severe anxiety, depression for many. For some, eating disorders, suicidality,” said Previn Warren, one of the lead attorneys on the federal court case. “For schools, it’s a loss of control over the educational environment, the inability of teachers to really control their classrooms and teach.”

Federal suit

Meta and other companies have faced backlash for years over their treatment of children on their platforms, including Facebook and Instagram. Parents, lawmakers and privacy advocates say social media has contributed to the mental health crisis among young people and that tech companies have failed to act when that fact became clear.

Those allegations received new scrutiny last month when a brief citing still-sealed documents in the federal suit became public.

While the lawsuit also names TikTok, Snap and Google as defendants, the document includes allegations against Meta that are particularly detailed.

In the more than 200-page filing, for example, the plaintiffs allege that Meta intentionally misled the public about how harmful its platforms are.

Warren pointed to claims in the brief that Meta researchers found that 55 percent of Facebook users had “mild” problematic use of the platform, while 3.1 percent had “severe” problems. Zuckerberg, according to the brief, noted that 3% of billions of users would still be millions of people.

But the brief claims the company published research noting only that “we estimate (as an upper bound) that 3.1% of US Facebook users experience problematic use.”

“That’s a lie,” Warren said.

In response to the recent interest in the lawsuits, Meta published a blog post this month arguing that the litigation “oversimplifies” the issue of youth mental health and pointing to its past work with parents and families on child-protection features.

The federal case faced a key hearing this week as the defendants argued that the judge should dismiss the case on an expedited basis. A decision on that motion is likely to come in the next few weeks, Warren said.

Social media companies, like other web-based services, receive protection from certain legal claims under a section of federal law. Section 230 of the Communications Decency Act gives legal immunity to website operators for potentially illegal content on their platforms.

Mary Anne Franks, a First Amendment legal scholar at George Washington University who has long studied Section 230, said that rather than targeting online content itself, recent social media cases have focused on the design of the platforms and their marketing.

“The litigation strategy says it is the way that you’re providing that space and directing that at people who are vulnerable, that’s really the problem here,” she said. “That’s your own behavior, not somebody else’s.”

Companies make key decisions behind the scenes, she said, and can be held accountable for them.

The plaintiffs’ argument, she said, amounts to: “You manipulated things. You have deliberately made choices about what comes to the top or what is directly accessible or can be tempting to vulnerable users.”

A trial begins in the state of California

Meanwhile, a related state case went to jury selection this week.

The case, which contains similar allegations of personal injury caused by social media companies, has also drawn national attention, and industry bigwigs such as Zuckerberg are expected to take the stand.

The personal injury lawsuit centers on an unnamed plaintiff who claims her mental health was damaged by an addiction to social media.

In a last-minute development this week, TikTok and Snap are reported to have reached undisclosed settlements in the case. Meta and Google continue as defendants.

Franks said these trials could be a turning point in regulating how technology companies design and market their products. While the companies have come under scrutiny in the past, she said, the glare of a public trial can be particularly harsh.

“It’s always been talked about, and members of Congress kind of said, ‘maybe we’ll regulate you,'” she said. “I think now platforms are getting really nervous about what it’s going to mean if they look really bad on the stand.”

This article was originally published on CalMatters and is republished under Creative Commons Attribution-NonCommercial-No Derivatives license.
