A day after The Wall Street Journal ran a trending story about Meta's dismal internal findings on the mental health of teenage girls on Instagram, CEO Mark Zuckerberg wondered whether Meta should change how it studies the potential harms of its platforms.
"Recent events have made me think about whether we should change our approach to research and analysis on social issues," Zuckerberg wrote in a Sept. 15, 2021, email to senior executives including then-COO Sheryl Sandberg and Chief Global Affairs Officer Nick Clegg. The day before, the Journal had published a story based on documents obtained from a whistleblower, later revealed to be Frances Haugen, which showed that the company's own research found that "thirty-two percent of teenage girls said that when they felt bad about their bodies, Instagram made them feel worse." The subject line of Zuckerberg's email read: "Social Issues Research and Analysis – Privileged and Confidential."
The 2021 email was revealed Thursday after New Mexico Attorney General Raúl Torrez surfaced it as part of a lawsuit claiming Meta deceptively positioned its products as safe for teens while it was aware of harmful design choices that the state alleges were addictive to children and allowed child predators to flourish. In the complaint, the attorney general's office alleged that disclosing the harms Meta identified on its platforms "would have corrected the misleading and deceptive nature of its public statements declaring its platforms to be 'safe.'" Meta spokesperson Andy Stone said in a statement that the company "is proud of our ongoing commitment to transparent, industry-leading research. As we have done for years, we continue to use these insights to make meaningful improvements, such as offering teen accounts with built-in protections and providing parents with the tools to manage their teen's experiences."
The email is just one example of the kind of internal conversations expected to be revealed throughout that trial, as well as in a series of cases making similar claims in California. Opening statements in the New Mexico case are expected to begin next week.
In the email, Zuckerberg wrote that Meta's peers appeared to have avoided public criticism on social issues by doing less proactive research into the harms on their platforms. "Apple, for example, doesn't seem to be considering any of these things," he wrote. "As far as I understand, they don't have anyone reviewing or moderating content and they don't even have a reporting flow in iMessage. They've taken the approach that what they do on the platform is people's own responsibility, and by Apple not taking that responsibility on themselves, they haven't created a task force or a large number of studies examining the trade-offs in their approach. This has worked surprisingly well for them."
“(W)hen Apple tried to do something about CSAM, it was heavily criticized for it.”
While Apple appeared to be dodging criticism, in Zuckerberg's view, Meta instead "faced more criticism" because it reports more child sexual abuse material (CSAM), which "makes it seem as if there is more of this behavior on our platforms." On the other hand, he noted, "When Apple tried to do something about CSAM, it came under fire for it, which may encourage it to double down on its original approach." Zuckerberg may have been referring to Apple's announcement earlier that year of new features aimed at protecting children, including scanning users' iCloud photos for CSAM. Privacy advocates worried that the move could create a giant backdoor for monitoring user accounts, and Apple later backed away from the proposals. Apple did not immediately respond to an email request for comment.
Apple and Meta have long quarreled, both publicly and privately, over their different approaches to policy issues such as privacy and age verification. But Zuckerberg made similar remarks about other Meta peers. "YouTube, Twitter, and Snap follow a similar approach, to lesser degrees," he wrote. "It appears that YouTube is deliberately burying its head in the sand to stay under the radar and not be the center of attention. Twitter and Snap may not have the resources to do this type of research." Many platforms have publicly shared research and initiatives over the years to study the safety of their services, including YouTube's Youth & Families Advisory Committee, composed of independent experts guiding adolescent well-being on the platform, as well as Snap's Digital Wellbeing Index, started in 2022.
“I think we should be commended for the work we do to study, understand and improve social issues on our platforms.”
Zuckerberg appeared to believe that the public response to Meta's internal research had been unfair. "I think we should be commended for the work we do to study, understand and improve social issues on our platforms," he wrote. "Unfortunately, any research or recommendations produced are more likely to be used by the media to say that we are not doing everything we can (implying cowardly purposes) rather than that we are taking these issues more seriously than anyone else in our industry by studying them and looking for solutions, not all of which are reasonable to implement because everything has trade-offs."
In response to the email, at least two senior executives supported continuing some level of research into social issues, despite the risks of public perception. "Leaks are bad, and they will continue to happen unless we find a way to eliminate them," wrote Javier Olivan, then vice president of central products. "Given that — is it still worth trying to understand these issues? I think that's the responsible thing to do/I'd like us to continue to try to understand how we can make our products better for everyone, but maybe we should only scratch the surface in those areas where we see at least a clear degree of correlation between the use of our products and the specific issue." David Ginsberg, then vice president of product, choice, and competition, said that "after a lot of wrestling with this myself the last few days," he largely agreed with Olivan. "I think internal work is important to provide a good product and a good user experience – separate and apart from any societal issues goals."
A few days later, Guy Rosen, the product executive who led Meta's integrity work, shared several potential options for reorganizing the company's internal and external research, including the pros and cons of each. This was only a "preliminary" exercise to understand the "range of options," Rosen wrote. The options ranged from centralizing the teams researching highly sensitive topics, in an attempt to better control access to materials, to the more extreme step of dissolving those teams and outsourcing the work when needed. Ultimately, executives recommended the less extreme option of centralizing research teams, and planned to announce it shortly after Instagram head Adam Mosseri testified before Congress. "Announcing this after my testimony (sic) is worse than before, and we talked (about) this. It will leak, and it will make it look like I was hiding something," said Mosseri, who had recently been added to the email thread. Meta ended up announcing the changes after Mosseri's testimony, and said it would continue studying sensitive topics such as teen well-being.
In the initial email, Zuckerberg lamented that leaks of internal documents make this work difficult to do. "This may be part of the reason why the rest of the industry is choosing a different approach to these issues."
Correction, February 5th: An earlier version of this story misspelled Frances Haugen's name.