Lawmakers want to allow users to sue harmful social media algorithms


Tech critics' least favorite law is under siege again, this time with the focus on recommendation algorithms.

Sens. John Curtis (R-Utah) and Mark Kelly (D-Ariz.) on Wednesday introduced the Algorithm Accountability Act, which would amend Section 230 of the Communications Decency Act to make platforms responsible for preventing their recommendation systems from causing certain foreseeable harms. Section 230 is the law that protects online platforms — including social media sites, digital forums, and blogs with comment sections, as well as their users — from being held liable for other people’s unlawful posts or for engaging in content moderation in good faith. The Algorithm Accountability Act, however, would require commercial social media platforms to “exercise reasonable care in designing, training, testing, deploying, operating, and maintaining the recommendation-based algorithm” in order to “prevent physical injury or death.” If a platform reasonably should have predicted that its content recommendations would result in physical harm, Section 230 would no longer protect the display of those recommendations.


This approach, known as a duty of care, is similar to the Kids Online Safety Act (KOSA), a bill that passed the Senate with broad support but stalled in the House amid tech industry pressure and free speech concerns. Under the Algorithm Accountability Act, victims who have suffered physical harm, or their representatives, would be able to sue tech platforms for damages if they believe the platforms breached their duty of care. But the bill applies only to a subset of web services: specifically, for-profit social media platforms with more than 1 million registered users.

The bill’s sponsors insist it would not infringe on First Amendment rights, anticipating a common criticism of Section 230 reforms. Like KOSA, the new bill states that it would not prevent platforms from directly serving users information they search for. It also would not restrict feeds presented in chronological or reverse chronological order, and it would bar the law from being applied based on the viewpoints users express.

Curtis blamed Section 230 for enabling a toxic social media environment that he believes contributed to the September killing of conservative activist Charlie Kirk by a gunman in his home state of Utah. In a recent Wall Street Journal editorial, he wrote that “online platforms likely played a major role in radicalizing Kirk’s alleged killer,” a phenomenon he said is “driven not only by ideology but also by algorithms — code written to keep us engaged and angry.” At a CNN town hall held at the university where Kirk was killed, Curtis and Kelly — whose wife, Gabby Giffords, survived an assassination attempt — previewed their new bill alongside a call for “easing political tensions on both sides of the aisle.”

Recommendation algorithms were a key issue in a major lawsuit against YouTube, Meta, and other platforms earlier this year, when a gun safety group claimed the companies bore responsibility for the radicalization of a racist mass shooter because their recommendation systems surfaced hate speech. Hate speech is legally protected, and the court dismissed the case, citing both Section 230 and First Amendment concerns. But the new law could shift the balance of power in a whole host of lawsuits against tech companies over everything from drug abuse to self-harm. Even in cases where the speech at issue is ultimately lawful, losing Section 230 protection can entangle platforms in lengthy legal proceedings over how they host or moderate users’ posts.

But groups that have opposed KOSA and previous attempts to reform Section 230, such as the Electronic Frontier Foundation (EFF), have warned that even with these safeguards, platforms will be incentivized to simply remove any content that might conceivably be construed as a violation, potentially draining resources from efforts to prevent the very harms lawmakers want to mitigate.
