Apple has been sued over iCloud CSAM allegations


Apple faces a lawsuit, filed Thursday by West Virginia Attorney General J.B. McCuskey, alleging that iCloud is used to store and distribute child sexual abuse material over the internet. McCuskey claims Apple has known about this “for years” and “chose not to do anything about it.”

The lawsuit includes alleged screenshots of a February 2020 iMessage conversation between Apple executives Eric Friedman and Herve Siebert, in which they acknowledge that iCloud was being used to store and distribute CSAM.

“In an iMessage conversation about whether Apple was focusing too much on privacy and not enough on trust and safety for children, Friedman bragged that iCloud was ‘the greatest platform for distributing child pornography’ and that Apple ‘chose to not know in enough places that we can’t really tell,'” the lawsuit alleges.

In the same conversation, Friedman pointed to a New York Times article about CSAM detection and suggested that Apple was underreporting the extent of the CSAM problem in its products.

The lawsuit cites the number of reports of detected child sexual abuse submitted to the National Center for Missing and Exploited Children in 2023 by Apple (267) compared to Google (1.47 million) and Meta (30.6 million).

The lawsuit alleges that Apple failed to implement CSAM detection tools, including a proprietary scanning tool it was developing. In 2021, Apple announced an initiative to scan photos stored in iCloud for CSAM, which it abandoned the following year.

The role of end-to-end encryption

The lawsuit also points to Apple’s Advanced Data Protection security feature, available for iCloud since December 2022, which enables end-to-end encryption of images and videos on the cloud storage platform. The suit alleges that end-to-end encryption represents an “obstacle to law enforcement, including identifying and prosecuting child sexual abusers.”

“Maintaining the privacy of child abusers is absolutely inexcusable,” McCuskey said in a statement Thursday. “Since Apple has so far refused to police itself and do the morally right thing, I am filing this lawsuit to demand that Apple follow the law, report these images, and stop re-victimizing children by allowing these images to be stored and shared.”

Apple told CNET that “safety and privacy” are at the heart of its decisions, especially for children.

“We innovate every day to combat ever-evolving threats and maintain the safest, most trusted platform for kids,” Apple said Thursday. “All of our industry-leading parental controls and features, like Communication Safety — which automatically intervenes on kids’ devices when nudity is detected in shared messages, photos, AirDrop, and even FaceTime live calls — are designed with the safety, security and privacy of our users at their core.”

The Communication Safety feature is turned on by default for users under 18. It attempts to protect children from CSAM content, but does not address adults who distribute and store CSAM.

The balance between privacy and security on the one hand, and law enforcement and cybercrime on the other, has been at the heart of the debate over end-to-end encryption.

Privacy advocates, such as the Electronic Frontier Foundation, praised the introduction of encryption to iCloud in 2022, noting that “the constant scanning of child abuse images can lead to unwarranted investigations and false positives.” The EFF pointed out that encryption protects people’s sensitive iCloud data, such as images, against potential cloud data breaches and government demands.

“Banning the use of end-to-end encryption would be counterproductive and interfere with everyone’s security and privacy online,” Thorin Klosowski, a security and privacy campaigner at the EFF, told CNET in a statement Thursday. “Encryption is our best way to protect online privacy, which is especially important for young people.”

Data breaches are on the rise, as are government and law enforcement requests for user data for various reasons. Apple’s transparency report details the number of government requests it receives for user data, although the most recent report appears to cover only through December 2024.

End-to-end encryption is also used by Google’s messaging services, as well as popular messaging apps such as WhatsApp, Signal and Telegram.

The complaint was filed in Circuit Court of Mason County, West Virginia, on February 19.

It follows a class action filed in late 2024 in a Northern California district court by 2,680 plaintiffs, who claim that Apple’s abandonment of its CSAM scanning software amounts to the tech giant knowingly allowing CSAM to be distributed and stored on iCloud. In August 2024, a similar lawsuit was filed in North Carolina on behalf of a 9-year-old sexual assault victim.


