Apple sued for abandoning CSAM detection for iCloud

Apple is being sued over its decision not to implement a system to check iCloud Photos for child sexual abuse material (CSAM).

According to The New York Times, the lawsuit argues that by not doing more to stop the spread of this material, Apple is forcing victims to relive their trauma. The lawsuit states that Apple announced “well-publicized improved designs aimed at protecting children” but “took no steps to implement these designs or to detect and restrict this material.”

Apple first announced the system in 2021, explaining that it would use digital signatures from the National Center for Missing and Exploited Children and other groups to detect known CSAM content in users’ iCloud libraries. But those plans appear to have been abandoned after security and privacy advocates suggested they could create a backdoor for government surveillance.

The lawsuit was reportedly filed by a 27-year-old woman suing Apple under a pseudonym. She said a relative molested her when she was a child and shared images of her online, and that she still receives notices from law enforcement almost daily informing her that someone has been charged with possessing those images.

James Marsh, an attorney involved in the lawsuit, said there are 2,680 potential victims who could receive compensation in the case.

TechCrunch has reached out to Apple for comment. A company spokesperson told The Times: “The company is urgently and proactively innovating to fight these crimes without compromising the security and privacy of all our users.”

Last August, a 9-year-old girl and her guardian sued Apple, accusing it of failing to address CSAM in iCloud.