Apple sued for abandoning Child Sexual Abuse Material (CSAM) detection for iCloud

Image Credits: Canva

Apple is facing a lawsuit over its decision to abandon a system for detecting child sexual abuse material (CSAM) in iCloud photos. The lawsuit alleges that Apple’s inaction harms victims by allowing these abusive images to continue circulating.

Background of the Lawsuit

The lawsuit was filed by a 27-year-old woman under a pseudonym. She alleges that Apple failed to carry out its 2021 plan to scan iCloud for CSAM. The woman, a survivor of child sexual abuse, argues that by failing to act, Apple is forcing survivors to relive their trauma: images of her abuse are still widely circulated online, and she continues to receive notices from law enforcement about individuals charged with possessing them.

Apple’s Initial Plan and Backlash

In 2021, Apple announced plans to scan images stored in iCloud for known CSAM. The system would have matched photos against digital signatures (hashes) of previously identified material provided by the National Center for Missing and Exploited Children. Apple abandoned the proposal after privacy advocates warned that the scanning mechanism could create a backdoor for government surveillance, putting user security and privacy at risk.
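To make the hash-matching idea concrete, the minimal sketch below shows the general shape of checking an image against a database of known signatures. The `KNOWN_SIGNATURES` set and helper functions are hypothetical, and a plain cryptographic hash is used purely for illustration; Apple’s actual proposal relied on a perceptual hash (NeuralHash) and on-device private set intersection, which this example does not reproduce.

```python
import hashlib
from pathlib import Path

# Hypothetical set of digital signatures for known material.
# In Apple's proposal these would have been supplied by NCMEC;
# here it is an empty placeholder for illustration only.
KNOWN_SIGNATURES: set[str] = set()


def image_signature(path: Path) -> str:
    """Return a hex digest of the image file's bytes.

    A plain SHA-256 only matches byte-identical files; Apple's proposed
    NeuralHash was a perceptual hash designed to survive resizing and
    re-encoding, which this simplified sketch does not attempt.
    """
    return hashlib.sha256(path.read_bytes()).hexdigest()


def is_flagged(path: Path) -> bool:
    """Check whether an image's signature appears in the known set."""
    return image_signature(path) in KNOWN_SIGNATURES
```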

The Lawsuit’s Potential Impact

The lawsuit argues that Apple’s failure to follow through on its original plan to detect and remove CSAM harms victims by allowing the illegal material to keep spreading. Attorney James Marsh, who represents the woman, estimates that up to 2,680 other victims could be eligible for compensation.

Apple’s Response

Apple has yet to comment on the pending litigation. The company told The New York Times that it is actively working on ways to combat CSAM while protecting user privacy.

Similar Legal Actions

This complaint follows a similar lawsuit filed in August by a 9-year-old girl and her guardian, accusing Apple of failing to address CSAM on iCloud. Apple’s handling of CSAM detection remains under scrutiny as the company balances security, privacy, and its role in combating online abuse.

Written by Varinderjeet Kaur