– Apple's Legal Battle: Apple faces a lawsuit over its decision to halt CSAM (Child Sexual Abuse Material) detection in iCloud.
– Background: In 2021, Apple announced a system to detect known CSAM in iCloud Photos by matching image hashes against a database of verified abuse material, with the stated goal of curbing the spread of such imagery.
– Abandonment: After backlash from privacy advocates and security researchers, Apple first delayed and then abandoned the feature, citing the risk that the scanning infrastructure could be misused for broader surveillance.
– Lawsuit Filed: A suit brought on behalf of child sexual abuse victims claims Apple's reversal endangers children and allows abuse imagery to keep circulating unchecked.
– Google's Involvement: Google, which scans its cloud services for known CSAM using similar hash-matching methods, has also faced scrutiny over its detection practices.