Apple Kills Its Plan to Scan Your Photos for CSAM. Here's What's Next
In August 2021, Apple announced a plan to scan photos that users stored in iCloud for child sexual abuse material (CSAM). The tool was meant to be privacy-preserving and allow the company to flag potentially problematic and abusive content without revealing anything else. But the initiative was controversial, and it soon drew widespread criticism from privacy and security researchers and digital rights groups who were concerned that the surveillance capability itself could be abused to undermine the privacy and security of iCloud users around the world. At the beginning of September 2021, Apple said it