Apple Delays CSAM Photo-Scanning Feature
Apple will "take additional time to make improvements" before launching its technology to detect and report known child sexual abuse material (CSAM) in users's iCloud photos. The tools, first revealed a month ago, are designed to preserve user privacy by using sophisticated hashing algorithms that run on the user's device. Only once a threshold of at least 30 CSAM images is detected can photos be decrypted by Apple for manual inspection and potential reporting to authorities. Privacy experts have expressed concern over the feature, as some governments may be tempted to force Apple to search for other types of imagery. Privacy advocates also expressed concern that the system could be abused by third parties to implicate innocent people. Apple has responded to these concerns, stating that its database of known CSAM will never include images reported from just one country, and researchers will be able to verify that all Apple devices are using the same database of known CSAM.