Aug 6, 2021

Apple to scan iPhones for child sex abuse images

Apple has announced details of a system to detect known child sexual abuse material (CSAM) in photos on customers' devices. Apple says the technology will also catch edited but visually similar versions of the original images. "Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the known CSAM hashes," Apple said. The company says the new technology offers "significant" privacy benefits over existing techniques, as Apple learns about users' photos only if they hold a collection of known CSAM in their iCloud Photos account. "Regardless of what Apple's long-term plans are, they've sent a very clear signal. In their opinion, it is safe to build systems that scan users' phones for prohibited content," said Matthew Green, a security researcher at Johns Hopkins University.
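
The quoted matching step relies on perceptual hashing: unlike a cryptographic hash, a perceptual hash changes only slightly when an image is resized, re-encoded, or lightly edited, so near-duplicates land within a small Hamming distance of the original's hash. Apple's system uses its own NeuralHash algorithm combined with cryptographic techniques that are not public in code form; the sketch below instead illustrates the general idea with a simple "average hash" in Python. The known-hash database, threshold, and function names here are illustrative placeholders, not Apple's implementation.

```python
# Illustrative sketch only: Apple's actual system uses NeuralHash plus
# on-device cryptographic matching. This shows the general idea of
# perceptual hashing with a basic "average hash" (aHash).
# Requires Pillow: pip install Pillow

from PIL import Image


def average_hash(path: str, size: int = 8) -> int:
    """Downscale to size x size grayscale; each bit = pixel >= mean brightness."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p >= mean else 0)
    return bits


def hamming(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")


# Hypothetical database of known hashes (placeholder value, not real data).
KNOWN_HASHES = {0x0F0F0F0F0F0F0F0F}


def matches_known(path: str, threshold: int = 5) -> bool:
    """A small Hamming-distance tolerance catches edited near-duplicates."""
    h = average_hash(path)
    return any(hamming(h, known) <= threshold for known in KNOWN_HASHES)
```

With an 8x8 hash (64 bits), a threshold of a few bits typically tolerates recompression and mild resizing while keeping unrelated images far apart, which is why edited copies of a flagged image can still match.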

