Apple to scan iPhones for child sex abuse images in US

Apple has announced that it will scan iPhones belonging to US customers for child sexual abuse material (CSAM), checking images on the device before they are stored in iCloud. The images will be scanned by software that looks for matches against a database of already known CSAM.
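
The matching step can be pictured as comparing an image's fingerprint against a list of fingerprints of known material. The sketch below is a simplified illustration in Swift, assuming a plain SHA-256 digest and a hypothetical knownImageHashes set; Apple's announced system instead uses a perceptual hash (NeuralHash) combined with private techniques so that resized or re-encoded copies still match.

```swift
import Foundation
import CryptoKit

// Hypothetical set of hex digests of known flagged images (illustrative values only).
let knownImageHashes: Set<String> = [
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"
]

// Hash an image's raw bytes and check the digest against the known set.
// Note: SHA-256 only matches byte-identical files; a real system would use
// perceptual hashing so near-duplicates are also detected.
func matchesKnownImage(_ imageData: Data) -> Bool {
    let digest = SHA256.hash(data: imageData)
    let hex = digest.map { String(format: "%02x", $0) }.joined()
    return knownImageHashes.contains(hex)
}

// Example: check a local file before it would be uploaded.
if let data = FileManager.default.contents(atPath: "/tmp/photo.jpg") {
    print(matchesKnownImage(data) ? "Match found: flag for review" : "No match")
}
```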

Apple said that if a match is found, a human reviewer will then assess the image and report the user to law enforcement.

Read more on BBC