Apple has delayed plans to roll out detection technology that would have scanned US users’ iPhones in search of child sexual abuse material.
The move follows widespread criticism from privacy groups and others, who warned that on-device scanning sets a dangerous precedent.
Apple said it had listened to the negative feedback and would think again.
There were also concerns that the system could be abused by authoritarian states.
The so-called NeuralHash technology would have scanned images just before they were uploaded to iCloud Photos.
It would then have matched them against a database of known child sexual abuse material maintained by the National Center for Missing and Exploited Children.
If a match was found, it would have been reviewed by a human and, if confirmed, steps taken to disable the user’s account and report it to law enforcement.
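In outline, the approach resembles perceptual-hash matching with a review threshold. The sketch below is illustrative only: the function names, the threshold value, and the use of SHA-256 as a stand-in hash are all assumptions, since NeuralHash’s model and Apple’s private matching protocol were never published as code.

```python
import hashlib

# Illustrative stand-ins: the real system used a neural perceptual hash
# (NeuralHash) and a blinded hash database, neither of which is public.
KNOWN_HASHES = set()      # hashes of known abuse imagery (empty placeholder)
REVIEW_THRESHOLD = 30     # hypothetical number of matches before human review

def perceptual_hash(image_bytes: bytes) -> str:
    # Stand-in only: a real perceptual hash maps visually similar images
    # to the same value, which a cryptographic hash like SHA-256 does not.
    return hashlib.sha256(image_bytes).hexdigest()

def check_before_upload(image_bytes: bytes, match_count: int) -> int:
    # Sketch of the on-device check run just before an iCloud Photos upload.
    if perceptual_hash(image_bytes) in KNOWN_HASHES:
        match_count += 1
    if match_count >= REVIEW_THRESHOLD:
        # In Apple's described design, only past this point would flagged
        # images be decryptable for manual human review.
        print("Threshold reached: flag account for manual review")
    return match_count
```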
In a statement, Apple said:
“Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them and limit the spread of child sexual abuse material.
“Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”
The features had been due to launch later this year.
Privacy campaigners expressed fears that the technology could be expanded and used by authoritarian governments to spy on their citizens.
The Electronic Frontier Foundation said that while child exploitation was a serious problem, Apple’s attempt to “build a backdoor” into its data storage and messaging systems was fraught with problems.
“To say that we are disappointed by Apple’s plan is an understatement,” it said at the time. It went on to gather 25,000 signatures from consumers opposed to the move.
Apple has long promoted itself as a champion of user privacy and end-to-end encryption.