Apple will now officially scan users’ phones

According to the Financial Times, Apple plans to scan photos stored on iPhones and in iCloud for images of child sexual abuse. The new system could help law enforcement agencies in criminal investigations.

The system, dubbed neuralMatch, “will proactively alert a team of experts if it believes illegal images are found, and the experts will then contact law enforcement if the material needs to be verified,” the Financial Times reports.

The neuralMatch AI, which has been trained using 200,000 images from the National Center for Missing and Exploited Children, will launch first in the United States. Photos will be hashed and compared against a database of hashes of known child sexual abuse images.
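In broad strokes, that kind of hash matching works like the sketch below: each photo is reduced to a compact fingerprint, which is then looked up in a database of fingerprints of known material. This is a minimal, hypothetical Python illustration, not Apple’s implementation; it uses an exact cryptographic hash for simplicity, whereas the system described would rely on a perceptual hash that also catches visually similar copies (resized or re-encoded images), and the names KNOWN_HASHES, hash_photo, and is_flagged are invented for the example.

```python
import hashlib
from pathlib import Path

# Hypothetical stand-in for the hash database; in the reported design
# the entries would come from the National Center for Missing and
# Exploited Children. Left empty here.
KNOWN_HASHES: set[str] = set()

def hash_photo(path: Path) -> str:
    """Fingerprint a photo. An exact SHA-256 digest is used here for
    simplicity; the reported system would use a perceptual hash, which
    also matches visually similar images rather than identical bytes."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def is_flagged(path: Path) -> bool:
    """True if the photo's fingerprint appears in the known database."""
    return hash_photo(path) in KNOWN_HASHES
```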

“According to people briefed on the plans, each photo uploaded to iCloud in the U.S. will be given a ‘security voucher’ indicating whether the photo is suspicious or not,” the Financial Times reported. “Once a certain number of photos have been flagged as suspicious, Apple will allow all suspicious photos to be decrypted and, if they turn out to be illegal, turn them over to the appropriate authorities.”
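The voucher-and-threshold idea can be pictured roughly as follows. This is a hypothetical sketch of the thresholding logic only: per the report, the real design uses cryptographic vouchers so that individual matches below the threshold remain unreadable, not a plain counter like the one here, and the REVIEW_THRESHOLD value is invented for illustration.

```python
from dataclasses import dataclass, field

# Illustrative threshold; the report does not say what number Apple
# would actually use.
REVIEW_THRESHOLD = 10

@dataclass
class Account:
    # One boolean "voucher" per uploaded photo: did it match the
    # hash database or not?
    vouchers: list[bool] = field(default_factory=list)

    def upload(self, matched: bool) -> None:
        self.vouchers.append(matched)

    def eligible_for_review(self) -> bool:
        # Per the report, flagged photos can only be decrypted for
        # human review once the number of matches crosses a threshold.
        return sum(self.vouchers) >= REVIEW_THRESHOLD
```

The point of the threshold, as described, is that a single false positive does not expose anyone’s photos; decryption and human review only become possible once an account accumulates multiple matches.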

Johns Hopkins University professor and cryptographer Matthew Green expressed concerns about the system on Twitter Wednesday night. “A tool like this could be a boon to finding child pornography on people’s phones,” Green said. “But imagine what it could do in the hands of an authoritarian government?”

“Even if you think Apple won’t allow these tools to be abused, we have a lot to worry about,” the professor added. “These systems rely on a database of ‘problematic media hashes’ that you, as a consumer, can’t review.”

Apple already checks iCloud files for known images of child abuse, just like any other major cloud provider. But the system described here would go further, providing centralized access to local storage. It would also be trivial to extend the system to crimes other than child abuse, which is of particular concern given Apple’s extensive business in China.

The company briefed some U.S. academics on the plans this week, and according to the Financial Times, Apple may reveal more about the system as early as this week.

Apple has previously advertised the privacy protections built into its devices, and notably pushed back against the FBI when the agency wanted the company to build a backdoor into iOS to access the iPhone used by one of the shooters in the 2015 San Bernardino attack.

The company did not respond to a Financial Times request for comment.