Apple will scan photos stored on iPhone and iCloud for child abuse images

Apple plans to scan photos stored on iPhones and iCloud for child abuse images. The new system could assist law enforcement in criminal investigations, but it could also open the door to increased legal and government demands for user data.

The system, called neuralMatch, will "proactively alert the human review team if it believes illegal imagery was detected, which will then contact law enforcement if the material can be verified," the Financial Times said. neuralMatch, which is trained using 200,000 images from the National Center for Missing & Exploited Children, will launch first in the US. Photos will be hashed and compared against a database of known child sexual abuse images.
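The matching step described above can be sketched in a few lines. Note the hedge: the real system reportedly uses a perceptual neural hash designed to survive resizing and recompression, whose details Apple has not published; this illustrative sketch substitutes a plain SHA-256 digest, and the database contents and function names are hypothetical.

```python
import hashlib

# Hypothetical database of known-image hashes (the entry below is just
# the SHA-256 of empty bytes, used as an illustrative placeholder).
KNOWN_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def photo_hash(photo_bytes: bytes) -> str:
    """Hash a photo's bytes. The reported system uses a perceptual
    hash robust to minor edits; SHA-256 here is only a stand-in."""
    return hashlib.sha256(photo_bytes).hexdigest()

def is_known_image(photo_bytes: bytes) -> bool:
    """True if the photo's hash appears in the known-image database."""
    return photo_hash(photo_bytes) in KNOWN_HASHES
```

A cryptographic hash like this only matches byte-identical files, which is exactly why a perceptual hash is reportedly used instead: it lets visually similar images map to the same fingerprint.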

“According to people briefed on the plans, every photo uploaded to iCloud in the US will be given a ‘safety voucher,’ saying whether it is suspect or not,” the Financial Times said. “Once a certain number of photos are marked as suspect, Apple will enable all the suspect photos to be decrypted and, if apparently illegal, passed on to the relevant authorities.”
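The voucher-and-threshold logic quoted above amounts to counting suspect photos per account and only surfacing them for human review once the count crosses a limit. The sketch below assumes a voucher is a simple record with a `suspect` flag, and the threshold value of 5 is a made-up placeholder; Apple has not disclosed the actual number or the voucher format.

```python
def photos_for_review(vouchers: list[dict], threshold: int = 5) -> list[dict]:
    """Return the suspect photos' vouchers once their count reaches
    the threshold, else nothing. Both the voucher structure and the
    threshold value here are assumptions for illustration."""
    suspects = [v for v in vouchers if v["suspect"]]
    # Below the threshold, no photos are decrypted or reviewed.
    return suspects if len(suspects) >= threshold else []
```

The design intent of such a threshold is to keep a single false match from exposing anyone's photos: review only triggers on an accumulation of matches.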

Johns Hopkins University professor and cryptographer Matthew Green voiced concerns about the system on Twitter late Wednesday. “This sort of tool can be a boon for finding child pornography in people’s phones,” Green said. “But imagine what it could do in the hands of an authoritarian government?”

“Even if you believe Apple won’t allow these tools to be misused [crossed fingers emoji] there’s still a lot to be concerned about,” he added. “These systems rely on a database of ‘problematic media hashes’ that you, as a consumer, can’t review.”

Apple already checks iCloud files against known child abuse images, as every other major cloud provider does. But the system described here goes a step further, allowing centralized access to a device's local storage. It would also be trivial to extend the system to crimes other than child abuse – a particular concern given Apple's vast business in China.

The company notified several US academics about it this week, and Apple could share more about the system "as soon as this week," according to two security researchers briefed on Apple's previous meeting, the Financial Times reported.

Apple has previously touted the privacy protections built into its devices, and famously defied the FBI when the agency wanted Apple to build a backdoor into iOS to access the iPhone used by one of the shooters in the 2015 attack in San Bernardino. The company did not respond to a request for comment on the Financial Times report.
