
Report: Apple to announce client-side photo hashing system to detect child abuse images in users' photo libraries - 9to5Mac

Apple is reportedly set to announce new photo identification features that will use hashing algorithms to match the content of photos in users’ photo libraries with known child abuse materials, such as child pornography.

Apple's matching would reportedly happen on the client, on the user's device, in the name of privacy: the iPhone would download a set of fingerprints representing known illegal content and then check each photo in the user's camera roll against that list. Presumably, any matches would then be reported for human review.
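To make that reported flow concrete, here is a minimal Python sketch of client-side fingerprint matching. The fingerprint function below is a placeholder, and Apple has not published details of its algorithm; a real system would use a perceptual hash that tolerates resizing and re-encoding, not a cryptographic hash.

    import hashlib
    from typing import Iterable, List, Set

    def fingerprint(photo_bytes: bytes) -> str:
        # Placeholder fingerprint: a real deployment would use a perceptual
        # hash robust to minor edits, not SHA-256 of the raw bytes.
        return hashlib.sha256(photo_bytes).hexdigest()

    def scan_library(photos: Iterable[bytes], known_fingerprints: Set[str]) -> List[bytes]:
        # The device downloads `known_fingerprints` (hashes of known illegal
        # content) and checks each local photo against that set.
        matches = []
        for photo in photos:
            if fingerprint(photo) in known_fingerprints:
                matches.append(photo)  # would presumably be queued for human review
        return matches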

Apple has previously said it employs hashing techniques as photos are uploaded to iCloud. This new system would instead run on the client side, on the user's device. Apple has yet to officially announce the initiative, and the details will matter.

At a high level, this kind of system is similar to the machine learning features for object and scene identification already present in Apple Photos. Analysis happens on-device, and users can take advantage of better search functionality.

However, cryptography and security expert Matthew Green notes that the implications of such a rollout are complicated. Hashing algorithms are not foolproof and can produce false positives. And if governments were allowed to control the fingerprint database, they could potentially use the system to detect images of things other than clearly illegal child content, for instance to suppress political activism.
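The false-positive concern follows from how perceptual hashes are typically compared: by distance rather than exact equality, so an unrelated photo can occasionally land within the match threshold. The sketch below uses illustrative 64-bit values and an assumed threshold; Apple's actual parameters are not public.

    def hamming(a: int, b: int) -> int:
        # Number of differing bits between two fixed-length hashes.
        return bin(a ^ b).count("1")

    THRESHOLD = 10                        # assumed matching threshold, in bits
    known_hash = 0xF0F0F0F0F0F0F0F0       # fingerprint of a known illegal image
    unrelated_hash = 0xF0F0F0F0F0F0F4F1   # hash of an innocent, unrelated photo

    # Only 2 bits differ here, so this unrelated photo would be flagged.
    print(hamming(known_hash, unrelated_hash) <= THRESHOLD)  # True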

However, note that photos uploaded to iCloud Photos for backup and sync are not stored end-to-end encrypted in the first place. Photos are stored in encrypted form on Apple's server farms, but the decryption keys are also held by Apple. This means that law enforcement agencies can subpoena Apple and see all of a user's uploaded photos. (This is not unusual; all third-party photo services work this way.)

It is possible that in the future Apple could roll out similar client-side scanning for content that would later be stored on a server in an end-to-end encrypted form. Many governments have campaigned for such a system from end-to-end encrypted messaging apps like iMessage and WhatsApp, worried that the increasing shift to encrypted communications will make it harder for law enforcement to find and prosecute child abuse cases.

Green speculates that Apple wouldn't have invested in developing this system if applying it to end-to-end encrypted content wasn't a long-term goal.
