An examination of Apple’s plans to ‘scan’ your iPhone photos for abusive content

The proliferation of child sexual abuse material on the internet is harrowing and sobering, but Apple may help tackle this issue

Who holds the key?

Digital files can be protected in a sort of virtual lockbox via encryption, which garbles a file so that it can be revealed, or decrypted, only by someone holding a secret key. Encryption is one of the best tools for protecting personal information as it traverses the internet.

Can a cloud service provider detect child abuse material if the photos are garbled using encryption? It depends on who holds the secret key.

Many cloud providers, including Apple, keep a copy of the secret key so they can assist you in data recovery if you forget your password. With the key, the provider can also match photos stored on the cloud against known child abuse images held by the National Center for Missing and Exploited Children.

But this convenience comes at a big cost. A cloud provider that stores secret keys might abuse its access to your data or fall prey to a data breach.

A better approach to online safety is end-to-end encryption, in which the secret key is stored only on your own computer, phone, or tablet. In this case, the provider cannot decrypt your photos. Apple’s answer to checking for child abuse material that’s protected by end-to-end encryption is a new procedure in which the cloud service provider, meaning Apple, and your device perform the image matching together.

Spotting evidence without looking at it

Though that might sound like magic, with modern cryptography it’s actually possible to work with data that you cannot see. I have contributed to projects that use cryptography to measure the gender wage gap without learning anyone’s salary and to detect repeat offenders of sexual assault without reading any victim’s report. And there are many more examples of companies and governments using cryptographically protected computing to provide services while safeguarding the underlying data.

Apple’s proposed image matching on iCloud Photos uses cryptographically protected computing to scan photos without seeing them. It’s based on a tool called private set intersection that has been studied by cryptographers since the 1980s. This tool allows two people to discover files that they have in common while hiding the rest.
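To make the idea concrete, here is a toy Diffie-Hellman-style private set intersection sketch in Python. Every name and parameter below is invented for illustration; this is the general technique cryptographers study, not Apple’s actual protocol, and it is not secure as written.

```python
import hashlib

# Toy private set intersection (illustration only, not a secure
# implementation). Each party blinds the hash of every item with its
# own secret exponent. Because exponentiation commutes,
# H(x)^(a*b) mod P is the same regardless of which party blinds first,
# so doubly-blinded values coincide exactly when the items match --
# without either party seeing the other's unmatched items.

P = 2**521 - 1  # a large prime modulus (chosen for illustration)

def hash_to_group(item: str) -> int:
    digest = hashlib.sha256(item.encode()).digest()
    return int.from_bytes(digest, "big") % P

def blind(value: int, secret: int) -> int:
    return pow(value, secret, P)

def intersect(set_a, secret_a, set_b, secret_b):
    # Party A blinds its items once; party B adds the second blinding.
    a_once = {item: blind(hash_to_group(item), secret_a) for item in set_a}
    a_twice = {blind(v, secret_b) for v in a_once.values()}
    # Symmetrically, party B's items get both blindings applied.
    b_twice = {item: blind(blind(hash_to_group(item), secret_b), secret_a)
               for item in set_b}
    # Items whose doubly-blinded values coincide are in both sets.
    return {item for item, v in b_twice.items() if v in a_twice}

shared = intersect({"img1", "img2", "img3"}, 0xA5F3,
                   {"img2", "img4"}, 0x7C21)
print(shared)  # {'img2'}
```

Neither party ever sees the other’s raw items, only blinded values that are meaningless without both secret exponents.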

Here’s how the image matching works. Apple distributes to everyone’s iPhone, iPad, and Mac a database containing indecipherable encodings of known child abuse images. For each photo that you upload to iCloud, your device computes a digital fingerprint, called NeuralHash. The fingerprint stays the same even if someone makes small changes to a photo. Your device then creates a voucher for your photo that your device can’t understand, but that tells the server whether the uploaded photo matches child abuse material in the database.
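A toy example helps show why a perceptual fingerprint survives small edits. NeuralHash itself is a neural-network-based hash; the much simpler average-hash stand-in below is only an assumption-laden sketch of the same idea, with invented data.

```python
# Toy perceptual "fingerprint" in the spirit of an average hash.
# (NeuralHash is far more sophisticated; this sketch only illustrates
# why a small edit leaves the fingerprint unchanged, unlike an exact
# cryptographic hash, which would change completely.)

def average_hash(pixels):
    # pixels: a flat list of grayscale values in the range 0-255.
    mean = sum(pixels) / len(pixels)
    # One bit per pixel: is it brighter than the image's average?
    return "".join("1" if p > mean else "0" for p in pixels)

image = [10, 200, 30, 220, 15, 210, 25, 230]
tweaked = [p + 2 for p in image]  # a small, imperceptible brightness shift

print(average_hash(image))                             # 01010101
print(average_hash(image) == average_hash(tweaked))    # True
```

Because each bit depends only on a pixel’s relation to the overall average, uniformly brightening the image leaves every bit, and therefore the fingerprint, unchanged.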

If enough vouchers from a device indicate matches to known child abuse images, the server learns the secret keys to decrypt all of the matching photos – but not the keys for other photos. Otherwise, the server cannot view any of your photos.
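The “enough vouchers” threshold can be built from threshold secret sharing, where a key is split into shares and only a quorum of shares reconstructs it. Here is a minimal Shamir secret-sharing sketch; Apple’s actual scheme differs in its details, and all names here are illustrative.

```python
import random

# Minimal Shamir secret-sharing sketch (illustrative only). The
# decryption key is the constant term of a random polynomial; each
# share is one point on that polynomial. Any `threshold` points
# determine the polynomial (and hence the key); fewer points do not.

PRIME = 2**127 - 1  # field modulus

def make_shares(secret, threshold, count):
    # Random polynomial of degree threshold-1 with constant term = secret.
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    def f(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, count + 1)]

def reconstruct(shares):
    # Lagrange interpolation at x = 0 recovers the constant term.
    secret = 0
    for xi, yi in shares:
        num, den = 1, 1
        for xj, _ in shares:
            if xj != xi:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

key = 123456789
shares = make_shares(key, threshold=3, count=5)
print(reconstruct(shares[:3]) == key)  # True: 3 shares suffice
```

With two shares, interpolation yields a line rather than the true degree-2 polynomial, so the reconstructed value is (with overwhelming probability) not the key, mirroring how a server below the match threshold learns nothing.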

Having this matching procedure take place on your device can be better for your privacy than the previous methods, in which the matching takes place on a server – if it’s deployed properly. But that’s a big caveat.

Figuring out what could go wrong

There’s a line in the movie “Apollo 13” in which Gene Kranz, played by Ed Harris, proclaims, “I don’t care what anything was designed to do. I care about what it can do!” Apple’s phone scanning technology is designed to protect privacy. Computer security and tech policy experts are trained to discover ways that technology can be used, misused, and abused, regardless of its creator’s intent. However, Apple’s announcement lacks information to analyze essential components, so it is not possible to evaluate the safety of its new system.

Security researchers need to see Apple’s code to validate that the device-assisted matching software is faithful to the design and doesn’t introduce errors. Researchers also must test whether it’s possible to fool Apple’s NeuralHash algorithm into changing fingerprints by making imperceptible changes to a photo.

It’s also important for Apple to develop an auditing policy to hold the company accountable for matching only child abuse images. The threat of mission creep was a risk even with server-based matching. The good news is that on-device matching offers new opportunities to audit Apple’s actions because the encoded database binds Apple to a specific image set. Apple should allow everyone to check that they’ve received the same encoded database and third-party auditors to validate the images contained in this set. These public accountability goals can be achieved using cryptography.
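One simple cryptographic building block for the “same database for everyone” check is a published digest of the encoded database. The sketch below assumes a hypothetical published digest; it is one way such a check could work, not a description of Apple’s deployed system.

```python
import hashlib

# Sketch of a public-consistency check: if a digest of the encoded
# database is published, any device can verify its local copy matches
# what every other device received. A provider that shipped a
# different database to one user would produce a mismatching digest.
# (The database bytes here are placeholders, not real data.)

def database_digest(encoded_db: bytes) -> str:
    return hashlib.sha256(encoded_db).hexdigest()

published = database_digest(b"encoded-database-v1")   # hypothetical value
on_device = database_digest(b"encoded-database-v1")   # computed locally

print(published == on_device)  # True: this device has the common database
```

Changing even one byte of the database changes the digest entirely, so users can detect a targeted, per-user database without learning anything about its contents.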

Apple’s proposed image-matching technology has the potential to improve digital privacy and child safety, especially if Apple follows this move by giving iCloud end-to-end encryption. But no technology on its own can fully answer complex social problems. All options for how to use encryption and image scanning have delicate, nuanced effects on society.

These delicate questions require time and space to reason through potential consequences of even well-intentioned actions before deploying them, through dialogue with affected groups and researchers with a wide variety of backgrounds. I urge Apple to join this dialogue so that the research community can collectively improve the safety and accountability of this new technology.

Article by Mayank Varia, Research Associate Professor of Computer Science, Boston University

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Story by The Conversation

An independent news and commentary website produced by academics and journalists.
