Craig Federighi Thinks We’re All Confused About Apple’s New iCloud Photo Scanning

“So to be clear, we’re not actually looking for child pornography on iPhones. That’s the first root of the misunderstanding.”
The Wall Street Journal video interview with Craig Federighi

In a video interview today with The Wall Street Journal, Apple’s SVP of software engineering, Craig Federighi, sat down to address the “root misconception” surrounding Apple’s new “Communication Safety in Messages” feature and its iCloud photo scanning for the detection of Child Sexual Abuse Material (CSAM).

Along with admitting widespread confusion surrounding these new policies related to the fight against child pornography, he revealed a number of new details about the algorithm and the auditing process Apple created to carry it out.

If you’re a customer using iCloud Photo Library – which you don’t have to be – but if you’re using iCloud Photo Library to store your photos in the cloud, then what’s happening is a multi-part algorithm where there’s a degree of analysis done on your device, as it uploads a photo to the cloud, so that the cloud can then do the other half of the algorithm, and if and only if you meet a threshold of something on the order of thirty known child pornographic images matching, only then does Apple know anything about your account and know anything about those images.

– Apple SVP Craig Federighi
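To make the “multi-part algorithm” and the roughly-thirty-image threshold easier to picture, here is a deliberately simplified Swift sketch. Every name in it (UploadScanner, knownCSAMHashes, perceptualHash) is hypothetical, and it leaves out the cryptographic blinding in Apple’s actual design, under which the device itself never learns whether an individual photo matched; it only illustrates the idea of matching uploads against a database of known hashes and acting on nothing until a threshold is crossed.

```swift
import Foundation

// Hypothetical illustration only: these names are stand-ins, not Apple APIs,
// and the real system splits the work between device and server so that
// neither side learns individual match results below the threshold.
struct UploadScanner {
    // Conceptual on-device copy of the known-image hash database.
    let knownCSAMHashes: Set<String>
    // Threshold "on the order of thirty" matches before anything is flagged.
    let threshold = 30

    // Stand-in for a perceptual hash of the photo being uploaded.
    // A real perceptual hash is designed to survive resizing and re-encoding.
    func perceptualHash(of photo: Data) -> String {
        String(photo.hashValue)
    }

    // Number of uploads whose hashes appear in the known-hash database.
    func matchCount(for uploads: [Data]) -> Int {
        uploads.filter { knownCSAMHashes.contains(perceptualHash(of: $0)) }.count
    }

    // Only past this point would an account ever come to human review.
    func exceedsThreshold(for uploads: [Data]) -> Bool {
        matchCount(for: uploads) >= threshold
    }
}
```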

Federighi also admitted that Apple handled last week’s controversial announcement poorly by unveiling the two features side by side, which led many to believe they were a single feature.

It’s really clear a lot of messages got jumbled pretty badly in terms of how things were understood. We wish that this would’ve come out a little more clearly for everyone because we feel very positive and strongly about what we’re doing.

In hindsight, introducing these two features at the same time was a recipe for this kind of confusion. By releasing them at the same time, people technically connected them and got very scared – what’s happening with my messages? The answer is: nothing is happening with your messages.

– Apple SVP Craig Federighi

Originally, when we covered this here, we broke it up into a two-parter: the first addressed “Communication Safety in Messages,” which notified parents when their child received or sent a sexually explicit photo; the second tackled the more controversial issue of “enhanced detection of Child Sexual Abuse Material (CSAM),” which would scan all of Apple users’ iCloud photos. We chose to split it up because they appeared to be two separate features, a reading Apple confirms in its newly published FAQ and that Federighi now reiterates in this video interview with The Wall Street Journal.

Although many might have been confused about that part in particular, we don’t think anyone was confused about the controversial side: Apple will in fact scan all photos you choose to store in iCloud. Federighi again confirms this, but frames it this way: unlike many services from companies such as Google, Facebook, and Microsoft, Apple will run the matching on the device rather than entirely in the cloud. “Imagine someone was scanning images in the cloud. Well, who knows what’s being scanned for?” Federighi continued, in reference to remote scans. “In our case, the database is shipped on device. People can see, and it’s a single image across all countries.”

Federighi further elaborated on how Apple will not overstep and expand the database to anything other than illegal CSAM imagery, offering some peace of mind, particularly in countries with restrictive censorship policies.

We ship the same software in China with the same database we ship in America, as we ship in Europe. If someone were to come to Apple – with a request to scan for data beyond CSAM – Apple would say no. But let’s say you aren’t confident. You don’t want to just rely on Apple saying no. You want to be sure that Apple couldn’t get away with it if we said yes. There are multiple levels of auditability, and so we’re making sure that you don’t have to trust any one entity, or even any one country, as far as what images are part of this process.

– Apple SVP Craig Federighi

Apple had previously said that this new system would roll out only in the United States and that it would consider launching in other countries on a case-by-case basis. It now says it will ship the hash database of known CSAM with the operating system in all countries, but that the database will only be used for scanning in the US.
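As an illustration of what shipping one identical database everywhere could make auditable, here is a hedged Swift sketch: if the on-device database file really is the same in every region, anyone could in principle digest their copy and compare it against a published reference value. The file location, the published root hash, and the idea that a plain SHA-256 of the file is the right check are all assumptions made for the sake of the example, not Apple’s actual audit mechanism.

```swift
import Foundation
import CryptoKit

// Hypothetical audit check: compare a digest of the locally shipped database
// against a reference value published for everyone. If two regions produced
// different digests, the "single database across all countries" claim would
// be externally detectable. `databaseURL` and `publishedRootHash` are
// illustrative placeholders.
func databaseMatchesPublishedRoot(databaseURL: URL, publishedRootHash: String) throws -> Bool {
    let databaseBytes = try Data(contentsOf: databaseURL)
    let digest = SHA256.hash(data: databaseBytes)
    let hex = digest.map { String(format: "%02x", $0) }.joined()
    return hex == publishedRootHash.lowercased()
}
```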

During the WSJ interview, it’s further clarified that an independent auditor will verify that flagged pictures are in fact what they appear to be and not just a keepsake photo of your child in the bath. Apple has said before that a single match won’t trigger a red flag, a measure intended to prevent false positives. Instead, an account is flagged only after it crosses a specific threshold. For obvious reasons, Apple has declined to publicize that exact number, but Federighi says it’s “on the order of 30 known child pornographic images.”

He continued with a claim that the iPhone and iPad’s device-level scanning is going to help security experts verify that Apple will use the new system responsibly, and for everyone’s sake, let’s hope that claim is correct.

Check out Apple’s latest PDF, published today here, which further details Federighi’s talking points, and watch the full WSJ video interview below:
