Apple Will Scan Your iPhone Photos for Child Abuse and Report You

Today, Apple announced two new protections related to child safety that will roll out by the end of the year. The first introduces new safety features in Messages, which is covered in a separate post.

The second relates to Child Sexual Abuse Material (CSAM) — images or videos that depict sexually explicit activity involving a child.

The basics: Apple will scan and detect CSAM images that are stored in iCloud Photos, and report them to the National Center for Missing and Exploited Children and/or law enforcement authorities.

Effectively, your iPhone will scan images in your iCloud Photos against hashes of known photos in the National Center for Missing and Exploited Children’s database, which is not publicly accessible. If there are matches and their number exceeds Apple’s predetermined threshold, the account goes into manual review, where an Apple employee determines whether the images and the user need to be reported.

Apple users (also called clients) store photos in iCloud. Apple would like to detect if any of these photos belongs to NCMEC’s database of CSAM photos. If the number of these matches exceeds some pre-determined threshold, indicating systematic presence of CSAM, Apple will report the user to appropriate authorities.

– Apple
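To make that rule concrete, here is a minimal Swift sketch of the decision described above. The `ScanResult` type and the numbers are hypothetical; Apple hasn’t published its actual threshold or any implementation details.

```swift
// Hypothetical sketch of the threshold rule: individual matches do nothing
// on their own; only crossing the (undisclosed) threshold flags the account
// for manual review. Type, field names, and values are invented.
struct ScanResult {
    let matchCount: Int   // photos whose hashes matched the known CSAM set
    let threshold: Int    // Apple's cut-off, which it has not disclosed

    var needsManualReview: Bool { matchCount >= threshold }
}

let result = ScanResult(matchCount: 3, threshold: 30)  // example values only
print(result.needsManualReview ? "flag for human review" : "no action")
```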

Apple says this is all done with user privacy in mind. The scans happen on-device, and the results are only reviewed by Apple if matches are made and the threshold is met — though Apple doesn’t disclose what that internal threshold is.

Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the unreadable set of known CSAM hashes. This matching process is powered by a cryptographic technology called private set intersection, which determines if there is a match without revealing the result. Private set intersection (PSI) allows Apple to learn if an image hash matches the known CSAM image hashes, without learning anything about image hashes that do not match. PSI also prevents the user from learning whether there was a match.

– Apple
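As a rough mental model of the matching step (and only that; this is not the actual private set intersection protocol, which additionally hides non-matching hashes from Apple and hides the result from the device), here is a toy Swift sketch that compares salted digests of perceptual hashes against a known set. Every name and value below is invented.

```swift
import Foundation
import CryptoKit

// Toy stand-in for the matching step: compare salted SHA-256 digests of
// perceptual hashes against a set of known digests. Unlike real private
// set intersection, this version reveals which photos matched to whoever
// runs the comparison.
let salt = "example-public-salt"  // hypothetical value

func digest(_ perceptualHash: String) -> String {
    let hashed = SHA256.hash(data: Data((salt + perceptualHash).utf8))
    return hashed.map { String(format: "%02x", $0) }.joined()
}

// Hypothetical database of known-CSAM perceptual hashes, stored as digests.
let knownDigests: Set<String> = Set(["known-hash-1", "known-hash-2"].map(digest))

// Hypothetical perceptual hashes computed from a user's photo library.
let devicePhotoHashes = ["beach-photo-hash", "known-hash-2", "receipt-hash"]

let matchCount = devicePhotoHashes.map(digest).filter(knownDigests.contains).count
print("\(matchCount) of \(devicePhotoHashes.count) photos matched the known set")
```

In the real system, per Apple’s description, the device never learns whether anything matched; only Apple learns, and only once the threshold is crossed.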

This is a good thing, right? We can all agree that child abuse — of any form — is bad, and Apple is taking respectable measures to protect children.

But I do have a problem with this. Hear me out…

The real privacy concerns:

I do NOT want to sound like Alex Jones here, but this sort of technology worries me. Of course we all agree that children should be protected and that Apple’s announcements today are for the greater good, but is that really the point? Is the goal to start with children because, if it’s about protecting children, it can’t be argued against on moral grounds?

Is this a gateway to something more? How long before Apple announces that they’re just going to start scanning your iCloud Photos for “drug use” or “pirated movies / music”?

Once these doors are open, it’s hard to close them again, and I’m worried that allowing Apple to do this opens those doors… But what are we supposed to do? NOT allow Apple to do this? Obviously, we can’t say anything or argue against protecting children — and maybe that’s the point. Because if we argue against it, it appears we either have something to hide or condone child abuse. It’s a catch-22.

I’m already seeing people on Twitter shaming or guilting users who argue against this technology.

“What? Are you not against pedophiles!?”

“How could you support child abuse!?”

This way of using guilt to silence valid opinions is naive, at best.

Of course, I don’t support ANY type of abuse against children — that’s not what I’m arguing. I’m arguing that accepting this technology may lead to deeper privacy issues and open the door to further scans for things far less nefarious, especially in the wrong hands.

Apple says this feature will be implemented in iOS 15 later this year.
