
This is the same feature that :jester: added here several months ago, where file hashes are scraped and compared to a database of known child exploitation images.
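The core mechanism described above is hash matching: compute a hash of each file and look it up in a database of known-bad hashes. Apple's system actually uses "NeuralHash," a perceptual hash designed to match visually similar images; as a simplified, hypothetical stand-in, exact cryptographic hash matching looks like this (the blocklist contents and function names here are invented for illustration):

```python
import hashlib

# Hypothetical blocklist of known-bad hashes (hex SHA-256 digests).
# The real system uses a perceptual hash (NeuralHash) so that resized or
# re-encoded copies still match; byte-exact SHA-256 is only a sketch.
KNOWN_HASHES = {
    # SHA-256 digest of the bytes b"test", standing in for a known image
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def file_hash(data: bytes) -> str:
    """Return the hex SHA-256 digest of a file's bytes."""
    return hashlib.sha256(data).hexdigest()

def flagged_before_upload(data: bytes) -> bool:
    """Check a file's hash against the blocklist, e.g. prior to upload."""
    return file_hash(data) in KNOWN_HASHES

print(flagged_before_upload(b"test"))      # True: digest is in the blocklist
print(flagged_before_upload(b"harmless"))  # False: no match
```

Note that exact hashing only ever matches byte-identical files, which is exactly why a perceptual hash is used in practice, and also why false positives become possible.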

What has people concerned about Apple's move is that it won't just cover images in your iCloud, but also those stored locally on users' own devices.

Apple will scan photos stored on iPhones and iCloud for child abuse imagery - The Verge
theverge.com/2021/8/5/22611305

^ Heading off the people who might say, "you have nothing to fear if you have nothing to hide":

Databases are not perfect. False positives happen. It's one thing to make that mistake and manually review it on a cloud account. If a personal device gets flagged, though, that can lead other places.

This also reminds me of the arguments LEOs make for encryption backdoors.

To be clear, I absolutely support Apple scanning the hashes of everything in iCloud, since Apple owns the servers. I simply think that if scanning is taking place on a personal device, then it should require a court order.

@voltronic "Apple is using a "NeuralHash" system to compare known CSAM images to photos on a user's iPhone before they're uploaded to iCloud." While they are scanning them locally, I read that as being done "when they're about to be uploaded." If you're not connecting your phone to iCloud for photo sharing, you won't be "uploading them," so there won't be a scan.

macrumors.com/2021/08/05/secur

@sfleetucker
Hmm, that may be. Still, it raises the question of why they need to scan them locally on the user's device at all.

@sfleetucker
Follow the last link in the article you posted about their 2019 privacy policy update. Even back then, it said they were scanning files "uploaded to iCloud."

HOW they were / are doing that while preserving end-to-end encryption (E2EE) is not clear, however.

@voltronic Yeah, it might be that as they increase their privacy protections, they try to do more and more "locally" on your device. I think that's one reason they're moving Siri's processing on-device.

@voltronic I don’t think I like this - not that I have sketchy shit on my phone, but this is troubling

@voltronic Well, GDI, Apple.

I understand the idea; but Apple was one of the few companies that actually took user privacy seriously!

While I want all child predators caught - this is a worrying precedent from the company that refused to decrypt phones from even known terrorists!

