This is the same feature that was discussed here several months ago, where file hashes are computed and compared against a database of known child exploitation images. The thing that has people concerned in Apple's case is that it won't just be images in your iCloud that get checked, but also those stored locally on users' own devices. (Rough sketch of the hash-matching idea after the link.)
#cososec
Apple will scan photos stored on iPhones and iCloud for child abuse imagery - The Verge
https://www.theverge.com/2021/8/5/22611305/apple-scan-photos-iphones-icloud-child-abuse-imagery-neuralmatch
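For anyone unfamiliar with hash-matching in general, here's a minimal Python sketch of the basic idea. It uses an ordinary SHA-256 digest and a made-up blocklist entry; Apple's actual "NeuralHash" is reportedly a perceptual hash designed to survive resizing and recompression, matched on-device against a blinded database rather than a plain set lookup, so treat the names and details below as illustrative only.

import hashlib

# Hypothetical blocklist of known-image digests (placeholder value only;
# the real database comes from child-safety organizations, and NeuralHash
# is a perceptual hash, not a cryptographic digest like SHA-256).
KNOWN_BAD_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def file_hash(path):
    """Compute a SHA-256 digest of the file at `path`, reading in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def is_flagged(path):
    """Return True if the file's digest appears in the blocklist."""
    return file_hash(path) in KNOWN_BAD_HASHES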
@voltronic "Apple is using a "NeuralHash" system to compare known CSAM images to photos on a user's iPhone before they're uploaded to iCloud." While they are scanning them locally, I read that as being done "when they're bout to be uploaded." If you're not connecting your phone to iCloud for photo sharing you won't be "uploading them" so there won't be a scan.
https://www.macrumors.com/2021/08/05/security-researchers-alarmed-apple-csam-plans/
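In other words, the check is described as part of the upload path rather than a free-standing sweep of the photo library. A rough Python sketch of that reported flow; all function names here are invented for illustration:

from pathlib import Path

def matches_known_hash(photo: Path) -> bool:
    """Stand-in for the on-device NeuralHash lookup (stubbed for illustration)."""
    return False

def upload_to_icloud(photo: Path, flagged: bool) -> None:
    """Stand-in for the actual iCloud Photos upload."""
    pass

def sync_photo(photo: Path, icloud_photos_enabled: bool) -> None:
    """Sketch of the reported behavior: the check runs only on the upload path."""
    if not icloud_photos_enabled:
        # iCloud Photos is off: the photo never enters the upload path,
        # so it is never checked.
        return
    flagged = matches_known_hash(photo)
    # Per the reports, a match produces a cryptographic "safety voucher"
    # that accompanies the upload; the photo is uploaded either way.
    upload_to_icloud(photo, flagged=flagged)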
@sfleetucker Hmm, that may be. Still, it raises the question of why they need to scan them locally on the user's device at all.
@voltronic because they're encrypted on Apple's servers and can't be scanned there. https://support.apple.com/en-us/HT202303
@voltronic Yeah, it might be that as they increase their privacy protections, they try to do more and more "locally" on your device. I think that's one reason they're moving more of Siri's processing on-device.
@voltronic FYI: https://www.macrumors.com/2021/08/05/apple-csam-detection-disabled-icloud-photos/