This is the same feature that was discussed here several months ago: hashes of photos are computed and compared against a database of known child exploitation images. What has people concerned now is that Apple won't just be scanning images in iCloud, but also those stored locally on users' own devices.
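For anyone curious what "comparing hashes against a database" means in the simplest case, here's a rough sketch in Python. Note the caveats: Apple's system reportedly uses a perceptual hash (NeuralHash/NeuralMatch) rather than exact file hashes, and the directory name and hash values below are made up for illustration.

```python
import hashlib
from pathlib import Path

# Placeholder database: hex digests of known prohibited images (made-up values).
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def file_hash(path: Path) -> str:
    """Return the SHA-256 hex digest of a file, read in chunks."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def scan_photos(photo_dir: Path) -> list[Path]:
    """Return photos whose hashes appear in the known-image database."""
    return [p for p in photo_dir.glob("*.jpg") if file_hash(p) in KNOWN_HASHES]

if __name__ == "__main__":
    matches = scan_photos(Path("Photos"))  # hypothetical photo directory
    print(f"{len(matches)} file(s) matched the database")
```

A real deployment matches on perceptual similarity rather than exact bytes, which is exactly why people worry about false positives and scope creep.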
#cososec
Apple will scan photos stored on iPhones and iCloud for child abuse imagery - The Verge
https://www.theverge.com/2021/8/5/22611305/apple-scan-photos-iphones-icloud-child-abuse-imagery-neuralmatch
This also reminds me of the arguments LEOs make for encryption backdoors.
To be clear, I absolutely support Apple scanning the hashes of everything in iCloud, since Apple owns the servers. I simply think that if scanning is taking place on a personal device, then it should require a court order.