Meta introduced its “Made with AI” labeling system across its platforms, including Instagram, Facebook and Threads, to help users tell AI-generated content from the real thing. It can be a useful feature, but despite Meta's good intentions, it seems the labels aren't always accurate.
According to TechCrunch, Meta has been applying its “Made with AI” label to images that weren't created with AI at all. Of course, photos on the internet are rarely untouched, and many have had some kind of alteration. The outlet notes that this isn't simply a matter of fully AI-generated images being mislabeled; real photos that have merely been edited are reportedly getting the label too.
https://techcrunch.com/2024/06/21/meta-tagging-real-photos-made-with-ai/
I believe all edited pictures that use any form of AI should be labeled, even ones that are just cropped. If users edit their pics/vids with a service that uses AI in any form, the result should be labeled as such.
@ecksmc a lot of (phone) _cameras_ claim to use AI. By now it's safer to accept that photos are closer to interpretive dance than an objective reflection of reality.