Security researchers reverse-engineered Apple’s recent iOS 17.5.1 update and found that the bug that restored images deleted months or even years ago was a local iOS issue, not a problem with iCloud. […]
This has nothing to do with the Files app, nor with re-indexing of the Photos library. This has to do with fighting CSAM. Apple has started (in this or a previous update) to scan your device, including deleted files, for anything containing nudity (search your Photos for “brassiere”) and adding the matches to your Photos library in a way that keeps them hidden. That way, anything the models detect as nudity is stored in your iCloud database permanently. Apple is doing this because it lets them screen for unknown CSAM material: currently they can only recognize known fingerprints, but doing this allows them (and the other parties with access to your iCloud data) to analyze unknown media.
The bug mentioned here accidentally made those hidden assets visible to the user. The update changed the assets in the library in a way that removed the invisibility flag, hence people noticing old nudes in their library that they cannot delete.
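Whatever one makes of the claim above, the mechanism being described is simple to picture: the Photos library is a SQLite database in which each asset is a row carrying a hidden flag, and flipping that flag is all it takes to surface or bury an asset. A minimal sketch against a toy stand-in for Photos.sqlite (the `ZASSET` table and `ZHIDDEN` column are assumptions; the real schema varies by iOS version, e.g. older releases used `ZGENERICASSET`):

```python
import sqlite3

# Toy stand-in for the Photos library database (Photos.sqlite).
# Table and column names here mirror forensic descriptions of the
# schema but are assumptions; they differ across iOS versions.
db = sqlite3.connect(":memory:")
db.execute(
    "CREATE TABLE ZASSET (Z_PK INTEGER PRIMARY KEY, ZFILENAME TEXT, ZHIDDEN INTEGER)"
)
db.executemany(
    "INSERT INTO ZASSET (ZFILENAME, ZHIDDEN) VALUES (?, ?)",
    [("IMG_0001.HEIC", 0), ("IMG_0002.HEIC", 1), ("IMG_0003.HEIC", 0)],
)

# Assets flagged hidden do not appear in the normal library view,
# but they are plainly visible to anything that reads the database file.
hidden = [row[0] for row in db.execute(
    "SELECT ZFILENAME FROM ZASSET WHERE ZHIDDEN = 1"
)]
print(hidden)  # ['IMG_0002.HEIC']
```

The point of the sketch is that “hidden” is presentation-layer state: clearing `ZHIDDEN` (or whatever the migration equivalent was) would make every such asset reappear at once, which matches the behavior users reported.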
…
And speaking of deleting things: things are never really deleted. The iPhone keeps a record of messages and media you delete inside the KnowledgeC database, which is often used for forensic purposes. Apple is migrating this to the Biome database, which has the benefit of being synchronized to iCloud. Among other things, it is used to feed Siri with information. Anything you type into your devices, and fingerprints of anything you view, are sent to Apple’s servers and saved. Spooky, if you ask me. But the only way we can have useful digital assistants is to give them access to everything; that’s just how it works.

Nudes are meant to persist on iPhone. You’re just not meant to notice.
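The KnowledgeC database the comment refers to really is a staple of iOS forensics: it is a SQLite file whose `ZOBJECT` table logs device events as rows keyed by a stream name. The sketch below uses a toy stand-in; the stream names and the simplified columns are assumptions based on published forensic write-ups, not a verified current schema:

```python
import sqlite3

# Toy stand-in for knowledgeC.db. The real ZOBJECT table keys events
# by a ZSTREAMNAME such as "/app/usage"; the columns here are a
# simplified assumption for illustration.
db = sqlite3.connect(":memory:")
db.execute(
    "CREATE TABLE ZOBJECT ("
    "Z_PK INTEGER PRIMARY KEY, ZSTREAMNAME TEXT, "
    "ZVALUESTRING TEXT, ZSTARTDATE REAL)"
)
db.executemany(
    "INSERT INTO ZOBJECT (ZSTREAMNAME, ZVALUESTRING, ZSTARTDATE) VALUES (?, ?, ?)",
    [
        ("/app/usage", "com.apple.MobileSMS", 700000000.0),
        ("/media/nowPlaying", "some-song", 700000100.0),
        ("/app/usage", "com.apple.mobileslideshow", 700000200.0),
    ],
)

# Activity rows linger here until the database's own retention window
# expires, regardless of what the user deleted in the app itself.
apps = [r[0] for r in db.execute(
    "SELECT ZVALUESTRING FROM ZOBJECT "
    "WHERE ZSTREAMNAME = '/app/usage' ORDER BY ZSTARTDATE"
)]
print(apps)  # ['com.apple.MobileSMS', 'com.apple.mobileslideshow']
```

This is why “deleted” app activity can still be reconstructed in a forensic image: deletion in the app does not touch these event rows.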
I don’t see these quotes in the article
See comment section
It’s not something the user was supposed to see, and it’s not something to be concerned about, because it’s done for protection.
This goes way beyond fingerprinting for CSAM detection. A cache of hidden nudes on the local device is now a target for hackers.
Why is it so hard to just implement a simple delayed Delete function?