I wonder what kind of CSAM detection they have. If they’re only relying on hash matching, they’re gonna get fucked by novel genAI CSAM. This is why stuff like fedi-safety exists, which they could use as well
Yes, it’s the bare minimum precaution you can implement. But at the same time, it’s the bare minimum precaution you can implement. There’s really no excuse for not doing it, and it catches a shocking number of images.
Hash matching is really easy to get around. Literally modify one bit of the image, or just re-encode the video, and you’ve gotten around it.
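To illustrate the point, here’s a minimal sketch (using Python’s standard `hashlib`, with a made-up byte string standing in for an image file): flipping a single bit of the input produces a completely different cryptographic hash, so a naive file-hash blocklist no longer matches.

```python
import hashlib

# A toy "image" as raw bytes; a real file behaves exactly the same way.
data = bytearray(b"\x89PNG fake image payload for demo purposes")
original = hashlib.sha256(data).hexdigest()

data[10] ^= 0x01  # flip a single bit somewhere in the file
modified = hashlib.sha256(data).hexdigest()

# The two digests share no meaningful relationship: one flipped bit
# is enough to evade any exact-hash blocklist.
print(original == modified)  # False
```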
https://blog.cloudflare.com/the-csam-scanning-tool/#fuzzy-hashing
They’ll be using perceptual hashes, not file hashes.