A controversial European Union legislative proposal to scan the private messages of citizens in a bid to detect child sexual abuse material (CSAM) is a…
Is there a source stating that they’re going to require these?
Unfortunately, I couldn’t find a source stating it would be required. AFAIK it’s been assumed that they would use perceptual hashes, since that’s what various companies have been suggesting/presenting. Like Apple’s NeuralHash, which was reverse-engineered. It’s also the only somewhat practical approach, since exact matches would easily be circumvented by changing a single pixel or mirroring the image.
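To illustrate the difference, here’s a toy sketch (my own example, not taken from the proposal or from any vendor’s system): a cryptographic hash over the raw bytes changes completely when a single pixel is nudged, while a simple 8x8 average hash barely moves, which is roughly the property perceptual hashes are built around.

```python
# Toy comparison: exact (cryptographic) hash vs. a simple perceptual "average hash".
# This is an illustrative sketch only; real systems (e.g. NeuralHash) use learned features.
import hashlib

def exact_hash(pixels):
    """SHA-256 over the raw pixel bytes: any single-pixel change alters it completely."""
    return hashlib.sha256(bytes(pixels)).hexdigest()

def average_hash(pixels):
    """Toy aHash: one bit per pixel, set if the pixel is brighter than the image mean."""
    mean = sum(pixels) / len(pixels)
    return [1 if p > mean else 0 for p in pixels]

def hamming(a, b):
    """Number of differing bits between two hashes (0 = perceptually identical)."""
    return sum(x != y for x, y in zip(a, b))

# A fake 8x8 grayscale "image" (values 0-255) and a copy with one pixel tweaked slightly.
original = [(i * 37) % 256 for i in range(64)]
tweaked = list(original)
tweaked[10] = (tweaked[10] + 3) % 256  # change a single pixel by a tiny amount

print(exact_hash(original) == exact_hash(tweaked))              # False: exact match is broken
print(hamming(average_hash(original), average_hash(tweaked)))   # prints 0 here: perceptual hash doesn't move
```

The takeaway: matching on exact hashes is trivially defeated by tiny edits, which is why any serious scanning scheme would lean on perceptual or learned hashes instead.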
Patrick Breyer’s page on Chat Control has a lot of general information about the EU’s proposal.
Stupid regulation, honestly. Exact matches are implementable, but anything beyond that… Aren’t they basically banning e2ee at this point?
Now I see why Signal will shut down in the EU.