• Ragdoll X@lemmy.world
    6 months ago

    This did happen a while back, with researchers finding thousands of hashes of CSAM images in LAION. Still, IIRC it was something like a fraction of a fraction of 1%, and the images themselves weren’t actually retrievable through the dataset because they had already been removed from the internet.

    You could still make AI CSAM even if you were 100% sure that none of the training images included it, since that’s what these models are made for: combining concepts without needing to have seen them together before. If you hold the AI’s hand enough with prompt engineering, textual inversion and img2img, you can get it to generate pretty much anything. That’s the power and danger of these things.