Several months ago Beehaw received a report about CSAM (Child Sexual Abuse Material). As an admin, I had to investigate in order to verify the report and take the next steps. This was the first time in my life that I had ever seen images like these. Not to go into great detail, but the images were of a very young child performing sexual acts with an adult.

The explicit nature of these images, and the gut-wrenching shock, horror, disgust, and helplessness, were overwhelming. Those images are burnt into my mind and I would love to get rid of them, but I don't know how, or if it is even possible. Maybe time will take them out of my mind.

In my strong opinion, Beehaw must seek a platform where NO ONE will ever have to see these types of images: a software platform that makes it nearly impossible for Beehaw to host CSAM in any form.

If the other admins want to give their opinions about this, then I am all ears.

I simply cannot move forward with the Beehaw project unless this is one of our top priorities when choosing where we go next.
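For what it's worth, the core of what I'm asking for already exists as a pattern: platforms scan stored or uploaded media against blocklists of known-bad hashes and quarantine matches automatically, so no human ever has to look at the material. Below is a minimal, generic Python sketch of that idea. To be clear, this is an illustration only, not how lemmy-safety or any other specific tool actually works, and the paths and file names in it are hypothetical placeholders.

```python
# Generic illustration of hash-blocklist scanning. NOT how lemmy-safety
# works; the paths and file names below are hypothetical placeholders.
import hashlib
from pathlib import Path

MEDIA_DIR = Path("/srv/pictrs/files")      # hypothetical media store
BLOCKLIST = Path("known_bad_hashes.txt")   # hypothetical list: one hex SHA-256 per line

def sha256_of(path: Path) -> str:
    """Hash the file in chunks so large images never load fully into memory."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

def scan(media_dir: Path, bad_hashes: set[str]) -> list[Path]:
    """Return every stored file whose digest appears in the blocklist."""
    return [
        p for p in media_dir.rglob("*")
        if p.is_file() and sha256_of(p) in bad_hashes
    ]

if __name__ == "__main__":
    bad_hashes = {
        line.strip().lower()
        for line in BLOCKLIST.read_text().splitlines()
        if line.strip()
    }
    for p in scan(MEDIA_DIR, bad_hashes):
        # A real system would quarantine the file automatically;
        # this sketch only reports the match.
        print(f"FLAGGED: {p}")
```

A real deployment would use perceptual hashes (e.g. PhotoDNA or PDQ) rather than exact SHA-256 digests, since re-encoded or resized copies won't match byte-for-byte, but the quarantine-before-any-human-sees-it flow is the same.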

  • Rentlar@beehaw.org · 1 year ago

    I am very sorry you had to go through such a terrible experience.

    It is my sincerest hope that you will be able to find a workable solution to this problem, from Lemmy or elsewhere.

    I am (and have been) okay with the admins taking any action necessary to accomplish the goals of the Beehaw project. Removing image hosting, implementing lemmy-safety, severely restricting federation: do whatever you need.

    And please, also do whatever you need to take care of yourself, even if that means taking a break from the site.