I don’t know if you need this info, but I was pretty disturbed to come across child pornography unexpectedly in a casual community. Thankfully it didn’t happen on SLRPNK.net directly, but if anyone has advice besides just leaving the community in question, let me know. I also wanted to sound an alarm to make sure we have measures in place to guard against this.

    • ShadowA · 10 months ago

      Mods and admins care, but we’re not all online all the time.

      • Rooki@lemmy.world · 10 months ago

        You are right on point! We all do this in our free time, and we are searching for admins who are available in a timezone we don’t have covered yet.

        We are open to anyone interested in assisting us; just send us an email with some details about yourself and when you can be active on lemmy.world.

    • Andy@slrpnk.netOP · 10 months ago

      That’s pretty shocking.

      What tools are available to us to manage this?

      • poVoq@slrpnk.netM · 10 months ago

        The best tool currently available is lemmy-safety, an AI image scanner that can be configured to check images on upload or to regularly scan the storage and remove likely CSAM images.

        It’s a bit tricky to set up, as it requires a GPU in the server and works best with object storage, but I plan to complete the setup for SLRPNK sometime this year.
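
        As a rough illustration of the “regularly scan the storage” half, here is a minimal sketch of what such a pass over an S3-compatible bucket could look like; the bucket name, endpoint, and the looks_like_csam() helper are placeholders, not lemmy-safety’s actual interface:

        ```python
        # Hypothetical periodic scan of a pict-rs object-storage bucket.
        # BUCKET, ENDPOINT, and looks_like_csam() are placeholders;
        # lemmy-safety's real configuration and model are not shown here.
        import boto3

        BUCKET = "lemmy-pictrs"               # assumed bucket name
        ENDPOINT = "https://s3.example.org"   # assumed S3-compatible endpoint

        def looks_like_csam(image_bytes: bytes) -> bool:
            """Stand-in for the GPU-backed classifier; always returns False here."""
            return False

        def scan_bucket() -> None:
            s3 = boto3.client("s3", endpoint_url=ENDPOINT)
            paginator = s3.get_paginator("list_objects_v2")
            for page in paginator.paginate(Bucket=BUCKET):
                for obj in page.get("Contents", []):
                    key = obj["Key"]
                    body = s3.get_object(Bucket=BUCKET, Key=key)["Body"].read()
                    if looks_like_csam(body):
                        # Delete the object; the pict-rs record would still
                        # need separate cleanup.
                        s3.delete_object(Bucket=BUCKET, Key=key)
                        print(f"removed {key}")

        if __name__ == "__main__":
            scan_bucket()
        ```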

        • silence7@slrpnk.net · 10 months ago

          This is probably the best option; in a world where people use ML tools to generate CSAM, you can’t depend on visual hashes of known-problematic images anymore.
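
          For contrast, a hash-list check only flags near-duplicates of images that have already been catalogued somewhere. A minimal sketch using a generic perceptual hash (the file name and distance threshold are assumptions, not any particular hash database’s format):

          ```python
          # Generic perceptual-hash check against a list of known image hashes.
          # A freshly generated image has no entry in such a list, which is the
          # limitation described above.
          from PIL import Image
          import imagehash

          KNOWN_HASHES_FILE = "known_hashes.txt"  # assumed: hex hashes, one per line
          MAX_DISTANCE = 5                        # Hamming-distance match threshold

          def load_known_hashes(path: str) -> list[imagehash.ImageHash]:
              with open(path) as f:
                  return [imagehash.hex_to_hash(line.strip()) for line in f if line.strip()]

          def matches_known_image(image_path: str, known: list[imagehash.ImageHash]) -> bool:
              h = imagehash.phash(Image.open(image_path))
              return any(h - k <= MAX_DISTANCE for k in known)
          ```

          Hash matching is still cheap and useful for catching re-uploads of known material, so in practice it complements rather than replaces classifier-based scanning.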