Some of you may have noticed a lot of people freaking out about CSAM: a bunch of communities closing, instances restricting registrations, turning off image uploads, or shutting down completely. It’s a bit chaotic.

Fortunately, your admin has been fighting this fight for the past year, so I already have some tools developed to help me out. I repurposed one of them to cover Lemmy images.

Using this approach, I’ve now turned on automatic scanning of new uploads.

What this means for you is that occasionally you will upload an image for a post and it will stop working after a bit. C’est la vie. Just upload something else. Changing format or slightly altering the image won’t help you.
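
If you’re curious how this kind of thing works in general, here’s a rough sketch of the idea. To be clear, this is not my actual code: the model, label prompts, paths, and threshold below are placeholders for illustration only.

```python
# Illustrative sketch only (not the real scanner): score each new upload with a
# zero-shot CLIP classifier and quarantine anything above a threshold.
# UPLOAD_DIR, QUARANTINE_DIR, LABELS and THRESHOLD are all hypothetical.
from pathlib import Path
import shutil

import torch
from PIL import Image
from transformers import CLIPModel, CLIPProcessor

UPLOAD_DIR = Path("/srv/pictrs/new")          # hypothetical location of fresh uploads
QUARANTINE_DIR = Path("/srv/pictrs/flagged")  # hypothetical quarantine location
LABELS = ["an innocuous image", "an image that should be flagged"]  # placeholder prompts
THRESHOLD = 0.8  # made-up cutoff: higher means fewer false positives but more misses

model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

def flagged_probability(path: Path) -> float:
    """Probability CLIP assigns to the 'flagged' label for a single image."""
    image = Image.open(path).convert("RGB")
    inputs = processor(text=LABELS, images=image, return_tensors="pt", padding=True)
    with torch.no_grad():
        logits = model(**inputs).logits_per_image  # shape: (1, len(LABELS))
    return logits.softmax(dim=-1)[0, 1].item()

def scan_once() -> None:
    """One pass over pending uploads; a real service would loop or watch for new files."""
    QUARANTINE_DIR.mkdir(parents=True, exist_ok=True)
    for path in sorted(UPLOAD_DIR.glob("*")):
        if path.is_file() and flagged_probability(path) >= THRESHOLD:
            shutil.move(str(path), str(QUARANTINE_DIR / path.name))

if __name__ == "__main__":
    scan_once()
```

Because the check looks at the image content itself rather than a hash, re-encoding or slightly editing the file doesn’t change the outcome, which is why altering the image won’t help.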

Also, sometimes you might see missing thumbnails on posts from other communities. Those were the cached thumbnails hosted by us. The original images should still work in those cases.

Unfortunately this sort of AI scanning is not perfect and due to the nature of the beast, it will catch more false positives but to an acceptable degree. I find that this is OK for a small social network site run as a hobby project.
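
To give a feel for that trade-off, here’s a toy example (the numbers are completely made up) of how moving the flagging threshold shifts the balance between false positives and missed detections:

```python
# Purely illustrative, made-up scores: higher = "more likely to be flagged".
benign_scores = [0.02, 0.10, 0.35, 0.55, 0.62, 0.71]  # hypothetical scores for benign images
bad_scores = [0.58, 0.66, 0.80, 0.91, 0.97]           # hypothetical scores for images that should be caught

for threshold in (0.5, 0.7, 0.9):
    false_positives = sum(s >= threshold for s in benign_scores)
    missed = sum(s < threshold for s in bad_scores)
    print(f"threshold {threshold}: {false_positives} false positives, {missed} missed")
```

A lower threshold catches more but flags more legitimate images; a higher one is gentler on false positives but lets more slip through. For this use case, erring on the side of false positives is the safer choice.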

Cool? Cool.

  • aldalire@lemmy.dbzer0.com · 1 year ago

    Behavior like this baffles me. I wonder what they stand to gain by spamming CSAM? Purely destructive, psychopathic behavior :|

    • db0@lemmy.dbzer0.com (OP, mod) · 1 year ago

      Most likely 4channers. They do stuff like that constantly and laugh at the panic.

        • db0@lemmy.dbzer0.com (OP, mod) · 1 year ago

          We (thankfully) haven’t been targeted yet. This measure is precautionary, due to the issues affecting large instances like lemmy.world, which could cause such content to federate to us.

    • ShittyRedditWasBetter@lemmy.world · 1 year ago

      They know these servers are being run by amateurs who lack the capacity to deal with it at any real scale or for the long term.

      So they are being targeted.

      That being said, there has been a large vocal minority who get shit on every time they point out that Lemmy and its top servers are 100% not prepared to protect privacy, nor have the ability to respond to this shit. This was predicted, but the community is absurdly against paying anybody to run a proper fucking server, employing professionals and giving a fuck about how to properly deal with CSAM. “I’ll donate rather than subscribe.” Sure you will, Chad, sure you will. At best you’ll donate $20 total, which will cover maybe a month of compute and bandwidth.

      This place will self-destruct in a few months when volunteers are sick of looking through fucked-up images for no pay or even professional mental health support, which is like day fucking 0 shit for anyone dealing with CSAM. Even fucking Reddit has a responsive team to come help out mods with this shit.

      • theJWPHTER88@kbin.social · 1 year ago

        You have a common-sense point there, so what should we (if not all Lemmy and Kbin online rafts and their denizens, then most of them) do at this point?

        Have at least one competent, small-enough mental health team per instance or three, whose primary job functions like that of an active mental health hotline, albeit geared towards the Fediverse citizen in need: admins, moderators, community builders, coders, creatives, and lurkers alike. Make sure they don’t overstep their bounds too much, and that they also connect with others representing other instances, keeping in mind the right to privacy and all other related human rights, so the ecosystem stays healthy at all facets, not just online but also in the “outside world” sense.

        For that, diversity, genuine compassion and science-and-humanity-based objectivity are also needed for those willing to take up those roles.

        • ShittyRedditWasBetter@lemmy.world · 1 year ago

          I believe a non-profit model, e.g. Mozilla, is the answer. Charge $14/mo right out of the gate, publish where the money is going each year, and be 100% unapologetic about charging for services.

            • ShittyRedditWasBetter@lemmy.world · 1 year ago

              Then enjoy Reddit or your shitty servers going down each week, a healthy dose of kiddie porn, and admins free to do whatever the fuck they want with your personal data.

                • ShittyRedditWasBetter@lemmy.world · 1 year ago

                  I don’t think it is. You’re talking $200k total comp for senior AWS folks, $250k+ for a credible CIO, overhead, etc. Running tech is expensive, which is why you see it so heavily subsidized by ad revenue.

  • Mojojojo1993@lemmy.world · 1 year ago

    Catch and hunt these degenerates. I’d hope law enforcement would get involved. If these sick fucks have these images and have posted them, hopefully they can be tracked down and imprisoned.

  • Rentlar · 1 year ago

    Hellllllllll yeaaaaaaaaaaahhhh, my pal db0. Nip that shit in the bud. Hope it’s not too energy-intensive and doesn’t use too much computational power.

  • Dizzy Devil Ducky@lemm.ee · 1 year ago

    …will catch more false positives but to an acceptable degree.

    I’d much rather see it catching plenty of false positives than not because it at least shows it’s working as it should.

  • moitoi@lemmy.dbzer0.com · 1 year ago

    Thanks for the great work!

    I was asking myself if it could be a sort of astroturfing. I don’t know why, but it looks like that to me.

  • nyakojiru@lemmy.dbzer0.com · 1 year ago

    This is the perfect moment/place for Interpol and other organizations to catch some pedo motherfuckers. Those people are absolute human trash. Also, expect heavy vigilance from big eyes over the Fediverse.

  • hemko@lemmy.dbzer0.com · 1 year ago

    Very cool! Personally, I’d even consider blocking any pictures of children on the website, as those are almost always posted by adults without real consent, and the rest of the time by children who don’t understand the consequences of uploading their photos to the internet.

    But this is very cool!

  • lambalicious@lemmy.sdf.org · 1 year ago

    and due to the nature of the beast, it will catch more false positives but to an acceptable degree

    1. What is this “acceptable degree”? Where is it documented?
    2. What is the recourse for the uploader in case of a false positive? And no, I don’t mean “upload something else”; I mean, what do you say to “my legit content is being classified by a shared internet tool as CSAM, of all things”?