• Corkyskog@sh.itjust.works · 7 months ago

    It’s not a difficult test. If a person can’t reasonably distinguish it from an actual child, then it’s CSAM.

    • Phoenixz · 7 months ago

      Just to play devil’s advocate:

      What about hentai where little girls get fondled by tentacles? (Please please please don’t make this be my most upvoted post)

      • bitfucker@programming.dev · 7 months ago

        Yeah, no. The commenter said an actual child, not a cartoon one. That’s a different discussion entirely, and a good one too, because artwork is part of freedom of expression. Artwork CAN be made without hurting or abusing anyone. We know full well that humans have the creative capability to come up with something without that something existing beforehand, which implies a human can come up with CSAM without ever having seen CSAM.

        • Adalast@lemmy.world · 7 months ago

          And yet, it is still actually illegal in every state. CSAM of any kind, in any medium, is legally identical. Hand-drawn stick figures with ages written under them are enough for some judges/prosecutors.

          Honestly, I am of the firm belief that the FBI should set up a portal that provides account-bound access to their seized materials. This may seem extreme and abhorrent, but it would provide MANY benefits.

          • It would eliminate the black market by providing free, legal access to already-existing materials, so no more children would be harmed in the production of “new material”.
          • Accounts could be restricted to those actively pursuing mental health treatment for their illness. It is a mental illness long before it is a crime.
          • The FBI could monitor who is accessing it and from where, and coordinate with mental health providers to give better treatment.
          • They could compile statistical data on prevailing access patterns, building a better analytical understanding of how those with the illness behave so they can better police those who still use extra-legal avenues.

          Always keep in mind that this is a mental illness. Often it is rooted in the person’s own traumatic past. Many were themselves victims of sexual abuse as children and are as much victims as the children they abuse. I am not, in ANY way, absolving them of the harm they have done, and they absolutely should repent for it. What I am attempting to articulate is that we need, as a society, to avoid vilifying them into bogeymen so we can justify hate and violence.

          They are people, they are mentally ill, they can be treated, and they can be healthy. It is no different from something like BPD, malignant narcissism, or Munchausen by proxy. All can do real harm, and all should face the consequences of that harm, but those three are so normalized at this point that unless the abuse results in death, most people will handwave the actions and push for treatment. I feel we have grown too lax on these (and others) while being far too harsh elsewhere. All mental illnesses deserve ardent and effective treatment.

          • bitfucker@programming.dev · edited · 7 months ago

            Nay, I was just replying to you in the context of the original commenter, who was talking about real-life children, so your point about hentai is irrelevant to them. I do know the legal definition of CSAM concerns the end result, not the act, and hence why I said yours is a different discussion entirely.

            Edit: Sorry, I read it again and I don’t think I got my point across very well. Your point about artwork falls into the debate about the definition of CSAM. Why? Because the word “abuse” implies an abusive act being done, but the current definition says that only the end result matters. This poses a problem, in my opinion, because it touches on freedom of expression: by the current definition, art has its limits.

        • Phoenixz · 7 months ago

          Yeah, but then it gets very messy and complicated very fast. What about photo-perfect AI pornography of minors? When and where do you draw the line?

    • bitfucker@programming.dev · edited · 7 months ago

      What he probably means is that for a “photo”, an actual act of photography must have been performed, while “artwork” can be fully digital. Legal definitions aside, the two acts are indeed different even if the resulting images are bit-for-bit identical: a computer can output something akin to a photograph even though no act of photography has taken place. I say “legal definitions aside” because I know the legal definition only looks at the resulting image. Just trying to convey the commenter’s words better.

      Edit to clarify a few things.

    • Madison420@lemmy.world · 7 months ago

      This would also outlaw “teen” porn, since those performers explicitly try to look more childlike, as well as models who merely appear to be minors.

      I get why people think it’s a good thing, but all censorship has to be narrowly tailored to content, lest it be too vague or overly broad.

      • Corkyskog@sh.itjust.works · 7 months ago

        And nothing was lost…

        But in seriousness: as you said, they are models who are in the industry, verified, etc. It’s not impossible to maintain a white-list of actors, and if anything there should be more scrutiny on the unknown “actresses” portraying teenagers…

        • Madison420@lemmy.world · 7 months ago

          Except jobs, dude. You may not like their work, but it’s work. And that law ignores verified age, which is a not-insignificant part of my point…