• Gigan@lemmy.world

    If they’re going to go this way, I don’t think it should be limited to just porn. There are plenty of ways you could ruin someone’s life without a deepfake being sexually explicit.

    • Deestan@lemmy.world

There already are a lot of laws covering that. This one covers an additional angle, where people create a deepfake without provably publishing it, the intent being that showing it to friends and verbally threatening to “leak” them should be easier to prosecute.

      If you create a deepfake and share it, you’re slapped with two crimes.

      • FaceDeer@fedia.io

        the intent being that showing it to friends and verbally threatening to “leak” them should be easier to prosecute.

        That’s blackmail, which is already illegal.

        • nogooduser@lemmy.world

          Using a mobile phone while driving has always been illegal if you could argue that it was dangerous driving or driving without due care and attention. They made a law specifically saying that using a mobile phone without hands free is illegal anyway. This makes it easier to prosecute because you don’t need to argue that they were driving dangerously or without due care.

          I imagine that this law has the same intent of making this specific act illegal to prevent them having to argue that it fits another crime.

    • billbasher@lemmy.world

Yeah, the way people can recreate someone “in need of assistance” to trick family or associates is really scary, especially for people who aren’t exactly tech savvy. That seems to me to be a worse crime than an explicit video that is pretty obviously doctored.

  • usualsuspect191

    I wonder what happens when it just accidentally looks like someone but was intended to be a fictional person. Also, how much can you base it on a real person before it’s considered a deep fake of that person? Would race-swapping be enough to make it a “new” person so it’s not illegal anymore? My intuition is that just eye colour or something wouldn’t be enough, but it’s a sliding scale where the line must be drawn somewhere even if it’s a fuzzy line.

    What about an AI generated mashup of two people like those “what the child would look like” pictures back in the day. Does that violate both people or neither?

    What about depicting a person older than they are now? That’s technically not somebody that exists, but might in the future.

    What if you use AI but make it look like it’s hand-drawn or a cartoon?

    What if you use AI to create sexual voice clips of a real person but use images that don’t look like them or no image at all?

    There are just so many possibilities and questions that I feel it might be impossible to legislate in a way that isn’t always 10 steps behind or has a million unforeseen consequences.

    • Not_mikey@slrpnk.net

      There’s already laws against using someone’s likeness for commercial purposes without their consent, I’m guessing this will require the same fuzzy cutoff and basically just be up to the jury to decide or the judge to dismiss.

    • WamGams

      Well, let’s find out. Please give me 20 sample photos of you, 30 minutes of audio and 10 of video.

      I’m going to have you get gangbanged by 100 German men and upload it to xvideos.

      Now, that is probably something you deserve to consent to, isn’t it?

  • cygnus

I have a hard time accepting this as a crime. What if the illustration is hand-drawn, or clothed but still sexual in character? Is caricature illegal, by this standard?

      • MareOfNights@discuss.tchncs.de

        Yea, this is a funny thing to think about.

        You can jerk off to photos of people, you can imagine some wild things involving other people etc.

        If you just create some deepfake porn to masturbate by yourself to, I don’t see a big problem with that.

        The only issue I can find is, that due to neglect someone else sees it, or even hears about it.

        The problem starts with sharing. Like it would be sexual harassment to tell people who you are masturbating to, especially sharing with the “actors” of your fantasies.

        There is however another way this could go:

        Everyone can now share nudes with way less risk.

        If anyone leaks them, just go: “That’s a deepfake. This is obviously a targeted attack by the heathens of the internet, who are jealous of my pure and upstanding nature. For me only married missionary with lights out.”

            • WamGams

So if I use AI to make pornography of 50 men gangbanging you, you will consider that to be on the same level as going to a carnival and getting a caricature done?

      • capem@startrek.website

        Ooohh, can’t wait to see us waste billions of dollars deliberating what is acceptable just like with copyright law.

        This is another law that only exists to protect rich people. Poor people can’t afford a lawyer and don’t have time to show up in court.

        • andrewta@lemmy.world

          You seriously can’t see why deep fakes are a serious problem to everyone?

This law won’t protect just the rich.

          Imagine the chaos as some idiot teen creates a deep fake of some other teen in a compromising position.

          Go talk to an attorney and see what they have to say about it.

    • Deestan@lemmy.world

      Is caricature illegal, by this standard?

      No.

      The official government announcement is linked in the article btw.

    • BolexForSoup@kbin.social

I understand this won’t be a popular statement, but to me, it falls under “I know it when I see it.”

      I don’t know the exact location of the line, but there is no artistic, scientific, or any other kind of merit to someone making deepfake nudes of a 14 year old and circulating them around school. The victim comes first in these cases. I don’t want to debate what is or isn’t child porn. I think we all agree this girl was a victim and this should never have happened.

      To get away from the minors-argument: it’s just like how I can’t go around shooting photos of random people when they’re naked and then circulate them. Hell you can barely do that even if they aren’t naked except in particular circumstances where consent to be photographed is taken as a given.

      Non-consensual deepfakes should, by and large, not be allowed.

      • cygnus

        I can’t go around shooting photos of random people when they’re naked and then circulate them.

        That’s wildly different. It’s like saying that writing about murder and actually committing it are the same thing.

        • BolexForSoup@kbin.social

          Deepfakes are already so good in many cases that the differences are basically trivial. The gulf between writing and committing murder is far wider. Not a great parallel IMO.

          How would you feel if someone spread nude deepfakes of you? Your partner? Your child? Are you telling me that’s just like writing about doing something and not closer to committing the act and you’d just shrug and move on with your life?

          • cygnus

            Deepfakes are already so good in many cases that the differences are basically trivial.

            Then if anything it gives deniability to real nudes. “It wasn’t me, it’s fake!”

            How would you feel if someone spread nude deepfakes of you? Your partner? Your child? Are you telling me that’s just like writing about doing something and not closer to committing the act and you’d just shrug and move on with your life?

It depends. There’s already a legal framework for defamation, so if the deepfake is made public and has a negative impact on me I can use that avenue. Simply making the deepfake, though, is akin to drawing me naked (not that anyone would want to do that). It’s deeply weird but should not be illegal IMO.

            • BolexForSoup@kbin.social

              It’s not about deniability. I don’t want incredibly photorealistic nudes of me or my family spreading around with little to no consequences. I certainly don’t want to get into prolonged court battles over it. Why does somebody’s unfettered use of AI trump my dignity as a person? We have restrictions on photography and video baked into our legal framework. Why should this be any different?

              I can’t imagine you would be so flippant if this was happening to you.

              • cygnus

                We have restrictions on photography and video baked into our legal framework. Why should this be any different?

                Because it isn’t real. Why should someone be charged for creating a work of fiction? Do you not see how dangerous that precedent is?

                • BolexForSoup@kbin.social

How is a paper facsimile generated with glass and light any more or less real than a near-duplication in a digital format? You are splitting hairs here. If the average person essentially can’t distinguish between a deepfake and a “real” photo, or agrees it is sufficiently similar for their purposes, then it’s moot. Your argument hinges on whether or not something is “real”, and that is not a premise most people agree with, nor is it a scientific or otherwise objective, measurable benchmark. You can’t just vacillate between science-y sounding responses and opinions like that.

                  There are deep fakes that look more “real” than some old photos. Where does that factor into this?

                  I’m being dead serious here when I ask: what constitutes “real”? Because that seems to be doing a lot of heavy lifting in your responses. And I don’t really see that word tossed around much in legal frameworks that’s for sure, certainly not as you seem to be using it. I’ve been in the visual/audio media industry for 15 years and I can tell you that your lines in the sand are yours and yours alone. The thousands of releases I’ve been responsible for over the course of my career make that pretty obvious.

  • gedaliyah@lemmy.world

This is why we should be making laws around likeness rights. If you damage somebody by publicly using their name to spread falsehoods, that’s defamation or libel. But if you produce an image or video of their likeness instead of using their name, there’s no legal recourse. Makes no sense in this day and age.

    • cygnus

      Who decides how similar somebody is “allowed” to look to another? There are people who bear an uncanny resemblance to others. And what of identical twins? Can one sue the other if they do porn?

• Flying Squid@lemmy.world

    “Without consent.” I’m very curious who would consent to having deepfake porn made of themselves.

    • BraveSirZaphod@kbin.social

      I can imagine a non-zero amount of people would consent to a deep-fake porn video of themselves having sex with some generic hot woman, just as one example.

    • Not_mikey@slrpnk.net

      Could be very lucrative if you are already in porn and want to make some money from your likeness. This guy’s gonna pay me $500 to make a video and I don’t even have to do anything?

      Could also be very good for porn stars who have “aged out” but can still make videos using their younger bodies as weird as that may be.

    • Grimy@lemmy.world

      A user shared a story a while back about his wife and her sister giving photos and agreeing to it. Lots of kinky people out there.

    • FaceDeer@fedia.io

      A naked picture of me simply existing is not equivalent to sexual assault. If you want to make it illegal then treat it as its own thing.

    • capem@startrek.website

      The only way to deal with it is to let so much of it flood the digital world that nobody cares anymore because there’s a deepfake porno of everyone.

      This is a waste of money to ensure rich people don’t get porn made of them by poor people.

      Poor people won’t be able to afford lawyers and aren’t able to take time off to show up in court.

  • tal@lemmy.today

    “Deepfake pornography is a growing cause of gender-based harassment online and is increasingly used to target, silence and intimidate women — both on and offline,” Meta Oversight Board Co-Chair Helle Thorning-Schmidt said in a statement.

    considers

    I think that there’s an argument for taking the opposite position. If someone could make deepfake porn trivially and it were just all over the place, nobody would care about it; one knows that it’s fake.

In fact, it’d kind of make leaked actual pornography no-impact as a side effect, unless there were a way to distinguish deepfakes from real video. And that’s a harder issue to resolve. I was reading a discussion yesterday about sextortion on here, talking about how technically difficult it would be to keep someone from recording sex video chats, and how there’d always be an analog hole at least. But there is another route to solve that, which is simply to make such a video valueless because there’s a flood of generated video.

    • intrepid

Deepfake recognition is already available. And, while what you predict sounds logical, these criminals prey on emotions. I feel that a lot of innocent people will be victimized even if deepfake porn becomes common.