‘Nudify’ Apps That Use AI to ‘Undress’ Women in Photos Are Soaring in Popularity::It’s part of a worrying trend of non-consensual “deepfake” pornography being developed and distributed because of advances in artificial intelligence.

  • Pyr_Pressure · 1 year ago
    Just because something shouldn’t be doesn’t mean it won’t be. This is reality, and we can’t just wish something to be true. Saying it doesn’t really help anything.

    • lolcatnip@reddthat.com · 1 year ago
      Whoooooosh.

      In societies that have a healthy relationship with the human body, nudity is not considered sexual. I’m not just making up fantasy scenarios.

      • mossy_@lemmy.world · 1 year ago
        So because it’s not a problem in your culture, it’s not a problem?

          • mossy_@lemmy.world · 1 year ago

            You caught me, I’m an evil villain who preys on innocent lemmings for no reason at all.