
Last month, a detective in a small town outside of Lancaster, Pennsylvania, invited dozens of high school girls and their parents to the police station to undertake a difficult task: one by one, the girls were asked to confirm that they were depicted in hundreds of AI-generated deepfake pornographic images seized by law enforcement.

In a series of back-to-back private meetings, Detective Laurel Bair of the Susquehanna Regional Police Department slid each image out from under the folder’s cover, so only the girl’s face was shown, unless the families specifically requested to see the entire uncensored image.
[…]
The photos were part of a cache of images allegedly taken from 60 girls’ public social media accounts by two teenage boys, who then created 347 AI-generated deepfake pornographic images and videos, according to the Lancaster County District Attorney’s Office. The two boys have now been criminally charged with 59 counts of “sexual abuse of children” and 59 counts of “possession of child pornography,” among other charges, including “possession of obscene materials depicting a minor.”

Forty-eight of the 60 victims were their classmates at Lancaster Country Day School, a small private school approximately 80 miles west of Philadelphia. The school is so small that nearly half of the high school’s female students were victimized in the images and videos. The number of underage victims makes this the largest-known instance of deepfake pornography made of minors in the United States.
[…]
Experts say that it is rare for criminal charges to be brought in deepfake pornography cases where both the victims and the perpetrators are minors.

“My guess [as to why there aren’t more such prosecutions] is just that there may be a general recognition that arresting children isn’t going to resolve this,” Riana Pfefferkorn, a policy fellow at the Stanford Institute for Human-Centered Artificial Intelligence who has long studied the intersection of child sexual abuse material and AI, told Forbes.

  • Rhaedas@fedia.io · 4 hours ago

    there may be a general recognition that arresting children isn’t going to resolve this

    Of course not. The genie is out of the lamp though and getting cheaper and easier to create more realistic things, so what’s going to change? This could probably qualify as a problem concerning AI safety, something that has been sidelined because profits are more important.

  • BudgetBandit@sh.itjust.works · 11 hours ago

    Wait… all the AI porn I found came with a cost; I did not find a single free AI porn creation site. So you’re telling me that high schoolers paid money to do that?

      • Rhaedas@fedia.io · 4 hours ago

        It’s not like it takes much to do it. If they or their parents bought a computer that can run a recent game, it can generate pictures. The software is abundant and free.

        • curbstickle@lemmy.dbzer0.com · 3 hours ago

          Agreed, I’m just saying it’s likely they’ve got at least decent hardware for a local Stable Diffusion or similar. I doubt they are using 10-year-old machines at home that might struggle.

    • PrettyFlyForAFatGuy@feddit.uk · 10 hours ago

      You can set up a local instance of Stable Diffusion for free if you have the hardware to run it; a lot of training data is also free.

      Don’t do this with it, though.

      • BudgetBandit@sh.itjust.works · 8 hours ago

        Nah, my PC has a GTX 970, and my laptop is a MacBook Air M1 that got up to 117°C and shut down while generating an AI image.