In the seven years since I first wrote about deepfakes, before there was even a word for the AI-powered face-swapping technology, people have finally started to realize that the primary use of the technology has never been spreading disinformation or endangering democracy. It has always been sexually explicit deepfakes meant to harass, blackmail, threaten, or simply disregard women's consent. Now, when deepfakes capture national attention, it's typically because a big-name celebrity has been the target (most recently, Taylor Swift). And even when it's a lesser-known person whose face is transposed onto a nude or sexualized body, the narrative centers on that person as the sole victim.

But there are at least two people in every deepfake: the one being impersonated, whose face is being used, and the one whose face has been erased entirely, plastered over by an algorithm, leaving their body exposed. The latter is almost always a porn worker, someone who makes their living with that body and carefully chooses who to share it with in their work.

  • dumples@midwest.social · 19 days ago

    We are back in another wave of anti-porn and sex worker legislation. It would be great to protect both people as part of deepfake laws. I doubt it will happen, but since it's such a new technology it might be possible to get something good in place before old people get some crazy ideas about it.

  • interdimensionalmeme@lemmy.ml · 19 days ago

    Deepfake laws add nothing; we already have libel laws. The real problem is people who would demean you for having participated in pornography. Those people are who this law is for; it exists solely to justify their hate. Those people deserve to meet the pointy end of the stick.