A Florida man is facing 20 counts of obscenity for allegedly creating and distributing AI-generated child pornography, highlighting the danger and ubiquity of generative AI being used for nefarious purposes.
Phillip Michael McCorkle was arrested last week while working at a movie theater in Vero Beach, Florida, according to TV station CBS 12 News. A crew from the station captured the arrest, which made for dramatic footage as law enforcement led the uniformed McCorkle out of the theater in handcuffs.
So you’re admitting they are correct?
No, I’m admitting they’re stupid for even bringing it up.
Unless their argument is that all AI should be illegal, in which case they’re stupid in a different way.
Do you think regular child porn should be illegal? If so, why?
Generally it’s because kids were harmed in the making of those images. Since we know that AI is using images of children being harmed to make these images, as the other poster has repeatedly sourced (and if you’ve looked into deepfakes, most are made from an existing porn video with the face swapped over top; they do this with CP as well, and must use CP videos to seed it, because the adult model would be too large)… why does AI get a pass for using children’s bodies in this way? Why isn’t it immoral when AI is used as a middleman to abuse kids?
As I keep saying, if this is your reasoning then all AI should be illegal. It only has CP in its training set incidentally, because the entire dataset of images on the internet contains some CP. It’s not being specifically trained on CP images.
You failed to answer my questions in my previous comment.
Ok, if you insist…yes, CP should be illegal, since a child was harmed in its making. It can get a bit nuanced (for example, I don’t like that it can be illegal for underage people to take pictures of their own bodies) but that’s the gist of it.
Those weren’t all of the questions I asked.
What other questions? Sorry, I missed them.