A South Korean man has been sentenced to jail for using artificial intelligence to generate exploitative images of children, the first case of its kind in the country as courts around the world encounter the use of new technologies in creating abusive sexual content.
If the man did not distribute the pictures, how did the government find out? Did a cloud service rat him out? Or spyware?
My guess would be he wasn’t self-hosting the model, so the requests were going through a website.
The service should have NSFW detection and ban users instantly if it’s detected.
ChatGPT can be tricked into giving IED instructions if you ask the right way. So it could be a similar situation.
Why should it have that? Stable Diffusion websites know that most of their users are interested in NSFW content. I think the idea is to turn GPUs into cash flow, not to make sure that it is all wholesome.
I suppose they could get some kind of sex+children detector going for all generated images, but you’re going to have to train that model on something, so now it’s a chicken-and-egg problem.
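(For what it’s worth, the off-the-shelf diffusers Stable Diffusion pipeline already bundles a CLIP-based safety checker that flags generated images, so a hosting site wouldn’t have to train a detector entirely from scratch; it’s a general NSFW filter, though, not the specific detector described above. A minimal sketch, assuming the site generates through diffusers and uses the stock checker; the model id and the refuse-and-flag handling are illustrative:)

    # Minimal sketch: run generation through the safety checker that ships
    # with the diffusers Stable Diffusion pipeline and refuse flagged outputs.
    # Assumptions: the service uses diffusers; the model id is illustrative.
    import torch
    from diffusers import StableDiffusionPipeline

    pipe = StableDiffusionPipeline.from_pretrained(
        "runwayml/stable-diffusion-v1-5",  # illustrative model id
        torch_dtype=torch.float16,
    )  # by default the pipeline also loads a CLIP-based safety_checker
    pipe = pipe.to("cuda")

    result = pipe("user prompt goes here")
    # nsfw_content_detected is a list of booleans, one per generated image;
    # flagged images are blacked out by the checker before being returned
    if result.nsfw_content_detected and any(result.nsfw_content_detected):
        pass  # a hosted service could drop the output and flag the account here
    else:
        result.images[0].save("output.png")

(The catch, as noted above, is that a commercial host has little incentive to keep that checker enabled.)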
He was found extorting little girls with nude pics he generated of them.
Why the fuck isn’t that the headline? Jesus, that’s really awful and changes everything.
Because that was another case. Extortion and blackmail (which in that case would also count as production of CP, just as it would if you drew from a real child) are already illegal. In this case we simply don’t have enough information.