A new report warns that the proliferation of child sexual abuse images on the internet could become much worse if something is not done to put controls on artificial intelligence tools that generate deepfake photos.
Photorealistic images of CP? I think that crosses the line and needs to be treated as if it were actual CP, as it essentially enables real CP to proliferate.
While I absolutely don’t want to sound like I’m defending the practice (because I’m not), I’m really not sure about this. If this were true, would similar logic apply to other AI-generated depictions of illegal or morally reprehensible situations? Do photorealistic depictions of murder make it more likely that the people going out of their way to generate or find those pictures will murder someone, or seek out pictures of real murder? Will depictions of rape lead to actual rape? If the answer to those and similar questions is “no”, then why is child porn different? If “yes”, then should we declare all the others illegal as well?
It’s not that I think AI-generated child porn should be accepted, let alone encouraged, but as was pointed out, it might even be counterproductive to ruin someone’s life over AI-generated material in which there is factually no victim, as reprehensible as the material may be. The fact that something is disgusting to most of us isn’t, by itself, a good justification for making it illegal if there is no victim.
The reason I’m not convinced by the argument is that a similar one has been used when arguing for censorship of video games, e.g. the claim that playing relatively realistic “murder simulators” makes people (usually children) more likely to commit violent acts; according to research, that isn’t the case.
I’d even be inclined to argue that being able to generate AI images of sexualized minors might make it less likely for a person to move on to, e.g., searching for actual child porn or committing abuse, as it’s a relatively easier and safer way for them to satisfy an urge. I wouldn’t be willing to bet on that, though.