A new report warns that the proliferation of child sexual abuse images on the internet could become much worse if something is not done to put controls on artificial intelligence tools that generate deepfake photos.
I don’t think it’s the same concern. It’s not that people will become pedophiles or act on those urges more because of normalization and exposure; it’s that people will see less of a problem with the sexualization of children. The parallel is the amount of depicted violence we’re OK with. The difference is that the sexual side is something we can only emulate on a personal level.
Maybe there’s also the argument that violence is escapist, whereas sexual desire is ever present and porn is addictive.