• Schadrach@lemmy.sdf.org · 3 days ago

    OK, so this is just the general anti-AI image generation argument where you believe any image generated is in some meaningful way a copy of every image analyzed to produce the statistical model that eventually generated it?

    I’m surprised you’re going the CSAM route with this and not just arguing that any AI-generated sexually explicit image of a woman is nonconsensual porn of literally every woman who has ever posted a photo on social media.

    • LustyArgonian@lemmy.world · 3 days ago

      No, I am not saying that.

      I’m saying if an AI image is sexual and seeded from irl child models, then it is CSAM.

      Adult women who consensually post their images to social media are WAY different from children, who can’t even consent to enter a contract.

      Also, I did argue that already when I mentioned how many AI porn models are directly seeded from Scarlett Johansson’s face. Can you read?

      • Schadrach@lemmy.sdf.org · 10 hours ago

        To be clear, when you say “seeded from,” you mean an image that was analyzed as part of building the image-classifying statistical model, which is then essentially run in reverse to produce images, yes?

        And you are arguing that every image analyzed to calculate that model’s weights is, in some meaningful way, contained in every image it generates?

        I’m trying to nail down exactly what you mean when you say “seeded from.”