• xor@lemmy.blahaj.zone
    29 days ago

    And you can keep hand-waving away the fact that lower precision due to less light is not the primary cause of racial bias in facial recognition systems: the primary cause is that the datasets used for training are racially biased.
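
    To make that concrete, here's a toy sketch of one way it plays out (the score distributions and the impostor_scores helper are completely made up, not from any real system): if the decision threshold is calibrated on the demographic group that dominates the training and evaluation data, a group the model represents less well ends up with a much higher false match rate at the exact same threshold.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def impostor_scores(n, mean, std):
        """Toy similarity scores for non-matching face pairs (made-up numbers)."""
        return rng.normal(mean, std, n)

    # Assumption for the sketch: the model separates non-matching faces slightly
    # less well for group B, e.g. because it saw far fewer group-B faces in training.
    group_a = impostor_scores(100_000, mean=0.30, std=0.10)
    group_b = impostor_scores(100_000, mean=0.38, std=0.12)

    # Threshold tuned for a 0.1% false match rate, but only on the group that
    # dominates the training / evaluation data.
    threshold = np.quantile(group_a, 0.999)

    print(f"false match rate, group A: {(group_a > threshold).mean():.3%}")
    print(f"false match rate, group B: {(group_b > threshold).mean():.3%}")
    ```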

    • conciselyverbose@sh.itjust.works
      29 days ago

      Yes, it is. The idea that giant corporations “aren’t trying” is laughable, and it’s a literal guarantee that massively lower-quality, noisier inputs will result in a lower-quality model with lower-quality outputs.

      Fewer photons hitting the sensors matters. A lot.
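
      Here's a toy numpy sketch of just the shot-noise part (the photon counts are made up for illustration): photon arrival is roughly Poisson, so a pixel's signal-to-noise ratio scales with the square root of the photon count. Cut the light by 100x and the SNR drops by about 10x before the model ever sees the image.

      ```python
      import numpy as np

      rng = np.random.default_rng(0)

      # Photon arrival is (approximately) Poisson, so a pixel that collects
      # N photons on average has noise std sqrt(N), i.e. SNR ~ sqrt(N).
      for mean_photons in (10_000, 1_000, 100, 10):
          samples = rng.poisson(mean_photons, size=100_000)
          snr = samples.mean() / samples.std()
          print(f"{mean_photons:>6} photons -> SNR ~ {snr:6.1f} (sqrt(N) = {mean_photons ** 0.5:6.1f})")
      ```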