Detroit woman sues city after being falsely arrested while pregnant due to facial recognition technology::A Detroit woman is suing the city and a police detective after she was falsely arrested because of facial recognition technology while she was eight months pregnant, according to court documents.

  • SatanicNotMessianic@lemmy.ml · +121 · 1 year ago

    According to a recent review, 100% of the people falsely arrested via facial recognition matches have been Black.

    The technology needs to be legally banned from law enforcement applications, because law enforcement is not making a good faith effort to use the technology.

    • rockSlayer@lemmy.world · +44 · 1 year ago

      We should ban patrol automation software too. They utilize historical arrest data to help automatically create patrol routes. Guess which neighborhoods have a history of disproportionate policing.
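The feedback loop described above can be sketched with a toy simulation. Everything here is hypothetical — the neighborhoods, the numbers, and the allocation rule are invented for illustration, not taken from any real vendor's software:

```python
# Toy simulation of a feedback loop in patrol routing trained on historical
# arrest data. Two neighborhoods have the SAME underlying offense rate, but
# one starts with more recorded arrests due to past disproportionate policing.
import random

random.seed(0)

TRUE_OFFENSE_RATE = 0.05   # chance a single patrol-hour records an arrest
TOTAL_PATROL_HOURS = 1000  # hours to allocate each year

# Hypothetical starting point: A was historically over-policed.
arrests = {"A": 100, "B": 20}

for year in range(10):
    total = arrests["A"] + arrests["B"]
    for hood in ("A", "B"):
        # "Data-driven" allocation: patrol hours proportional to past arrests.
        hours = round(TOTAL_PATROL_HOURS * arrests[hood] / total)
        # New arrests scale with patrol presence, not with any real
        # difference in crime between the neighborhoods.
        new = sum(random.random() < TRUE_OFFENSE_RATE for _ in range(hours))
        arrests[hood] += new

print(arrests)
```

Because arrests drive patrols and patrols drive arrests, the initial disparity is preserved and the absolute gap keeps growing, even though the two neighborhoods offend at identical rates.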

      • SatanicNotMessianic@lemmy.ml · +17 · 1 year ago

        The problems with the approaches that tend to get used should cause absolute outrage. They're methods that would get anyone laughed off any college campus.

        The problem is that they lend a semblance of scientific justification to confirm the biases of both police departments and many voters. Politicians look to statisticians and scientists to tell them why they’re right, not why they’re wrong.

        That’s why it’s so important for these kinds of issues to make the front pages.

        • brygphilomena@lemmy.world · +3 · 1 year ago

          It’s great how statistics can be used to support basically anything the author wants them to. Identifying initial biases in the data is just as important as verifying the statistics independently.

      • gelberhut@lemdro.id · +3 / −19 · 1 year ago

        I do not see a bias here. The system did not assume that a criminal is Black by default or anything like that; it simply works much worse for Black people.

        There could be different reasons for that. For example, it may just be bad at recognizing Black faces in poor lighting conditions.

        • phillaholic@lemm.ee · +25 / −1 · 1 year ago

          This is systemic bias; in this case, systemic racism.

          The outcome is that a product or service disproportionately targets Black people. It wasn’t designed to do that, so it’s not overt racism; it just worked out that way.

          Camera systems inherently have a harder time with dark skin. That’s a fact. However, it’s been found time and time again that these systems are predominantly created by and tested on light-skinned individuals. So the bias is built into the flawed creation. You can see this in Hollywood, where lighting has only recently been set up to highlight dark skin, in shows with majority-Black casts and showrunners like Atlanta and Insecure.

          • gelberhut@lemdro.id · +3 / −11 · 1 year ago

            Could you please point out where it disproportionately targets Black people? Does it recognize Black people instead of white people? That would be racism.

            If it just matches completely wrong faces for Black people, that is shoddy quality which, by the way, works in favor of Black criminals.

              • gelberhut@lemdro.id · +1 / −8 · 1 year ago

                Yes, I know. And I agree that when this happens under conditions where a human can do the job well, it is a bias. However, when it happens under conditions where a human cannot do the job either, it could just be physics, like dark skin being harder to see in darkness.

                And in the case being discussed, it is not clear what the reason was.

                It is a matter of interpretation as well: one could say “the system helps Black criminals avoid being arrested.”

                For me, this false-recognition statistic is an alarming signal that the system works badly and needs deeper analysis, and in the meantime, police must be more careful in dealing with its results.

                • phillaholic@lemm.ee · +4 · 1 year ago

                  The outcome of the bad technology and policing is disproportionately affecting dark-skinned people. That’s where it becomes systemic racism. No one decided to design a system to arrest more Black people; the outcome of various factors just ended up that way. Sometimes it’s a consequence of nature, but most of the time there are clear reasons, like a lack of representation in design and testing, that would have caught the problems earlier.
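The outcome argument can be made concrete with back-of-the-envelope arithmetic: unequal false-match rates translate directly into unequal false-match counts, even when the groups are otherwise identical. All figures below are invented for illustration:

```python
# Sketch: how unequal false-match rates produce unequal false-match counts.
# The group sizes and rates are hypothetical, chosen only for the example.

population = {"group_1": 100_000, "group_2": 100_000}

# Demographic testing of face recognition systems has reported higher
# false-match rates for some groups; these exact rates are made up here.
false_match_rate = {"group_1": 0.0001, "group_2": 0.001}

expected_false_matches = {
    group: population[group] * false_match_rate[group]
    for group in population
}
print(expected_false_matches)
```

With identical populations and no difference in behavior, the group with the 10x higher error rate bears 10x the wrongful matches — which is what makes the disparity a property of the system's outcomes rather than anyone's stated intent.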

                  • gelberhut@lemdro.id · +1 / −1 · edited · 1 year ago

                    Where did you read about “arrest more Black people”? They say it points to the wrong people when a criminal has black skin. You could also describe that as “helping Black criminals hide themselves.”

                    I’m absolutely with you in being against racism and other discrimination, but in this particular case racism and bias are not that relevant. Overusing terms like “racism” makes the term weaker, and people start to consider it a minor thing, like associating “racism” with non-ideal camera settings, which is dangerous.

    • fabian_drinks_milk@lemmy.fmhy.net · +5 · 1 year ago

      A similar thing happened here in the Netherlands. Algorithms were used to detect fraud, but they had a discriminatory bias and falsely accused thousands of parents of child-benefits fraud. Those parents ran into huge financial problems because they had to pay back the allowances; many even had their children taken away and to this day haven’t gotten them back.

      The Third Rutte Cabinet did resign over this scandal, but many of those politicians came back in other positions, including Prime Minister Rutte, because that’s somehow allowed.

      Wikipedia (English): https://en.m.wikipedia.org/wiki/Dutch_childcare_benefits_scandal