Politicians fume after Amazon's face-recog AI fingers dozens of them as suspected crooks

Amazon’s online facial recognition system incorrectly matched pictures of US Congress members to mugshots of suspected criminals in a study by the American Civil Liberties Union. As a result, the ACLU, a nonprofit headquartered in New York, has called for Congress to ban cops and Feds from using any sort of computer-powered …

  1. Anonymous Coward
    Anonymous Coward

    Well, that's a good way of getting politicians to actually take notice, certainly...

    1. This post has been deleted by its author

    2. Pascal Monett Silver badge

      Absolutely. Nothing like hitting their reputation to get things sorted out.

      Our reputations? They can lie in waste, no problem (for them). But theirs? No way we can let computers call them liars - people might start believing it, and we can't have that, now can we?

    3. bombastic bob Silver badge
      Trollface

      being politicians, I thought that meant 'crook' by definition?

  2. Will Godfrey Silver badge
    Angel

    Poetic Justice?

    I've heard there are some as would brand all politicians as criminals.

    I couldn't possibly endorse such a scurrilous claim.

    1. Fazal Majid

      Re: Poetic Justice?

      Of course, it's the crooks that should be the ones complaining.

      "It could probably be shown by facts and figures that there is no distinctly native American criminal class except Congress."

      Mark Twain, Pudd'nhead Wilson's New Calendar

      1. wsm

        Re: Poetic Justice?

        They may be a criminal class according to the facts and figures, considering the high rate of those convicted of crimes or resigning just before being arrested.

        We call them unindicted co-conspirators or persons of interest until such time as they are actually serving a sentence, but why quibble over details.

      2. Anonymous Coward
        Anonymous Coward

        Re: Poetic Justice?

        And aren't most of them Lawyers to boot?

        Pity they can't give the POTUS the boot. The prospect of another 4 years of [redacted] makes me real sad for the USA. They'll have to start another World War to recover /sic

        1. Michael Wojcik Silver badge

          Re: Poetic Justice?

          Pity they can't give the POTUS the boot

          It really, really isn't, as long as Mike Pence is next in line. You think things are bad now? See what happens if Pence gets the top job.

          Fact is, we'd have to dig pretty deep into the line of succession to improve the situation measurably.

    2. Anonymous Coward
      Anonymous Coward

      Re: Poetic Justice?

      The truth will out. Maybe the program is looking to the future.

  3. Jim Mitchell
    Alert

    I'm confused: this article appears to totally lack the requisite amount of snark for an article that has both legislators AND criminals. This can't be a real El Reg article. I'd like to see the facial recognition results for the author, please.

    1. diodesign (Written by Reg staff) Silver badge

      Ehhhh, we were gonna leave it to the comments to make the obvious gag link.

      C.

      1. CrazyOldCatMan Silver badge

        make the obvious gag link

        I didn't know that they were Tories..

  4. LenG

    And this is a surprise how?

    "Amazon’s online facial recognition system matched pictures of US Congress members to mugshots of suspected criminals in a study by the American Civil Liberties Union."

    (my correction of original text)

  5. Voland's right hand Silver badge

    Applause

    Can we have the same test applied to Cressida's toy in London as well?

    1. cantankerous swineherd

      Re: Applause

      can they put the guns away first though?

      1. Doctor Syntax Silver badge

        Re: Applause

        "can they put the guns away first though?"

        In this particular instance, no. That should concentrate their minds.

  6. Oengus

    Predictive

    Maybe the AI isn't matching the pictures but predicting the future of the politicians: shades of "Minority Report".

    1. Anonymous Coward
      Anonymous Coward

      Re: Predictive

      The other thing that happened in Minority Report was personalised targeted adverts (albeit based on a retina scan). That's the true goal; the side 'benefits' are flagging people.

      1. Anonymous Coward
        Anonymous Coward

        Re: Predictive

        The other thing that happened in Minority Report was personalised targeted adverts (albeit based on a retina scan). That's the true goal; the side 'benefits' are flagging people.

        I'd better get myself some new ones. Anybody up for a swap? Sadly you'll be getting a pair of well used, pr0n scarred, exceptionally myopic eyeballs*.

        Maybe there's some truth in what they say about it being bad for your eyes.

    2. Rocketist
      Childcatcher

      Re: Predictive

      Maybe the average member of Congress has a similar physiognomy to a certain class of criminals?

      I seem to remember there was a study about a year ago where certain behavioral patterns could be predicted from an analysis of the person's features; something that was suggested in the 19th century but has been vehemently (and rightly) criticized by most serious scientists ever since.

  7. Mark 85
    Facepalm

    Nope, not ready for prime time, and yet they're trying to sell this junk. Given some of the police actions of late, I'm not sure how many false positives will die, but it could be enough to raise a public outcry, and that's too late for any innocent who's dead or injured.

    Put it back in the shed, Amazon, and let the folks there tinker under the hood some more. Profit can wait until you get it right. And by "right" I mean 100%.

    Disclaimer: It should be banished, buried, and burned. Facial recognition can't possibly come to a good end.

    1. cbars Bronze badge

      *automated facial recognition

      I find my facial recognition ability quite useful, and as far as I know it hasn't caused any harm - yet

      1. Dr. Mouse

        "I find my facial recognition ability quite useful, and as far as I know it hasn't caused any harm - yet"

        Depends on how many of those facials ended up in someone's eye, I've heard that can be quite painful...

      2. Anonymous Coward
        Anonymous Coward

        Let me guess... "yep, that's a face alright".

    2. Michael Wojcik Silver badge

      Given some of the police actions of late, I'm not sure how many false positives will die but it could be enough to raise a public outcry and that's too late for any innocent who's dead or injured.

      Yes. Combine half-assed automation that has abysmal accuracy with police militarization and you have a recipe for a sharp increase in trigger-happy assholes killing innocent civilians for Texting While Black and similar offenses.

      Police departments need to get their house in order before adding any more automation, and vendors like Amazon need to make their products much, much better before peddling them to the police.

      I'd like to hope Amazon catches some flak from investors (the only thing they care about) over this, but I'm not holding my breath.

      1. Anonymous Coward
        Anonymous Coward

        Yes. Combine half-assed automation that has abysmal accuracy with police militarization and you have a recipe for a sharp increase in trigger-happy assholes killing innocent civilians for Texting While Black and similar offenses.

        IIRC the majority of people killed by US police officers are white. But in general I agree with the point you're making.

        1. 's water music
          Big Brother

          No idea of the reliability of the source or of the data available, but it appears that non-whites are statistically over-represented, demographically speaking.

          Watching but not wearing the right spectacles?---->

  8. Anonymous Coward
    Anonymous Coward

    Congress

    Where did the ACLU find "28 innocent members of Congress"?

    1. kain preacher

      Re: Congress

      OK, mildly bemused: non-convicted / not in the system yet.

    2. Anonymous Coward
      Anonymous Coward

      Re: Congress

      "not found guilty of anything yet" rather than innocent.

      This AI face-recog clearly has a pre-crime system.

    3. bombastic bob Silver badge
      Trollface

      Re: Congress

      "Where did the ACLU find '28 innocent members of Congress'?"

      in the same place you find Unicorns and Leprechauns and other mythological creatures

  9. Anonymous Coward
    Anonymous Coward

    Accessorize to a Crime: Real and Stealthy Attacks on State-of-the-Art Face Recognition

    A couple of years old, but interesting research on misdirecting facial recognition software: https://www.cs.cmu.edu/~sbhagava/papers/face-rec-ccs16.pdf

    I would think that it works, at least in part, because the software designers did not anticipate any attempt to defeat recognition other than by wearing a disguise.

  10. Anonymous Coward
    Devil

    Only 28?

    Surely that's the number of innocent members of congress. Maybe there's a signed/unsigned error in Amazon's recognition system?

    1. Joe Werner Silver badge

      Re: Only 28?

      Yes, I assume the other 500 were correct matches... ;p

      1. Fungus Bob

        Re: Only 28?

        They were.

  11. veti Silver badge

    New training rule needed

    Looking at the photo accompanying the article, it seems to me: you can fairly safely assume that anyone who poses in front of a flag is up to no good.

    1. Eddy Ito

      Re: New training rule needed

      It also seems to pick out facial hair nicely. Only about 6% of congress has facial hair yet there are clearly 8 in this group who certainly have facial hair and it looks like several others could easily have some stubble that may fool the digital filters.

      1. allthecoolshortnamesweretaken

        Re: New training rule needed

        Or the face-recog "AI" has parameters like "the bad guy always has a beard and/or wears a black hat".

        (Hey, it worked in all the classic John Wayne movies, so proof of concept, right?)

    2. JohnFen

      Re: New training rule needed

      And actually hugging or wrapping yourself in a flag is proof positive.

      1. Anonymous Coward
        Anonymous Coward

        Re: New training rule needed

        And actually hugging or wrapping yourself in a flag is proof positive.

        A good friend of mine got married a couple of years back. As Hindus, he and the bride were wrapped in a large and beautifully embroidered swastika cloth as part of the ceremony. I hate to think what "AI" would make of all the pictures on social media.

  12. Anonymous Coward
    Anonymous Coward

    "It’s not totally clear why Amazon’s face recognition technology is so inaccurate."

    Pretty simple, really. All facial recognition is horribly inaccurate. Studies done on humans show that even we're only good at recognizing faces of people we already know; we've all had instances where we've mistaken a stranger for someone we know. Machine learning algorithms have already been able to outdo humans, but that comes with huge caveats from the training sets. The ML algorithms tend to end up saying "All Asians look alike" or "All blacks look like apes" because they didn't have enough relevant training data (read: they are usually trained and tested with a heavy Caucasian bias, except in China).

    ML algorithms require insane amounts of examples to generalize from. But facial recognition is a memorization problem instead, so the goal is to use a huge amount of examples to determine all of the ways that faces can differ, work out where a specific face falls in that feature space, then match "reasonably" similar faces. That's complicated by translation, rotation, lighting, and physical modifications to the face.

    With 7.5 billion people to distinguish between, and low-resolution cameras, you're going to have a nasty trade-off between false positives and false negatives. I'd expect they would either try to split the errors evenly or err on the side of false positives, since those can be checked by a human.
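    The feature-space matching described above can be sketched in a few lines of Python. This is purely illustrative: random vectors stand in for real face embeddings, and the 0.9 cosine-similarity threshold is an arbitrary choice, not anything Amazon documents.

```python
import numpy as np

def cosine_similarity(a, b):
    # Similarity of two face embeddings: 1.0 means identical direction.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def match(probe, gallery, threshold=0.9):
    # Return indices of gallery faces whose similarity to the probe clears
    # the threshold. Lowering the threshold trades false negatives for
    # false positives, and vice versa -- the trade-off described above.
    return [i for i, g in enumerate(gallery)
            if cosine_similarity(probe, g) >= threshold]

rng = np.random.default_rng(0)
gallery = [rng.normal(size=128) for _ in range(1000)]   # stand-in "mugshots"
probe = gallery[42] + rng.normal(scale=0.1, size=128)   # noisy shot of face #42

print(match(probe, gallery))  # → [42]
```

    With clean, well-separated embeddings the probe matches only its true entry; in the field, poor images push genuine and impostor scores together and the threshold alone decides who gets flagged.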

    1. Yet Another Anonymous coward Silver badge

      And in practice it's worse, since you have a training set of mugshots (three views, square-on, staring ahead, fixed lighting) and are hunting for matches in poor-quality CCTV footage at random angles and lighting.

    2. Robert Helpmann??
      Childcatcher

      Not properly House trained

      The simplest explanation is that since the focus is on catching crims, the training data was mostly or completely composed of mugshots. This is based on the high false-positive rate that matches the incarceration rate in the US. Nothing like building in a self-perpetuating bias.

    3. John Smith 19 Gold badge
      Unhappy

      "Those 28 mismatches therefore represent a five per cent error rate. "

      Compared to the roughly 97% false positive rate of the system the UK Metropolitan Police are trialling, that is actually quite good.

      Still pretty s**t, though: in a population of 30 million adults (like the UK) that would be 1.5 million false positives.

      Frankly you'd be better off issuing every officer a fingerprint reader.

      But then of course you'd need to actually do some real police work, and someone might start questioning officers who used it with too little evidence to begin with.

    4. Michael Wojcik Silver badge

      low-resolution cameras

      Surely that can be addressed by having some superior officer shout "Enhance!" at the machine periodically.

  13. Anonymous Coward
    Black Helicopters

    And in this day and age...

    Some supremacist group will use the ACLU test results to promote the groups racist agenda. I can see the "Fake News" headline now:

    "Advanced Facial Recognition Proves They Are More Criminal Than Us"

  14. Eddy Ito

    In other news

    TSA's mother TLA, the DHS, has asked Amazon how quickly they could roll out this technology at every airport in the US. The rationale is that this would do preliminary screening of travelers as they arrive at the curb in order to select individuals for extended groping enhanced screening as they pass through security on their way to their flight.

    1. John Brown (no body) Silver badge

      Re: In other news

      Yet another reason to never visit the US if that goes ahead. (Not that I have plans to go there again in the near to medium future anyway.)

    2. phuzz Silver badge

      Re: In other news

      To be fair, it's no worse than the methods they currently use.

      1. Michael Wojcik Silver badge

        Re: In other news

        To be fair, it's no worse than the methods they currently use.

        This is the TSA. Flipping a coin would be an improvement.

        We're talking about an organization that cleared 73 people on their own terrorist list to work at airports, and granted PreCheck status to a "notorious convicted felon" (not on the friggin' list, which we all know is stupid and useless). (And, personally, I have no problem with Sara Jane Olson having PreCheck, except that it shows just how pointless PreCheck and indeed everything touched by the TSA is.)

        We're talking about an organization that has managed, over nine years, to get their success rate in controlled tests from 0% to 4%.

        (I'm using Underhill as my source here because he provides good citations, and more importantly funny comments to help soften the despair.)

        And there are many, many other criticisms we might level against the TSA. Like the way their employees like to pretend to be Federal officers, even though they aren't. Or their penchant for stealing stuff from passenger luggage. Or their arbitrary invasions of many people's privacy. Or their recruitment of local law enforcement to assist in their bullying. Or how they funnel vast amounts of money to themselves and their accomplices in the fake-security industry.

        And, yes, I'm sure there are plenty of decent, hard-working TSA employees. I've generally had perfectly cordial relations with them (but then I take the precaution of being a wealthy white male US citizen, which I heartily recommend if you're going to be using US airports). But the vast majority of the verified evidence shows the TSA is abysmal. It's the worst part of the terrible idea that is the DHS.

        1. Anonymous Coward
          Anonymous Coward

          Re: In other news

          And, yes, I'm sure there are plenty of decent, hard-working TSA employees. I've generally had perfectly cordial relations with them (but then I take the precaution of being a wealthy white male US citizen, which I heartily recommend if you're going to be using US airports).

          If you're posting under your real name, then those "cordial relations" might need writing with a past tense. Individually the TSA staff won't be making up their list of hated taxpayers, but as a bureaucracy, you can be sure it does. Somewhere, there will be a "social media team" looking out for the TSA, and maybe your name is on their radar. Few forces in this world are as persistent and spiteful as a spurned bureaucracy.

  15. Mephistro
    Unhappy

    The fact that Amazon is actually trying to sell this crap in its current state...

    ... proves without a doubt that Amazon's management are just a bunch of dangerous sociopaths.

    Regarding police high-ups willing to buy this, I'd say they're split 50%-50% between sociopaths and morons.

    1. Cpt Blue Bear

      Re: The fact that Amazon is actually trying to sell this crap in its current state...

      "Regarding police high-ups willing to buy this, I'd say they're split 50%-50% between sociopaths and morons."

      Having met a few senior police I can tell you they are neither. They are the product of the prevailing police culture of the 1990s and 2000s, filtered through the following decades' management training. I find your suggestion less scary.

  16. JeffyPoooh
    Pint

    "...a five per cent error rate."

    This is where Bayesian probability analysis is supposed to be used. In short, most randomly-selected people are not criminals. So false positives can easily dominate the findings unless the error rate is infinitesimal (which it isn't).
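    The base-rate effect is easy to demonstrate with a back-of-the-envelope sketch in Python (the 0.1% prevalence and the 95%/5% rates below are illustrative assumptions, not figures from the article):

```python
def posterior(prevalence, sensitivity, false_positive_rate):
    # Bayes' rule: P(actual criminal | system flagged a match)
    flagged_and_criminal = sensitivity * prevalence
    flagged_overall = (flagged_and_criminal
                       + false_positive_rate * (1 - prevalence))
    return flagged_and_criminal / flagged_overall

# Suppose 0.1% of scanned faces belong to wanted people, and the system
# catches 95% of them while falsely flagging 5% of everyone else.
p = posterior(0.001, 0.95, 0.05)
print(f"{p:.1%}")  # → 1.9%: most flags are false alarms
```

    Even with a seemingly decent 5% false-positive rate, barely one flag in fifty points at an actual suspect, because the innocent vastly outnumber the wanted.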

    A root cause of all these AI and Machine Learning fiascos is that the people involved clearly don't understand the basics. Thus, their Confidence/Competence Ratio dangerously exceeds unity, and that leads to these sorts of fiascos. They need better managers to rein in the foolish expectations.

    El Reg assists society by exposing such failures. Thank you.

    1. JeffyPoooh
      Pint

      Re: "...a five per cent error rate."

      It's occurred to me that legislation and regulations could make use of Bayesian probability analysis. For this example, the vendor is claiming "5%" error rate. An analysis could be performed to adjust this "raw" error rate, considering the relative rarity of actual criminals, to calculate the expected false positive rate when deployed in the field.

      In this case they have 28 false positives and presumably few or no true positives, so the real-world false-discovery rate is nearly 100%. Big difference from the claimed "5%". That's the sort of gap you'd expect for rare characteristics.

      Legislation and regulations could incorporate this sort of non-naïve approach, mandating acceptable performance in Bayesian terms: real-world false positives below an extremely low threshold (e.g. 0.01%).

      These systems are massively immature. Not ready for primetime, except in a police state context where bothering innocents is not really considered to be an issue.

      1. Michael Wojcik Silver badge

        Re: "...a five per cent error rate."

        It's occurred to me that legislation and regulations could make use of Bayesian probability analysis.

        Yes. Yudkowsky's "intuitive" explanation of Bayesian statistics, particularly the extended example of the positive mammogram, is a good illustration of the problem - he cites studies showing the majority of experts (doctors, in this case) will wildly overestimate the probability of a hypothesis because the intuitive interpretation is so far from the actual Bayesian result.

        Outcome-based regulation, like your suggestion of a mandatory low-false-positive (high-precision) rate, would help neutralize some of the marketing spin.

  17. Sphynx
    WTF?

    Politicians fume after Amazon fingers dozens of them as suspected crooks

    He he he, of course they are crooks - not suspected :)

  18. Maelstorm Bronze badge
    Trollface

    Accuracy?

    Hmm... It sounds like the software is not working correctly with a 95% error rate. All 535 members of Congress are crooks...they just haven't been caught yet. Which means that the ACLU has it backwards.

  19. Mr Dogshit

    Amazon should go back to selling books.

  20. Destroy All Monsters Silver badge
    Windows

    This is not Terminator identifying John Connor

    Biases in training data are known to trickle through to machine learning systems. It could be that the Rekognition and the mugshot dataset contained a disproportionate number of men and people of color.

    Normal people, as opposed to the "equitable outcome" (50% black / 50% white, no Asian) bunch of crazies, call that "bias" reality.

    This is all bullshit anyway, larded with 2018 levels of completely irrelevant Twitter "construct-outrage".

    We are talking here about the standard problem of any information retrieval algorithm since forever, served up so that the uncomprehending hoi polloi can get excited over it: Depending on the sensitivity level, you have a trade-off between false positives (wrong matches) and false negatives (wrong non-matches). Here, we are erring on the side of false positives. So tune that level.

    And why is Amazon in the business of providing information retrieval algos to Blue Forces and probably US-linked dictators anyway?

    1. Anonymous Coward
      Anonymous Coward

      Re: This is not Terminator identifying John Connor

      And why is Amazon in the business of providing information retrieval algos to Blue Forces and probably US-linked dictators anyway?

      Errr... money? Amazon's AI is allegedly the force behind the recommendations on its retail website. Clearly all of the investment hasn't paid off. So rather than write it off, why not repackage it and sell it to government?

      A bit like IBM and tWatson.

  21. John G Imrie
    Big Brother

    What happens to the crook when...

    a computer spots you and wrongly thinks you're an arrested crook breaking their conditions of bail

    The crook was keeping to their bail conditions and becoming a productive member of society, and now, because you walked past the store they robbed last year, they are back in jail.

  22. Anonymous Coward
    Anonymous Coward

    They've done it!

    The first true A.I.! The system is apparently smart enough to recognize that there's no such thing as an innocent member of Congress.

  23. caffeine addict

    Find the racism...

    The implied conclusion of the article is that the system can't correctly identify black faces. Which is quite possible considering how bad AI is at spotting black faces anyway.

    But given that the prison system contains a disproportionate number of African Americans, it would be interesting to know if the results were skewed by the pool of criminal photos having a disproportionate number of black faces in it.

    Not saying race wasn't a factor, just wondering where...

  24. Doctor Syntax Silver badge

    Maybe they should repeat this against a set of US tech entrepreneurs including Bezos, of course.

    1. Rocketist
      Childcatcher

      Re: Repeat this against a set of US tech entrepreneurs

      Oh no, the false-positive rate would exceed 100%!

      BTW: Does the AI actually stop when it's reached an identification? We don't know yet, but if it doesn't, it might well match a single innocent face to more than one criminal.

  25. adam payne

    The results from the ACLU's latest probing showed that Rekognition mistook images of 28 innocent members of Congress for mugshots of cuffed people suspected of crimes.

    Define innocent?

  26. corbpm

    RE: Find the racism...

    Surely if the training dataset had a statistically significant set of one type of facial characteristics, like African American, it would have been BETTER at spotting those matches (assuming it works!).

    So the interesting bit would be: do the matches against the criminals have a greater congruity for the African Americans, due to the increased training data, and does the system need to be trained with more non-African-Americans?

    Looking at the matches, it does look like more Caucasians were matched than African Americans anyway.

    I'm not looking forward to the day when they get this 95% right and I'm fined automatically for dropping litter 200 miles away from my actual location.

    1. caffeine addict

      Re: RE: Find the racism...

      Looking at the matches it does look like more caucasians where matched than african americans anyway.

      I don't think that's disputed. The suggestion was that an African American face was more likely to trigger a false positive.

      Surely if the training dataset had a statistically significant set of one type of facial characteristics like African American it would have been BETTER at spotting those matches (assuming it works!).

      Depends on what the training dataset was. The training was probably done using all the faces Amazon could find, sure. But the matching was done against a small set with an unrepresentative racial balance...

    2. John Brown (no body) Silver badge

      Re: RE: Find the racism...

      "I'm not looking forward to the day when they get this 95% right and i'm fined automatically for dropping litter 200 miles away from my actual location."

      That's been happening for years with ANPR and auto-generated traffic/parking fines.

  27. perlcat

    Doesn't compute

    Only 5% of Congress identified as crooks? Clearly facial recognition does not work; the actual number is substantially higher if you simply compare growth in net worth to actual income.

  28. John Smith 19 Gold badge
    Gimp

    "except in a police state..where bothering innocents is not really considered to be an issue."

    Exactly.

    Think of it as the police state version of "Computer says 'yes'."

  29. Muira

    The face comparison algorithm yields results with a 'similarity' which is a measure (0-100%) of how confident the algorithm is that two faces match. The positive results that they obtained are relatively low similarity (sub-90%). An educated individual would not consider these a good match. You can also ask the API to ignore matches that do not exceed a specific threshold, and this is common practice, but the people who did this experiment did not. Hence the misinformed results and the misplaced outrage.
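    For what it's worth, the thresholding behaviour described above can be sketched in a few lines. The match data is made up, and the helper below only mimics what passing a similarity threshold to a face-comparison API effectively does; it is not the API itself.

```python
# Hypothetical match results: (candidate, similarity %) pairs of the sort a
# face-comparison API returns when no threshold is applied.
matches = [("A", 96.2), ("B", 88.7), ("C", 85.1), ("D", 72.4)]

def filter_matches(matches, threshold=90.0):
    # Drop any candidate whose similarity score falls below the threshold --
    # the step the ACLU experiment reportedly skipped.
    return [(name, s) for name, s in matches if s >= threshold]

print(filter_matches(matches))        # → [('A', 96.2)]
print(filter_matches(matches, 80.0))  # → the three matches above 80%
```

    At a 90% threshold the sub-90% "matches" that drove the headline numbers would simply never be reported.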

  30. Anonymous Coward
    Anonymous Coward

    Oh Err,

    Being "incorrectly fingered by software" doesn't bode well for the automated prostate checking machine.
