Well, that's a good way of getting politicians to actually take notice, certainly...
Politicians fume after Amazon's face-recog AI fingers dozens of them as suspected crooks
Amazon’s online facial recognition system incorrectly matched pictures of US Congress members to mugshots of suspected criminals in a study by the American Civil Liberties Union. As a result, the ACLU, a nonprofit headquartered in New York, has called for Congress to ban cops and Feds from using any sort of computer-powered …
COMMENTS
This post has been deleted by its author
Friday 27th July 2018 05:13 GMT wsm
Re: Poetic Justice?
They may well be a criminal class according to the facts and figures, considering how many of them have been convicted of crimes or resigned just before being arrested.
We call them unindicted co-conspirators or persons of interest until such time as they are actually serving a sentence, but why quibble over details?
Friday 27th July 2018 19:13 GMT Michael Wojcik
Re: Poetic Justice?
Pity they can't give the POTUS the boot
It really, really isn't, as long as Mike Pence is next in line. You think things are bad now? See what happens if Pence gets the top job.
Fact is, we'd have to dig pretty deep into the line of succession to improve the situation measurably.
Saturday 28th July 2018 15:15 GMT Anonymous Coward
Re: Predictive
The other thing that happened in Minority Report was personalised targeted adverts (albeit based on a retina scan). That's the true goal; flagging people is a side 'benefit'.
I'd better get myself some new ones. Anybody up for a swap? Sadly you'll be getting a pair of well used, pr0n-scarred, exceptionally myopic eyeballs*.
*Maybe there's some truth in what they say about it being bad for your eyes.
Friday 27th July 2018 14:06 GMT Rocketist
Re: Predictive
Maybe the average member of Congress has a similar physiognomy to a certain class of criminals?
I seem to remember a study from about a year ago claiming that certain behavioral patterns could be predicted from an analysis of a person's facial features; something that was proposed in the 19th century but has been vehemently (and rightly) criticized by most serious scientists ever since.
Thursday 26th July 2018 21:50 GMT Mark 85
Nope, not ready for prime time, and yet they're trying to sell this junk. Given some of the police actions of late, I'm not sure how many false positives will die, but it could be enough to raise a public outcry, and that's too late for any innocent who's dead or injured.
Put it back in the shed, Amazon, and let the folks there tinker under the hood some more. Profit can wait until you get it right. And by "right", I mean 100%.
Disclaimer: It should be banished, buried, and burned. Facial recognition can't possibly come to a good end.
Friday 27th July 2018 19:18 GMT Michael Wojcik
Given some of the police actions of late, I'm not sure how many false positives will die but it could be enough to raise a public outcry and that's too late for any innocent who's dead or injured.
Yes. Combine half-assed automation that has abysmal accuracy with police militarization, and you have a recipe for a sharp increase in trigger-happy assholes killing innocent civilians for Texting While Black and similar offenses.
Police departments need to get their house in order before adding any more automation, and vendors like Amazon need to make their products much, much better before peddling them to the police.
I'd like to hope Amazon catches some flak from investors (the only thing they care about) over this, but I'm not holding my breath.
Saturday 28th July 2018 15:17 GMT Anonymous Coward
Yes. Combine half-assed automation that has abysmal accuracy with police militarization, and you have a recipe for a sharp increase in trigger-happy assholes killing innocent civilians for Texting While Black and similar offenses.
IIRC the majority of people killed by US police officers are white. But in general I agree with the point you're making.
Monday 30th July 2018 09:19 GMT 's water music
No idea of the reliability of the source or of the data available, but it appears that non-whites are statistically over-represented, demographically speaking.
Watching but not wearing the right spectacles?---->
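To see how both posts can be right at once, here's a toy calculation; the numbers are invented purely to show the arithmetic, not to describe any real dataset. A group can account for the majority of incidents in absolute terms while another group has a higher per-capita rate:

```python
# Counts versus rates: invented, purely illustrative numbers.
population = {"group_a": 200_000_000, "group_b": 40_000_000}
incidents  = {"group_a": 500,         "group_b": 250}

for group in population:
    share = incidents[group] / sum(incidents.values())
    rate = incidents[group] / population[group] * 1_000_000
    print(f"{group}: {share:.0%} of incidents, {rate:.2f} per million people")
# group_a has the majority of incidents (67%) but the lower per-capita
# rate (2.5 vs 6.25 per million), i.e. group_b is over-represented.
```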
Thursday 26th July 2018 22:11 GMT Anonymous Coward
Accessorize to a Crime: Real and Stealthy Attacks on State-of-the-Art Face Recognition
A couple of years old, but interesting research on misdirecting facial recognition software: https://www.cs.cmu.edu/~sbhagava/papers/face-rec-ccs16.pdf
I would think that it works, at least in part, because the software designers did not anticipate any attempt to defeat recognition other than by wearing a disguise.
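For a sense of the idea behind that paper, here's a toy sketch of a mask-restricted adversarial perturbation. The linear "recognizer" here is invented purely so the gradient is trivial to compute; the real attack optimizes against a deep network, but the principle — confine the pixel changes to an accessory-shaped region so they are physically wearable — is the same:

```python
# Toy illustration: perturb only an eyeglass-shaped band of pixels to
# push a (made-up) linear recognizer's score toward a target identity.
import numpy as np

rng = np.random.default_rng(0)
H, W = 32, 32
face = rng.random((H, W))            # stand-in for a face image in [0, 1]
w = rng.standard_normal((H, W))      # weights of a toy linear "recognizer"

mask = np.zeros((H, W))
mask[12:16, 4:28] = 1.0              # "glasses": a band across the eyes

def score(img: np.ndarray) -> float:
    return float((w * img).sum())    # higher means "looks like the target"

# For a linear model the gradient of the score w.r.t. the image is just w.
# Step the masked pixels in the direction that raises the target's score.
eps = 0.5
adv = np.clip(face + eps * np.sign(w) * mask, 0.0, 1.0)

print(f"score before: {score(face):+.2f}  after: {score(adv):+.2f}")
```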
Saturday 28th July 2018 15:20 GMT Anonymous Coward
Re: New training rule needed
And actually hugging or wrapping yourself in a flag is proof positive.
A good friend of mine got married a couple of years back. As Hindus, he and the bride were wrapped in a large and beautifully embroidered swastika cloth as part of the ceremony. I hate to think what "AI" would make of all the pictures on social media.
Thursday 26th July 2018 22:28 GMT Anonymous Coward
"It’s not totally clear why Amazon’s face recognition technology is so inaccurate."
Pretty simple, really. All facial recognition is horribly inaccurate. Studies done on humans show that even we're only good at recognizing the faces of people we already know. We've all had instances where we've mistaken a stranger for someone we know.
Machine learning algorithms have already been able to outdo humans, but that comes with huge caveats from the training sets. The ML algorithms tend to end up saying "All Asians look alike" or "All blacks look like apes", because they didn't have enough relevant training data (read: they are usually trained and tested with a heavy Caucasian bias, except in China).
ML algorithms require insane amounts of examples to generalize from, but facial recognition is a memorization problem instead. The goal is to use a huge number of examples to determine all of the ways faces can differ, so you can place a specific face in that feature space and then match "reasonably" similar faces. That's complicated by translation, rotation, lighting, and physical modifications to the face.
With 7.5 billion people to distinguish between, and low-resolution cameras, you're going to have a nasty trade-off between false positives and false negatives. I'd expect they would either try to split the errors evenly or err on the side of false positives, since those can be checked by a human.
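A minimal sketch of that matching step, assuming a generic embedding model (nothing about Rekognition's internals is public, so this implies no real system's API): faces get mapped to fixed-length vectors, and the threshold on vector similarity is exactly where the false-positive/false-negative trade-off lives.

```python
# Minimal sketch of face matching as a feature-space problem.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def is_match(probe_vec: np.ndarray, mugshot_vec: np.ndarray,
             threshold: float = 0.8) -> bool:
    # Lower the threshold and you miss fewer real matches (false negatives)
    # but flag more strangers (false positives); raise it and the reverse.
    return cosine_similarity(probe_vec, mugshot_vec) >= threshold

# Usage with stand-in vectors (a real system would get these from a model):
rng = np.random.default_rng(0)
v1, v2 = rng.standard_normal(128), rng.standard_normal(128)
print(is_match(v1, v2))   # random vectors are almost never a match
```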
Friday 27th July 2018 00:35 GMT Robert Helpmann??
Not properly House trained
The simplest explanation is that, since the focus is on catching crims, the training data was mostly or completely composed of mugshots. The high false-positive rate, which mirrors the skewed demographics of the US incarcerated population, points that way. Nothing like building in a self-perpetuating bias.
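A rough simulation of that mechanism, with invented numbers: even a matcher whose per-comparison false-positive rate is identical for everyone will concentrate its false alarms on whichever group dominates the mugshot gallery, simply because there are more entries to falsely match against.

```python
# Self-perpetuating bias from a skewed gallery; all numbers are invented.
import numpy as np

rng = np.random.default_rng(1)
FPR = 0.05   # per-comparison false-positive rate, identical for everyone
gallery = {"group_a": 8000, "group_b": 2000}   # hypothetical mugshot counts

# One innocent probe compared against each group's mugshots:
for group, n_mugshots in gallery.items():
    false_hits = (rng.random(n_mugshots) < FPR).sum()
    print(f"{group}: {false_hits} false matches from {n_mugshots} mugshots")
# The over-represented group draws roughly 4x the false alarms, despite
# the matcher itself being "unbiased" per comparison.
```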
Friday 27th July 2018 06:19 GMT John Smith 19
"Those 28 mismatches therefore represent a five per cent error rate. "
Compared to the something-like-97% false positive rate of the system the UK Metropolitan Police are trialling, that is actually quite good.
Still pretty s**t, though: in a population of 30 million adults (like the UK), a five per cent rate would be 1.5 million false positives.
Frankly you'd do better to issue every officer with a fingerprint reader.
But then of course they'd need to actually do some real police work, and people might start harassing officers who demanded prints with too little evidence to begin with.
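The back-of-the-envelope arithmetic, plus the base-rate problem it implies; the population figure is from the comment above, while the number of genuine suspects and the true-positive rate are illustrative guesses:

```python
# False-positive arithmetic and the base-rate problem it creates.
adults = 30_000_000      # adult population, per the comment above
fpr = 0.05               # the ACLU study's five per cent error rate
real_suspects = 10_000   # hypothetical wanted persons actually in the crowd
tpr = 0.95               # generous guess: 95% of real suspects get flagged

false_positives = (adults - real_suspects) * fpr
true_positives = real_suspects * tpr
precision = true_positives / (true_positives + false_positives)

print(f"false positives: {false_positives:,.0f}")                    # ~1.5 million
print(f"chance a flagged person is a real suspect: {precision:.2%}") # under 1%
```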
Thursday 26th July 2018 23:19 GMT Eddy Ito
In other news
TSA's mother TLA, the DHS, has asked Amazon how quickly they could roll out this technology at every airport in the US. The rationale is that this would do preliminary screening of travelers as they arrive at the curb in order to select individuals for
~~extended groping~~ enhanced screening as they pass through security on their way to their flight.