Can't wait for this to become bundled...
As a critical Windows-10 security update etc. STOP!
ICE Agents have been dispatched! Wait by your PC!
Microsoft has improved its facial recognition technology so that it is better at identifying humans who aren't white men. Today's announcement of the breakthrough, which promised "significant improvements in the system's ability to recognize gender across skin tones," comes a week after CEO Satya Nadella sent a missive to …
You don't need to cover your whole face, just a few strategically placed sections to throw the computer off.
Possibly makeup that only showed up under IR, would mean you could look normal to humans, whilst confusing cameras.
I guess that means it only ever worked once before, and this time it succeeded twenty times?
Only tried a billion photos, but hey, twenty times!
With such vague results, one can be certain that it's still utterly useless. Probably worse than useless.
One wonders if it's even better than pure chance.
It's another example of poor expression which renders a statistical statement meaningless (or deniable).
It could mean error rates are down to 5% of what they were (a twentyfold reduction), or lower. But 'up to' just muddies it further.
I started typing this thinking I knew how to express what they meant but I give up. Too many semantic errors in the sentence.
Just assume the facial recognition tech is better. But don't assume by how much, or how much for particular genders/skin tones.
This is the kind of phrasing used either by the terminally artful trying to mislead the audience, or by clever people trying to sound less so.
If it made sense, the phrase might mean errors dropping to 0.05 of what they were. Until it is explained, we could just as well assume errors have fallen by only 0.05%.
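To make the ambiguity concrete, here's a toy calculation. The baseline error rate is invented, since the announcement gives none, so every reading below is speculative:

```python
# Assume (purely for illustration) an old error rate of 20% --
# the announcement never states the baseline.
old_error = 0.20

# Reading 1: "20 times better" = errors fall to a twentieth of what they were.
twenty_fold = old_error / 20       # 1.0%

# Reading 2: errors fall by 20 percentage points.
minus_points = old_error - 0.20    # 0.0%

# Reading 3: "up to" means only the best-case subgroup improved that much;
# the average could be anywhere between old_error and twenty_fold.
print(twenty_fold, minus_points)
```

Three readings of the same sentence, three wildly different systems.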
Google's version couldn't tell the difference between a human with dark skin and a gorilla most of the time, never mind tell the difference between different humans with dark skin.
Whereas it was able to tell the difference between a gammon (a racist, foaming-at-the-mouth, sunburnt white man) and a pig.
It is quite amazing how many medical researchers and the like do not want to understand it. (An observation made to me years ago by my statistics supervisor, one of whose side jobs was to report on dodgy statistics in medical trials.)
For everything like this there are two error rates: false positives and false negatives. Often you could reduce the false positives by 95% at the expense of greatly increasing the false negatives*. So which is it?
*Suppose the population has 2% real positives and the test flags 40% as positive, but only detects half the real positives due to the uncertainties of the test. Tightening the threshold so that only 2% are flagged doesn't help: the top scores are still drawn overwhelmingly from the 98% who are negative, so the flagged set will be almost entirely false positives while nearly all real positives are missed.
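The trade-off in that footnote can be sketched with a toy simulation. The score distributions below are invented for illustration (real recognition scores won't be neat Gaussians), but the shape of the trade-off is the point:

```python
import random

random.seed(0)

# Hypothetical population: 2% true positives, as in the footnote above.
N = 100_000
labels = [random.random() < 0.02 for _ in range(N)]
# Invented score model: positives score a bit higher on average,
# but the distributions overlap heavily -- an imperfect test.
scores = [random.gauss(1.5, 1.0) if y else random.gauss(0.0, 1.0)
          for y in labels]

def error_counts(threshold):
    """Count false positives and false negatives at a given cutoff."""
    fp = sum(1 for y, s in zip(labels, scores) if not y and s >= threshold)
    fn = sum(1 for y, s in zip(labels, scores) if y and s < threshold)
    return fp, fn

lo_fp, lo_fn = error_counts(0.5)   # permissive threshold
hi_fp, hi_fn = error_counts(2.5)   # strict threshold

print("permissive:", lo_fp, "FP,", lo_fn, "FN")
print("strict:    ", hi_fp, "FP,", hi_fn, "FN")
```

Raising the threshold slashes false positives but sends false negatives soaring, which is exactly why a headline quoting a single "error rate" tells you almost nothing.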
.....voiceprints......and GCHQ, the NSA, ICE and every other bad actor on the planet have a handle on your privacy. No....not kidding.....read this:
- https://theintercept.com/2018/06/25/interpol-voice-identification-database/
Welcome to the future!
"If we are training machine learning systems to mimic decisions made in a biased society, using data generated by that society, then those systems will necessarily reproduce its biases."
That's good then, only the poor and discriminated will be eliminated by the machines.
'Goodlife' will be okay.
'Goodlife': see Fred Saberhagen's Berserker books.