Amazon's neural net offer to border cops, Waymo charges fares, the first AI portrait sold at auction, and more

Let's kick Monday off with a bunch of bits and bytes you may have missed last week from the world of AI – alleged intelligence or artificial intelligence, depending on where you stand. Amazon and ICE: A leaked email has shown that Amazon tried to sell its facial recognition software to the US Immigration and Customs …

  1. ratfox

    matched them against a list of innocent US Congress members.

    - Now that's an oxymoron

    - I think you mean alleged innocent

    - FAKE NEWS

    - Trust the AI, throw them in jail just to be sure

    Etc. etc.

  2. Richard Jones 1
    WTF?

    Immature Technology, Anyone?

    A few viewpoints: ever since I started in photography, the issue of subject failure has been talked about. Subject failure is a bit of an oxymoron really; it means that the photographic medium is not able to record the details of the subject accurately. I first came across the term in relation to some flowers, where the visual spectrum you or I can see is not the entire range produced by the flower. However, this wider range could be picked up and misinterpreted by the photographic medium, producing what appeared to be substandard results. The issue of the different tonal ranges of faces was also discussed, but in a more muted style due to the emotions that it carries. To me it is clear that these issues still exist in modern media's ability to capture and separate tones in a way that will allow accurate representation.

    To me this comes down to the old phrase - more hard work is required.

    As for the issue of so-called AI and the frankly crap results obtained from the present training efforts, surely this is covered by GIGO: garbage in, garbage out?

    Use an image that is a poor representation of the subject, then match this to a limited range of test samples, matching only a limited subset of data points, and (don't) be surprised that the result is weak. Mind you, a human doing the same work will also have significant failure rates unless they are a super-recogniser. To my mind, the only way that training can ever be judged to have been effective is to take a small but entirely random set of target faces, lose them in a large, truly heterogeneous mass of other faces, and train and improve the results in a continuous feedback loop until the results are way better than 98% success (a rough sketch of that kind of evaluation follows at the end of this comment). It is almost certain that such systems will not travel well; the range of characteristics in different locations is vastly different: take China, India, any European country or North American city, for example. Height, weight and tonal ranges will all be challengingly different. If, as at present, the characteristic match range is as truncated as reports suggest, I cannot see how you can expect anything likely to resemble accuracy.

    No system should ever be the sole arbiter of guilt or innocence. It can only be a single screening grid in a 'gravel sorter' of evidence; this is true for machine and human recognisers alike, and the results need cross-matching with something really old-fashioned, such as evidence.
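    A minimal sketch of that kind of evaluation, assuming a cosine-similarity matcher over face embeddings: the random vectors below are stand-ins for a real face model, and the 0.7 threshold is an illustrative guess rather than a figure from the article.

    ```python
    # Hide a handful of known "target" faces in a big gallery of unrelated
    # faces, match every probe against the enrolled targets, and score the
    # matcher. Embeddings are random placeholders, not a real face model.
    import numpy as np

    rng = np.random.default_rng(42)

    EMB_DIM = 128
    N_GALLERY = 10_000   # the large, heterogeneous mass of other faces
    N_TARGETS = 20       # the small, entirely random set of target faces
    THRESHOLD = 0.7      # illustrative cosine-similarity cut-off

    def normed(x):
        """L2-normalise rows so the dot product is cosine similarity."""
        return x / np.linalg.norm(x, axis=-1, keepdims=True)

    # Enrolled target embeddings, plus a second capture of each target with
    # mild per-dimension noise standing in for a lower-quality source image.
    targets = normed(rng.normal(size=(N_TARGETS, EMB_DIM)))
    target_probes = normed(targets + 0.05 * rng.normal(size=targets.shape))

    # Everyone else in the gallery is unrelated to the targets.
    gallery = normed(rng.normal(size=(N_GALLERY, EMB_DIM)))
    probes = np.vstack([target_probes, gallery])
    is_target = np.array([True] * N_TARGETS + [False] * N_GALLERY)

    # Flag any probe whose best similarity to an enrolled target clears the bar.
    best_sim = (probes @ targets.T).max(axis=1)
    flagged = best_sim >= THRESHOLD

    true_pos = int(np.sum(flagged & is_target))
    false_pos = int(np.sum(flagged & ~is_target))
    recall = true_pos / N_TARGETS
    precision = true_pos / max(true_pos + false_pos, 1)

    print(f"hits {true_pos}/{N_TARGETS}  false alarms {false_pos}  "
          f"recall {recall:.1%}  precision {precision:.1%}")
    ```

    Scoring false alarms against the innocent gallery matters as much as the hit rate: with 10,000 gallery faces, even a 1% false-positive rate buries 20 genuine hits under roughly a hundred false matches, which is exactly why no threshold on its own should decide guilt.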

  3. Pascal Monett Silver badge
    FAIL

    Of course, now that there's money . . .

    Hey Barrat, when you agreed to let your code be used, you didn't inquire as to what it was for. It's too late now to be all angry about it.

  4. DropBear

    Frankly, I subjectively find even Google's "fluid dogs" more artistic than these "portraits", unless one defines artistic as "just hideously ugly"...

  5. cNova
    Boffin

    Elementary School, My Dear Watson

    To be fair, Watson can beat cancer patients at chess nearly every time.
