US Homeland Security installs AI cameras at the White House, Google tries to make translation less sexist

Hello, welcome to this week's AI roundup. This week we have Google trying to fix its gender-bias problem by offering both male and female pronouns for its translation service, though it dismisses gender-neutral ones. The US Secret Service is testing facial recognition software to track "subjects of interest," but the ACLU has …

  1. Pascal Monett Silver badge
    Big Brother

    Facial Recognition

    If this test period ever comes out negative, it will be a serious blow to the people who swear it works.

    If it comes out positive, it will be the true start of Big Brother and you can expect it to infect every portion of your life in rapid succession.

    In either case, I will be watching for the follow-up.

    1. Anonymous Coward
      Anonymous Coward

      "In either case, I will be watching for the follow-up."

      So YOU are Big Brother?

    2. fajensen
      Terminator

      Re: Facial Recognition

      Depends on what "Works" means. If those AI cameras were to create an endless stream of potential suspects so that more resources and bigger gunz are always needed to keep up with "keeping us safe" then ... neither the DHS, the MIL-SEC complex nor the stockholders will complain much.

      My personal belief is that most, if not all, the predictions provided by these "solutions" are essentially random garbage, but that nobody gives a shit, because The Computer and The Algorithm are The True Oracles of Our Time, so one is never wrong in trusting them (and pouring more funding down that particular drain).

    3. Anonymous Coward
      Anonymous Coward

      Re: Facial Recognition

      From what I recollect, the Met Police have trialled this and it had a lot of false positives. Never stopped them, or whoever was creating it, from continuing... (see https://www.theregister.co.uk/2018/07/04/met_police_commish_im_not_expecting_facial_recognition_tech_to_result_in_lots_of_arrests/)

      I wish they'd introduce this at Parliament as it's obviously the most important place in the UK. I mean if there's a few false positives, the MPs will get used to stop n search.

  2. DavCrav

    "Sexism in Google Translation is from biased training data. As the system is trained on millions of text scraped from the web, these sentences carry the historical and social biases humans have over time."

    Is this a new definition of bias that I don't know, where something is biased if it's an accurate description of current reality?

    1. Yet Another Anonymous coward Silver badge

      Made worse by using out-of-copyright Gutenberg books for some of the more obscure languages.

      So if the only source text you have for some indigenous language is the Bible translated by some C18 missionary, your neural net is going to be a bit short of gender-fluid pronouns.

    2. Mark 85

      Languages often translate imperfectly, that's a given. I'm not sure what the fuss is about, other than some gender activists are raising hell. These "activists" need to get a grip. Translation isn't and never will be an exact science, and this back-and-forth translation is sure to mix things up a bit.

      1. Anonymous Coward
        Anonymous Coward

        I'm not sure what the fuss is about

        I think the fuss is about the limitations of phrasal recognition in translation without context sensitivity, which is a long standing issue.

        Many languages - Germanic, Romance, Slavic - in the West are highly gendered. Some languages are not.

        Take a language with non-gendered pronouns. Faced with the sentences equivalent to "This is Susan. It is a doctor", a human translator has no problem. She knows Susan is female, and so can translate into English as "This is Susan. She is a doctor." But a phrasal automatic translator will result in, given the usual statistical bias, "This is Susan. He is a doctor." Because it doesn't remember context from one clause to another.

        A reverse example would be to take the sentence (transliterated Russian) "Vot Dima. Ona sobaka." Now sobaka is the common word for a dog in Russian and is feminine in grammar. "Dima" has a feminine ending. So a translation engine is likely to come up with "Here is Dima. She is a dog." In fact, "Dima" is a masculine nickname - as a human translator will know - and the correct translation is "Here is Dima. He is a dog."

        Translation is, as you say, not an exact science. But humans are very sensitive to words even without taking modern sensibilities into account, and they would consider the misgendering a poor translation. And laugh at the stupid mistake.

        Google Translate does a reasonably good job between related languages but that is actually less of a challenge - where it would be most useful is between unrelated languages. And it's issues like these that will take the most effort to iron out.
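        The context-loss failure described above can be sketched in a few lines. The "translator" below is a toy stand-in, not any real API: it renders a hypothetical genderless source pronoun (written here as "o") into English, falling back to the statistically common "he" when no context is carried over.

```python
# Toy illustration of why per-sentence translation misgenders pronouns.
# "o" stands for a genderless source-language pronoun; KNOWN_GENDERS is an
# invented lookup, not a real resource.

KNOWN_GENDERS = {"Susan": "she", "John": "he"}

def translate_sentence(sentence, context_pronoun=None):
    """Render one sentence; without context, default to the majority 'he'."""
    pronoun = context_pronoun or "he"
    return sentence.replace("o ", pronoun + " ", 1)

def translate_naive(sentences):
    # Each sentence translated in isolation -- context is discarded.
    return [translate_sentence(s) for s in sentences]

def translate_with_context(sentences):
    # Carry the gender of the most recently seen known name forward.
    out, pronoun = [], None
    for s in sentences:
        for name, p in KNOWN_GENDERS.items():
            if name in s:
                pronoun = p
        out.append(translate_sentence(s, pronoun))
    return out

src = ["This is Susan.", "o is a doctor."]
print(translate_naive(src))         # ['This is Susan.', 'he is a doctor.']
print(translate_with_context(src))  # ['This is Susan.', 'she is a doctor.']
```

        The only difference between the two functions is whether anything survives the sentence boundary, which is exactly the limitation being described.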

        1. find users who cut cat tail

          Re: I'm not sure what the fuss is about

          Your examples are unnecessarily complicated to illustrate why Google is failing. They involve knowledge of which name is used for which gender, general knowledge of gender in grammar (BTW that's the only gender -- do not confuse it with sex), the often confusing conventions for animal gender, etc. AFAIK the basic problem is that it just translates sentence by sentence, with no context whatsoever.

          So in

          Two girls arrived. They brought apples.

          and

          Two boys arrived. They brought apples.

          the second sentence is Google-translated the same -- to all languages I know where there should be a difference. There is no knowledge of things like names necessary and the gender matches the sex. There may be even enough similar fragments in the corpus for both sexes. But the translation still fails.
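          The argument above is structural and easy to make concrete: once the text is split into sentences, the second sentence of both inputs is byte-identical, so no strictly per-sentence system can ever gender "They" correctly. The Spanish gloss below is an illustrative mock, not real Google output.

```python
# If translation is strictly per-sentence, identical input sentences must
# produce identical output -- regardless of what came before them.

def split_sentences(text):
    return [s.strip() + "." for s in text.split(".") if s.strip()]

def per_sentence_translate(sentence):
    # Mock context-free translator: always picks the statistically common
    # masculine "Ellos" for "They", as a context-free system must pick one.
    return sentence.replace("They", "Ellos")

girls = "Two girls arrived. They brought apples."
boys = "Two boys arrived. They brought apples."

g2 = split_sentences(girls)[1]
b2 = split_sentences(boys)[1]
assert g2 == b2  # identical second sentence in both texts
print(per_sentence_translate(g2))  # "Ellos brought apples." either way
```

          No amount of corpus data fixes this; only carrying context across the sentence boundary can.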

          1. Anonymous Coward
            Anonymous Coward

            Re: I'm not sure what the fuss is about

            "Your examples are unnecessarily too complicated to illustrate why Google is failing."

            You say that and then say the problem is lack of context sensitivity, which was exactly what I was saying, and just giving a couple of examples to show how it works.

            Which languages do you know in which third person plural is gendered? Telling us might give some value to your post.

            1. Mephistro

              Re: I'm not sure what the fuss is about

              "Which languages do you know in which third person plural is gendered?"

              Spanish*, and most Latin-derived languages. But the real issue here is, as you said, lack of context.

              I don't envy the job of the boffins that have to fix this. Just thinking about this for ten minutes makes my BRAIN HURT!

              8^)

        2. Anonymous Coward
          Anonymous Coward

          Re: I'm not sure what the fuss is about

          Faced with the sentences equivalent to "This is Susan. It is a doctor", a human translator has no problem. She knows Susan is female, and so can translate into English as "This is Susan. She is a doctor."

          Ironically enough, what would it do with the name 'Ellis'?

        3. Anonymous Coward
          Anonymous Coward

          Re: "This is Susan, She is a Doctor"...

          Unless it's a film about a Man Named Susan... ;)

    3. Tony W

      Guessing is not good enough

      The problem is far more basic: Google should admit when it is guessing. Then it would stop making my Japanese friend refer to "your", "his" or "her" mother, son, daughter etc., when she means her own mother etc. I can mentally strip out these wrong pronouns, but why can't Google mark words where it is interpolating or guessing the gender? And the same goes for Microsoft, of course.

      It is not about what is "reality." The relative number of male and female surgeons in various countries is quite irrelevant. If the original left the gender ambiguous then so should the translation, otherwise it has a good chance of being incorrect.
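      What Tony W is asking for could be very simple on the output side. This is a sketch of one possible marking scheme, entirely invented for illustration; it assumes the engine can report which token positions it interpolated, which is the hard part.

```python
# Sketch: tag pronouns the engine interpolated or guessed, instead of
# silently asserting them. The [?...] notation and the GUESSED list are
# invented for this example.

GUESSED = {"he", "she", "his", "her", "your"}  # pronouns often interpolated
                                               # when the source omits them

def mark_guesses(tokens, guessed_indices):
    """Wrap tokens the engine invented in [?...] markers."""
    return " ".join(
        f"[?{tok}]" if i in guessed_indices and tok.lower() in GUESSED else tok
        for i, tok in enumerate(tokens)
    )

# Suppose the engine reports the token positions it guessed:
tokens = ["She", "visited", "her", "mother"]
print(mark_guesses(tokens, guessed_indices={0, 2}))
# -> "[?She] visited [?her] mother"
```

      The reader can then strip or resolve the markers, instead of having to spot wrong pronouns by eye.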

      1. anatak

        Re: Guessing is not good enough

        Of course, Japanese being famous for leaving out the subject (almost always) and the object (often), machine translations are hugely fun to decipher. Basically, if you can understand Japanese then you can understand the translation.

    4. big_D Silver badge

      The source data isn't necessarily biased or wrong. When information talking about, for example, Dr. John Smit the surgeon refers to him as "he", it isn't biased.

      It is the selection of data that Google uses, its dataset that has the bias.

      That said, given the apocalyptic errors it can make with translations, I would think this gender bias is a very minor point.

      A few years back, Google Translate would ignore the word "not" when translating from English to German! So "do not open the case" would be translated to "das Gehäuse öffnen" (open the case). For example:

      "Do not open the case, high voltage inside" = "Gehäuse öffnen, Starkstrom drinnen" (Open the case, high voltage inside)

      "Do not open the case, no user serviceable parts inside" = "Gehäuse öffnen, nichts drinnen" (Open the case, nothing inside)

      I actually put the correct translations into the feedback and it has improved since, but I would think such errors should have priority, as they can be downright dangerous.
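      Dropped negation is the kind of error that is cheap to catch mechanically: if the source contains a negation word and the translation contains none, something is wrong. A minimal sketch of such a check, with deliberately incomplete illustrative word lists:

```python
# Cheap sanity check for the dropped-"not" failure: source and translation
# should agree on whether negation is present. Word lists are illustrative
# and far from exhaustive.

NEGATION = {
    "en": {"not", "no", "never", "don't", "doesn't", "cannot"},
    "de": {"nicht", "kein", "keine", "niemals", "nichts"},
}

def has_negation(text, lang):
    words = {w.strip(".,!?").lower() for w in text.split()}
    return bool(words & NEGATION[lang])

def negation_preserved(source_en, target_de):
    return has_negation(source_en, "en") == has_negation(target_de, "de")

# The dangerous mistranslation from the comment above fails the check:
assert not negation_preserved("Do not open the case", "Gehäuse öffnen")
# A correct rendering passes:
assert negation_preserved("Do not open the case", "Das Gehäuse nicht öffnen")
```

      It wouldn't fix the translation, but it could flag exactly the class of errors that are downright dangerous.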

  3. Luke Worm
    Thumb Up

    AI in Oval Office

    .. was my first thought: Finally some I, even if it's AI

    1. amanfromMars 1 Silver badge

      Re: AI in Oval Office

      What is AI in Oval Office programming for, and with what sources and resources for future information presentation with Advanced IntelAIgent Processing of Alien Means and Memes?

      Anything at all Earth Shattering and Ground Breaking/Systems Crashing and Networks OverLoding?

      And this is a real good read .... unveiling myriad opportunities .... https://ellis-open-letter.eu/ellis_announcement.pdf

      And just whenever some might be thinking there's nothing much going on out there for around the likes of here with El Reg Congregations and Virtually Advanced IntelAIgent Operating System Hubs.

      And..... if they aint got anything like that, then that is a product which in other hands, hearts and minds is either their overwhelming competition or crushing opposition.

      Methinks also a Prime Premium JEDI Program Candidate for DOD Operational Slush Funding with Top Secret Seeding Investment Feeding Collateral .... Flash Fiat Cash EMPowering NEUKlearer HyperRadioProACTive IT Energy for Synergies ....... Mass JOINT AIdVentures from the Perfect Goal to where you be now at.

      That way round you get to experience all the Fabulous Fun and Great Games of the Fair with CyberWareFare.

    2. Fruit and Nutcase Silver badge
      Black Helicopters

      Re: AI in Oval Office

      Not sure if it's a good idea (then again, it could be...)

      when Trump and his cohorts are marked as "persons of interest" due to the work being done by the Special Counsel and others.

  4. Sgt_Oddball

    Dr strangelove: or how I learned to stop worrying and love the AI

    I do wonder how such systems will cope with those capable of hugely altering their facial appearance without surgery and even their own mannerisms. Usually such notables do it for the laughs (Peter Sellers, Sacha Baron Cohen, Mike Myers to name a few) but what if we have some bad actors (pun intended) doing it? How long before multiple false positives lead to deaths directly because an AI said so..

    Cue a deadly game of (copy)cat and mouse.

    1. fajensen

      Re: Dr strangelove: or how I learned to stop worrying and love the AI

      How long before multiple false positives lead to deaths directly because an AI said so..

      Oh, only about -15 years: "We" have been droning the wrong people for the better part of a decade based on garbage intelligence, the so-called "Signature Strikes".

      If we include the Vietnam war, where the same sensor networks and statistical-analysis bullshit and flim-flam were tried, it's been better than 50 years.

      But, what matters of course is when it happens to *us*. That serving is coming right up, I suspect. Some "smart" AI-system sending SWAT 'round instead of a teenager? What ever could go wrong?!

      1. Version 1.0 Silver badge

        Re: Dr strangelove: or how I learned to stop worrying and love the AI

        Time to get out the Insane Clown Posse makeup.

  5. Cliff Thorburn

    “That way round you get to experience all the Fabulous Fun and Great Games of the Fair with CyberWareFare.”

    I experience that on a daily basis amfM, just remove the fun ....

  6. Anonymous Coward
    Anonymous Coward

    Should I laugh or cry...

    'negative matches will be deleted automatically.'

    'DHS said it would be transparent about its process'

    I think I'll do both because they will do neither.

    Balance and all that...

  7. a_yank_lurker

    Translation and Artificial Idiocy

    The basic flaw with a translation algorithm is that languages are highly idiomatic and context sensitive. It is difficult enough for a skilled human translator to make a readable, accurate, idiomatically correct translation. So to think artificial idiocy can do anything but a haphazard, mediocre translation is a fool's errand.

    1. Anonymous Coward
      Anonymous Coward

      Re: Translation and Artificial Idiocy

      I think mediocre does not mean what you think it means. Mediocre means it would be doing about as well as an average translator (the medi as in Mediterranean, lit. middle of the Earth, gives it away.)

      Looking for idiom is I think unrealistic, especially as there are so many within a given language. (Translating, say, Raymond Queneau's street argot must have been a nightmare). But readable and accurate may be achievable for non-literary text, which is what most people are going to want most of the time.

    2. DCFusor

      Re: Translation and Artificial Idiocy

      i agree about the fool's errand, though not the gender fluid silliness you avoided mentioning, yank.

      On the other hand, something that gets things consistently wrong ... is the kind of thing you can fix, or learn from your ideas on fixing being wrong. It's the intermittent bugs that kill ya. In the case of translation, as you point out, there are a lot of other issues that are a lot more important to most people; "readable, accurate, idiomatically correct" seems like a good start on a list.

      And the Vodka is still strong but the meat is rotten.

  8. This post has been deleted by its author

    1. Throatwarbler Mangrove Silver badge

      And a rotting one, at that.

    2. Sgt_Oddball

      It's...

      A red-and-blue striped golfing umbrella!

  9. GrapeBunch

    Alas, the meeting with her never happened.

    Alas, the meeting with him never happened.

  10. Alan Johnson

    Meaning of Bias?

    The supposed bias in Google Translate is the second time I have read stories about gender 'bias' in Google AI projects, and in both cases there was no bias, just an unwillingness to accept what the statistical data was telling the developers, and the 'solution' introduced bias.

    The problem for Google Translate is that it does not work like a human translator, understanding the original text and context and then creating text with the equivalent meaning in another language, but via machine learning, looking at combinations of words and how they are most commonly associated with words in another language. There is no understanding, and therefore the translation can be completely wrong. The further apart the languages are in structure and grammar, the more likely errors, including gender errors, become. In English, if the subject is singular and the sex is not known or not relevant, the pronoun is most commonly 'he'; in modern English, in order to preserve neutrality about sex, it might be 'they'. The latest Translate gives options of both 'he' and 'she', but 'she' is never right if the original is intended as a neutral pronoun, so the translation has become worse and will only be right occasionally by chance. A deliberate bias not present in the original machine learning algorithm or data has been introduced. We could just be grown up and recognise that machine translations have many flaws, and the issue of gender in translation is by no means the worst. We use Google Translate for speed or lack of cost, but we have to accept it is not a human translator.

    The other case of bias is when Google trained an AI to help filter CVs from job applicants and discovered that it was preferentially selecting male candidates, which again was described as 'bias'. What had actually happened was that the training data showed that the best-performing employees were more likely to be men than women, and the algorithm naturally incorporated this. We can speculate about why this is, but it is well known that on average women work substantially fewer hours and take longer career breaks. A human interviewer cannot ask a female candidate in her late twenties if she plans to start a family and take time off, but a machine learning algorithm cannot help but incorporate the probability of this into its behaviour. In both cases it is not bias which is the problem, but the fact that in certain areas it is politically convenient for people to ignore reality. When a machine learning algorithm makes that hard to ignore, the result is described as bias.
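    The point about statistics rather than rules can be shown in miniature. The toy "model" below just counts which pronoun follows "doctor said" in a made-up corpus and always emits the majority one; there is no bias knob anywhere, only the corpus frequencies.

```python
# A purely statistical system reproduces whatever its corpus contains.
# The corpus and the "doctor said <pronoun>" pattern are invented for
# illustration.

from collections import Counter

corpus = (
    "the doctor said he would help. the doctor said he was busy. "
    "the doctor said she would call."
)

def train_pronoun_model(text, noun):
    """Count the pronouns that follow '<noun> said' in the corpus."""
    counts = Counter()
    words = text.split()
    for i, w in enumerate(words[:-2]):
        if w == noun and words[i + 1] == "said":
            counts[words[i + 2]] += 1
    return counts

def predict(counts):
    # Always emit the corpus majority -- the statistics, not an explicit rule.
    return counts.most_common(1)[0][0]

counts = train_pronoun_model(corpus, "doctor")
print(counts)           # Counter({'he': 2, 'she': 1})
print(predict(counts))  # 'he'
```

    Whether the output of such a model counts as "bias" or as an accurate summary of its training data is exactly the disagreement in this thread.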

  11. Totally not a Cylon
    Big Brother

    Surely ....

    The best place to test facial recognition would be inside the White House?

    Especially the West Wing, where all the staff are: a known pool of people who can be voluntold they're taking part in the trial.

    1. Someone Else Silver badge

      Re: Surely ....

      Well, maybe. What if Komrade Kompromat is recognized as a known Russian agent?

  12. spold Silver badge

    Hmmmm

    I'm wondering if this is Huawei technology? I'm sure the Chinese have well tested this sort of stuff. Perhaps the US is currently angling for a discount?

    (also I wonder if the AI will be the most intelligent thing in the White House...)

  13. JohnFen

    Why should we believe this?

    "Only pictures that correspond to a positive match will be kept, negative matches will be deleted automatically. "

    Oh? Why should we believe this to be true?
