Zero arrests, 2 correct matches, no criminals: London cops' facial recog tech slammed

London cops' facial recognition kit has only correctly identified two people to date – neither of whom were criminals – and the UK capital's police force has made no arrests using it, figures published today revealed. According to information released under Freedom of Information laws, the Metropolitan Police's automated …

  1. Whitter
    Devil

    Fixed the obvious error...!

    the government has said it isn't possible to automate removal. This means that they remain on the system unless ~~a person asks for them to be removed~~ the system has been turned off

  2. yossarianuk

    Nothing to hide

    They bleated

  3. Anonymous Coward
    Anonymous Coward

    So it's working as intended then. The police can stop who they want, when they want, and say they got a match as justification.

    1. argospeter

      Wot!

      Wot ever happened to INNOCENT UNTIL PROVEN GUILTY.

      SUDDENLY POLICE HAVE YET AGAIN TAKEN THE PLACE OF DUE PROCESS. ACTIONMAN IS BEING GROOMED FOR JUDGE DRED ROLLS NEXT!

      1. Anonymous Coward
        Anonymous Coward

        Re: Wot!

        Rolls??

        I thought he travelled by flying motorbike

      2. HamsterNet

        Re: Wot!

        It's Innocent UNLESS proven guilty.

        Using Until is a presumption of guilt.

        Please get the fundamentals about your liberty correct. This one little word swap is how we are in this situation.

        1. Scroticus Canis
          Holmes

          Re: Wot! - "Using Until is a presumption of guilt."

          They may have been referring to the way the police work.

  4. Bangem
    Facepalm

    This just in...

    ..Government-funded IT project delivers no tangible results after huge outlay of taxpayers' money. /Shockedfaced

  5. tiggity Silver badge

    checks and balances on incorrect id say MET

    .. tell that to the relatives of a certain Brazilian electrician who had a magazine emptied into his head at close range by London's finest

    1. Richard Jones 1

      Re: checks and balances on incorrect id say MET

      I understood that in his highly unfortunate case the failed identification was by wetware without any cross checking by machine or face ID to verify the details, so was this a relevant reference? Hopefully the machines will/can get better, but how do you improve the wetware? Improved training and processes can only go so far.

      This in no way disguises the unacceptable outcome in his case.

      1. MonkeyCee

        Re: checks and balances on incorrect id say MET

        "but how do you improve the wetware? "

        If you delve into the details of the case, it would appear that at least one of the issues was that the first piece of wetware that identified the suspect was in fact having a pee at the time. So rather than saying "I've no idea who that is", a certain class of wetware just lies to cover their own ass.

        Subsequently no-one checked, and a run of worst-case assumptions led to some terrible decision making. Normal behavior, like getting off a bus, finding the train station closed, then getting back on another bus, was interpreted as him "seeking a target".

        The inquiry is quite insightful reading. The main thing they do a good job of dodging is admitting that they were using military assets (surveillance teams, possibly the shooters) in what should have been a civvy-only situation. Hence no prosecution for the individuals, since that would inevitably show that sending in a soldier results in a killing, rather than sending in a cop and getting an arrest.

        But yes, I don't see how it's relevant to the discussion of FR, other than to emphasise that the wetware checks and balances are as susceptible to bias as the computer.

      2. Anonymous Coward
        Anonymous Coward

        Re: checks and balances on incorrect id say MET

        "I understood that in his highly unfortunate case the failed identification was by wetware without any cross checking by machine or face ID to verify the details, so was this a relevant reference?"

        Actually, yes.

        A mechanism that provides more opportunities for the wetware to fail, while setting up a bias in the estimation of risk, will lead to deaths.

        You get the same problem in a slightly different form when the police are called to an address and the automated check turns up the fact that there are firearms registered to that address. Then the police go to the wrong address, and being nervous, shoot and kill the person who answers the door. It happens.

        The interesting thing is that the firearms database does not actually make it safer for anyone, as real criminals do not register their illegal guns.

        This is an excuse for arbitrary and biased policing, and fails the same way airport security fails, by substituting inaccurate processes for trained observation and judgment. Compare North American airline security with Israeli airline security... one is rote security theater, the other actually works.*

        * North American airline security also works, because the purpose is to make passengers feel safe rather than being effective security.

        PS. I can think of at least two airliners lost with all aboard because of anti-hijacking security measures. Getting security right is not easy or obvious.

        1. Prst. V.Jeltz Silver badge
          Trollface

          Re: checks and balances on incorrect id say MET

          I'm hearing a lot of bitching but not many solutions being put forth here :p

        2. argospeter

          Re: checks and balances on incorrect id say MET

          Naughty Criminals. If they don’t register their toys! Should be taken away till they do! Lol lol lol .

          No wonder they call the law an arse!

    2. Anonymous Coward
      Anonymous Coward

      Re: checks and balances on incorrect id say MET

      I'd imagine that even if they'd had today's facial recognition technology with high-resolution imagery, the MET would still have stated Jean Charles de Menezes was wearing an overly heavy overcoat and had jumped the ticket barrier, evading his fare.

      All subsequently found to be untrue, but the corrections didn't get the coverage he deserved.

      So much disinformation that day by the MET. Like facial recognition data, it's something many of us retain indefinitely.

  6. illuminatus

    Surely though

    A false positive rate in the 90s, even if there is subsequent human intervention, would be easily challengeable if any case initiated on this basis came to court. A decent barrister and a halfway decent expert witness would crucify it, one would suppose, and it would probably be a highly public embarrassment for Inspector Knacker.

    1. Anonymous Coward
      Anonymous Coward

      Re: Surely though

      It's a prefilter. It throws out a list of possible matches for people to look at and compare. It's no different to having a person screening them for rough matches and passing them on to someone better at making positive identifications. Having a lot of false positives early in the process makes extra work but is largely irrelevant to the result.

      To reduce the number of false positives you would inherently increase the number of false negatives. That would be very relevant to the result.
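      That threshold trade-off can be sketched in a few lines of Python; the match scores, labels, and thresholds here are invented purely to illustrate the point, not taken from any real system:

```python
# Toy illustration of the trade-off: lowering the match threshold cuts
# false negatives at the cost of more false positives, and vice versa.
# Scores and labels are invented, not real data.

def confusion(scores, labels, threshold):
    """Count (false positives, false negatives) at a given match threshold."""
    fp = sum(1 for s, y in zip(scores, labels) if s >= threshold and not y)
    fn = sum(1 for s, y in zip(scores, labels) if s < threshold and y)
    return fp, fn

# Fake similarity scores (higher = stronger match); True marks the one
# genuine person of interest.
scores = [0.91, 0.85, 0.80, 0.78, 0.60, 0.55, 0.40]
labels = [False, True, False, False, False, False, False]

print(confusion(scores, labels, 0.50))  # (5, 0): many false positives, no misses
print(confusion(scores, labels, 0.88))  # (1, 1): fewer false positives, real match missed
```

      Tightening the threshold from 0.50 to 0.88 trades five false positives for one false negative, which is exactly why a pre-filter tuned not to miss anyone throws up so many wrong matches.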

      1. Anonymous Coward
        Anonymous Coward

        Re: Surely though

        "To reduce the number of false positives you would inherently increase the number of false negatives."

        That's BS if the actual recognition works. And it's obvious it doesn't. Fix that first, and after that start to use it to arrest totally innocent people.

        Because it's obvious this piece of crap is used to arrest anyone by claiming 'software flagged him/her'. How convenient, isn't it?

        90% false positives is a sure sign of it; no-one in the Police cares about false negatives. That's not the reason this system is in actual use.

        1. tip pc Silver badge

          Re: Surely though

          i wonder what the rate is for a divining rod?

          1. Teiwaz

            Re: Surely though

            i wonder what the rate is for a divining rod?

            At those odds, probably the same or better (much better - I once found a lost ring in a large field with a set).

      2. Oddlegs

        Re: Surely though

        People really seem to be missing what these numbers mean. Imagine the police were searching for me. This system would identify 50 people, one of which would be me. A real person looking through those 50 photos will probably be able to very quickly discount the majority of them leaving only a small number requiring investigation. The alternative would be a massively larger and more expensive investigation.

        Privacy implications aside a system with a 98% 'false positive' rate is still hugely useful to the security services.

        1. Anonymous Coward
          Anonymous Coward

          Re: Surely though

          People really seem to be missing what these numbers mean. Imagine the police were searching for me. This system would identify 50 people, one of which would be me.

          You're making a big assumption there. Odds are the false negative rate of the system is no better than the false positive rate. Maybe you were wearing a hat, or glasses, or the camera didn't see you at the right angle, or you just weren't there that day. In which case of the 50 people identified, none of them would be you.

          1. Anonymous Coward
            Anonymous Coward

            Re: Surely though

            Basic identification uses facial geometry, such as eye-to-eye distance compared to eye-to-nose and nose-to-mouth distances, to rule out most matches. It's much better at confirming it's not someone.
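            For illustration only, here is a toy Python sketch of that kind of geometric screening; the landmark coordinates, choice of ratios, and tolerance are all invented for the example and bear no relation to any real system:

```python
# Toy illustration of geometric screening: compare a few landmark-distance
# ratios and rule a candidate out if they differ too much. Ratios are used
# so the check is insensitive to image scale. All values are invented.
import math

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def face_ratios(landmarks):
    """Ratios of eye-to-nose and nose-to-mouth distances to eye-to-eye distance."""
    left_eye, right_eye, nose, mouth = landmarks
    eye_eye = dist(left_eye, right_eye)
    return (dist(right_eye, nose) / eye_eye, dist(nose, mouth) / eye_eye)

def could_match(a, b, tolerance=0.05):
    """True if every ratio is within tolerance; False rules the pair out."""
    return all(abs(x - y) <= tolerance
               for x, y in zip(face_ratios(a), face_ratios(b)))

wanted    = [(0, 0), (10, 0), (5, 6), (5, 10)]
candidate = [(0, 0), (12, 0), (6, 7.2), (6, 12)]  # same face, larger in frame
stranger  = [(0, 0), (10, 0), (5, 9), (5, 16)]    # different proportions

print(could_match(wanted, candidate))  # True  - cannot be ruled out
print(could_match(wanted, stranger))   # False - ruled out
```

            As the comment says, a check like this is far better at excluding people than at positively identifying anyone: passing it only means the candidate survives to the next stage.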

        2. ibmalone

          Re: Surely though

          People really seem to be missing what these numbers mean. Imagine the police were searching for me. This system would identify 50 people, one of which would be me

          That's not what a 98% false positive rate means. From the numbers given in the story it could be what they actually mean. But a 98% false positive rate would mean, of all people who are not you, 98% will be flagged as you. That's not hugely useful.

        3. Dodgy Geezer Silver badge

          Re: Surely though

          We don't know anything about the false negatives. Unless this is zero there is a chance you wouldn't be in the set of stopped people at all. And 50 people would have been hauled off and detained for a short time for nothing...

        4. Doctor Syntax Silver badge

          Re: Surely though

          "This system ~~would~~ might identify 50 people, one of which would be me."

          It depends on the rate of false negatives but given your figures, what happens to the other 49? How many of them get picked up, held for a few hours, searched, miss trains, get locked up because they refuse to give anyone the password to their phone?

        5. Teiwaz

          Re: Surely though

          A real person looking through those 50 photos will probably be able to very quickly discount the majority of them leaving only a small number requiring investigation.

          Unfortunately that's not how most I.T. is applied in the real world - As soon as the kit is installed and in regular use, it's very likely to be a case of 'Computer has fingered him/her/them, round the lot of them up and we'll sort it out back at the station...eventually, maybe.'

        6. Ken Hagan Gold badge

          Re: Surely though

          "Imagine the police were searching for me."

          True, but that doesn't seem to be how they are using it. They appear to be pointing it at large crowds and asking who's there. The 98% failure rate implies that they are being told roughly 50 times as many dodgy geezers are present as is actually the case.

          Not obvious why anyone is still throwing money at this pile of shit. Does our new Home Secretary have an unlimited budget?

          1. Adam 52 Silver badge

            Re: Surely though

            The statistic that you're all getting so het up about is from this line in the report:

            "Metropolitan Police’s facial recognition matches are 98% inaccurate, misidentifying 95 people at last year’s Notting Hill Carnival as criminals"

            Which is clearly bollocks. It identified 95 people from a crowd of hundreds of thousands. That's not a 98% false positive rate.

            Lies, damned lies, and lies from political pressure groups quoted by "journalists".

            1. Richard 12 Silver badge

              Re: Surely though

              94 of whom were not a person of interest, yet were stopped, searched and otherwise inconvenienced because the computer said so.

              When we already know that the Met have great difficulty in avoiding murdering innocent civilians, promoting those responsible all the way instead of firing them for gross misconduct, can one even trust that all of those 94 will survive the night?

            2. georgezilla Silver badge

              Re: Surely though

              " ... Which is clearly bollocks. It identified 95 people from a crowd of hundreds of thousands. That's not a 98% false positive rate. ... "

              You are right. But ...

              ... 98% inaccurate rate means that 93.1 ( so 94 ) people were misidentified. And the one that was correctly identified, was not a criminal. Making the use of it worthless. Period.

            3. Allan George Dyer

              Re: Surely though

              @Adam 52 - It's not a 98% false positive rate, but it isn't a useful result either.

              The problem is, we have no idea how many people in those hundreds of thousands were in the "wanted" database. It's sort of infeasible to find out; I suppose you could ask everyone there, "Are you wanted by the Police?", but there's the possibility someone might lie.

              However, big crowds attract pickpockets, and at least some of the pickpockets would be in the "wanted" database, so it's safe to say there were some people in the crowd the Police would like to find. The facial recognition found NONE of those people, but it did cause 94 interviews with entirely innocent people, and one with a person no longer of interest. This was a waste of Police time, those officers could have been looking around for people sneaking wallets out of pockets and bags instead.

              Cancel the facial recognition system, and charge its developers with Wasting Police Time.

          2. John Brown (no body) Silver badge

            Re: Surely though

            "They appear to be pointing it at large crowds and asking, who's there?"

            Not quite. They pre-load the system with a "watch list" of persons of interest who they suspect might be there; they are not just scanning the crowd in the hope of matching with all known baddies.

          3. Voidstorm
            Holmes

            Re: Surely though

            "Does our new Home Secretary have an unlimited budget?"

            As far as it suits their political agenda, sure they do. ;)

        7. bigbill

          Re: Surely though

          You are absolutely right. Out of the 2 million people that visited Notting Hill, it thought that 95 of them matched a face on a watchlist. When each match occurs it creates a system event showing the original and captured image side by side, which a human being reviews and confirms before any action is taken.

          The police will have set the system to a fairly low matching confidence, as they'd rather have false positives than miss people. The alternative is to have human beings monitoring CCTV feeds and manually picking faces out of crowds. They used technology to filter those 2 million down to 95 people before manually reviewing. I don't see how that was a waste of time.

        8. strum

          Re: Surely though

          >This system would identify 50 people, one of which would be me.

          Not a given. It's just as likely that they'd have 50 Oddlegs candidates that take them a day or more to discount.

          Meanwhile you, (you evil bastard), embark on an uninterrupted crime spree, with the plod convinced you're under control.

      3. ibmalone

        Re: Surely though

        98% false positive is not a pre-filter, it's nearly everything. I'm not sure they really mean false positive rate. It could be false detection rate, which sounds like minor pedantry, but is actually a major difference when you're talking about a ratio of thousands of true negatives to true positives.

        (False positive rate: proportion of true negatives that get classed as positives. False detection rate: proportion of detected positives that are true negatives. With 100% sensitivity and 98% FPR in a crowd of 1000 'normal' and 1 target you will flag 981 people, with 98% FDR you'll flag approx 51.)
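        As a sanity check, a quick Python sketch of that arithmetic; the crowd of 1000 "normal" people plus one target and the perfect sensitivity are the hypothetical figures from this comment, not real Met data:

```python
# Sanity-checking the two readings of "98%": a crowd of 1000 innocent
# people plus 1 target, assuming the system never misses the target.

negatives, targets = 1000, 1

# Reading 1 - 98% false positive *rate*: 98% of the 1000 innocents are flagged.
flagged_fpr = targets + round(0.98 * negatives)
print(flagged_fpr)  # 981

# Reading 2 - 98% false *detection* rate: 98% of flagged people are innocent,
# so the single true hit is 2% of all flags.
flagged_fdr = round(targets / (1 - 0.98))
print(flagged_fdr)  # 50 - i.e. roughly the "approx 51" figure above
```

        Same headline percentage, nearly a twenty-fold difference in how many people get flagged.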

        1. Anonymous Coward
          Anonymous Coward

          Re: Surely though

          "98% false positive is not a pre-filter, it's nearly everything"

          No. 98% of the "matches" it gives are false positives. For every person in the crowd it tries to identify it throws up about 50 possibles for a human to look at and confirm. 49 of those won't be the person they are trying to identify and 1 of them potentially is.

          1. Why Not?

            Re: Surely though

            Indeed, I'm worried that so many posters & journalists are having such difficulty with the maths.

            Out of tens of thousands of faces it picked out 95 that looked similar to photos of people of interest. Of those, only two were actual matches, which is the bit that needs training and further work; if we can get that up to 20% then that is as good as current intelligence, with no actual police work.

            It's the same as one policeman looking at a crowd, turning to another and saying "does that look a little like Ronnie Biggs?", and his colleague saying "nope, it's his cousin" or "Ronnie Biggs has less hair". It is just brushing aside the irrelevant to reduce the number of possibilities, not "Minority Report". It's a tool, not a complete solution.

            Now if the Police take the suggestions as certain matches to stop & question, then they aren't doing their job properly. We know they can be more selective, because stop & search has a 17-20% success rate (http://www.bbc.co.uk/news/uk-43641009).

            However the report doesn't say they stopped & questioned all 95 potentials, just that they used that method to verify some. So the only issue here is that the Police had more potentials to look through than if the system were perfect.

            I would have thought the lefties would be glad if the Police had more targeted stop & searches using nothing more than a publicly visible face. Imagine if we could get stop & search up to over 40% success rate!

            As to shooting Brazilian electricians: one hopes that if the system had said loudly "no match" when he was seen on CCTV, it could have saved his life. We should train people to realise that computers, like trained anti-terrorist officers, aren't infallible and that it's reasonable to question decisions; we should also build that into the application.

        2. TheMeerkat

          Re: Surely though

          You don’t understand Math, do you?

          98% of false positives is when from 10,000 you select 100, 2 of whom are criminals.

          1. ibmalone

            Re: Surely though

            You don’t understand Math, do you?

            98% of false positives is when from 10,000 you select 100, 2 of whom are criminals.

            This is a bit of a wriggle, since you've left out a crucial word. The term used throughout the article is false positive rate, which doesn't mean that at all. To work out the false positive rate you'd need to know the proportion of "non-criminals" in that 10,000 (or, more practically, the number of people not in the target database, otherwise we get into questions of just what makes a criminal).

            The "Face Off" campaign website shows something different to the article: a nice pie chart with 2 true positive matches and 101 false positive matches for the Metropolitan Police. That's a false detection rate of 98%, not a false positive rate. If we assume they got all the criminals and looked at 10,000 people, then their false positive rate was 1%, specificity 99%. Which sounds great, doesn't it? It would be really good. And yet hypothetical sensitivity 100%, specificity 99% still gives a 98% false detection rate, because you're looking for rare events. This is really important to be aware of when applying detection methods. If you're planning on stopping someone and questioning them on the basis that they might look a bit like someone you're after, not because they were doing something suspicious or any other reason, then you are going to end up doing it to a lot of innocent people.

            Looking at their breakdown for the events it was used at is interesting: Remembrance Day 1 TP, 6 FP (much better than Notting Hill); Notting Hill '17 1 TP, 95 FP (so closer to 99% FDR); Notting Hill '16 0 TP, "?" FP.
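            The rare-events arithmetic above can be sketched in a few lines of Python; the 10,000-person crowd, two targets, and 100%/99% sensitivity/specificity are the hypothetical figures from this post, not real Met numbers:

```python
# Why rare events give a huge false detection rate even with excellent
# specificity. All inputs are the post's hypothetical figures.

def false_detection_rate(crowd, targets, sensitivity, specificity):
    """Fraction of flagged people who are not actually targets."""
    tp = targets * sensitivity                  # true positives found
    fp = (crowd - targets) * (1 - specificity)  # innocents wrongly flagged
    return fp / (tp + fp)

fdr = false_detection_rate(crowd=10_000, targets=2,
                           sensitivity=1.0, specificity=0.99)
print(f"{fdr:.0%}")  # ~98% of flags are innocent people
```

            With only 2 targets in 10,000 people, even a 1% slice of the innocent majority (about 100 people) swamps the 2 genuine hits, which is the whole point about rare events.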

          2. Jtom

            Re: Surely though

            No, in this case, 100 were selected, only two matched their database, and they were NOT criminals, which makes the whole process completely worthless.

        3. Prst. V.Jeltz Silver badge

          Re: Surely though

          "98% false positive is not a pre-filter, it's nearly everything"

          No its not 35 wrong , 1 right at the Notty carnival means the cops check 35 people instead of 50,000

          1. ibmalone

            Re: Surely though

            "98% false positive is not a pre-filter, it's nearly everything"

            No its not 35 wrong , 1 right at the Notty carnival means the cops check 35 people instead of 50,000

            No. Just no. The story may be wrong, but, and I'm getting a bit tired explaining this, the FALSE POSITIVE RATE is the PROPORTION of TRUE NEGATIVES that are DETECTED AS POSITIVE. 35 people out of 50000 is NOT A FALSE POSITIVE RATE OF 98% unless you've got the algorithm back to front and only 36 of the people in the crowd were people you weren't looking for (which would mean 49964 were people you were looking for, of whom you found 1, giving an astonishing sensitivity of approx 0.002%).

            Now the story may well be wrong, but can people sort out their understanding of the terminology please? It's not difficult and as I've tried repeatedly to demonstrate here it can make an absolutely massive difference to use the wrong term.

            Edit: reading again, I'm not sure why I bothered replying, since it's clear you didn't read the first post past the first sentence either.

      4. Doctor Syntax Silver badge

        Re: Surely though

        "It's a prefilter."

        That was my reaction. If it was doing a good job at that it would be worthwhile. But even for a prefilter that rate of false positives is very high and raises the question of how many false negatives there are. Is there adequate reason to suppose it's doing a better job than picking faces out at random?

      5. JLV

        Re: Surely though

        I believe it was Bruce Schneier who warned (sure he's not the only one) that a system with too many false alarms lowers security.

        Let's pretend you're a guerrilla wanting to attack a fortified camp, which is ringed with smart fence kit. Now, have a herder drive goats near it for a month, setting off the alarm every night. Soon enough, the guards are going to turn off the fence, or ignore its warnings.

        The camp is now less safe than if it relied on Mk 1 Eyeballs. Esp if the army decides it can do with less men cuz 50M$ fences. And even more so if the generals insist those fences work against real world evidence.

        1. Why Not?

          Re: Surely though

          This is not a failure of the technology but a failure of the people.

          The correct response to a goat herder coming too close to the fence is to explain to him that if it happens again the garrison will be eating goat curry from now on and he won't have any goats to herd.

          Then the fence stays on and those in the watchtower might get some sniper practice.

          Technology is limited, but if properly designed & used it's very effective. It's normally far better at repetitive jobs than any human.

          The proof is that the Mk 1 eyeball has been supplanted by Mk XX mines & sensors in most scenarios where it matters. By the way, it's not really a Mk 1 eyeball, as it is one of the more evolved parts of our bodies.

          Bruce is probably pointing out that people are broken but will blame technology so you have to have a plan to deal with events.

    2. Antron Argaiv Silver badge

      Re: Surely though

      .... a highly public embarrassment for Inspector Knacker.

      But a sizeable end of year bonus for the CEO of the company that sold the kit to the Met.

      (and, hopefully, the salesperson as well)

      // "fixed in next release"

      1. Mog_X

        Re: Surely though

        The CEO and Inspector Knacker being in the same Lodge would be a complete coincidence....

    3. Prst. V.Jeltz Silver badge

      Re: Surely though

      A false positive rate in the 90s, even if there is subsequent human intervention, would be easily challengeable if any case initiated on this basis came to court.

      How could it be? The cops would show the CCTV pic against the file pic and say "looky - same person", and if you can't see that, let's look at the ID recovered off the suspect at the time of arrest - it matches the name on the file pic.

      And it would - because the police would have done these blindingly obvious checks at the time, not merely taken him to the cells for a good kicking on the word of the newfangled wrong-most-of-the-time system.

    4. cloth

      Re: Surely though

      Hmm, so the conversation goes like this...

      "Hello Sir, you have been identified by our systems as a person of interest - are you Dr Death who is an international terrorist?". "No Mr/Ms PC sir, I am not but, as I'm not living in a police state I have no official proof other than these fake credit cards on me". "Oh, in that case, thanks for your time. Did you want a leaflet on why we pulled you up sir and made you feel like a piece in a machine?...."

      Nice !

  7. Christoph

    "Officers can quickly establish if the person has been correctly or incorrectly matched by traditional policing methods, either by looking at the person or through a brief conversation,"

    It's perfectly OK because nobody was wrongly arrested. Just large numbers of people going about their business who were harassed with demands that they prove their innocence, in a public place in front of lots of other people.

    1. Pen-y-gors

      and also, a massive waste of police time confirming the thousands of false positives, time possibly better spent actually investigating and preventing crime in a more traditional manner? (Or hanging around outside a London embassy 24/7 of course - it's what police do best)

      1. Oddlegs

        Why is it a massive waste of police time? What's the alternative? Hundreds of real police officers scanning crowds looking for known trouble makers? That would cost an awful lot of money and I'm not even sure they'd be more effective than a 2% positive match either.

        1. Anonymous Coward
          Anonymous Coward

          Well, if we're going to stereotype. Oddlegs, you sound like you have a white gammon face, with little to nothing to worry about in terms of false positives, as the public events you'll be attending won't have facial recognition. Chelsea Flower Show? O/C Boat Race by chance?

          What about the poor bastards with the face that doesn't fit, but are wholly innocent?

          2016 - Where do we trial Facial Recognition?

          Notting Hill

          2017 - Where do we trial Facial Recognition?

          Notting Hill

          2018 - Where do we trial Facial Recognition?

          Notting Hill.

          You get the picture.

          Anyone would think the MET/Home Office had an agenda against a certain section of our society.

          1. Oddlegs

            I'm still waiting to hear the alternative. This system's not perfect. In fact it's pretty poor but still better than nothing. The decisions on where to trial such technology (and any racial implications thereof) lie wholly at the feet of real people. This AI is probably far more race agnostic than the people administering it.

            1. Anonymous Coward
              Anonymous Coward

              How can the AI be 'race agnostic' when the chosen places it's trialled are deliberately highly skewed against particular groups in society?

              It can hardly be race agnostic if the sampling area is of a particular race/culture (due to historical links to their culture) i.e. Notting Hill.

              It's 'in your face, deliberate use of skewed unreliable technology based on stereotypes' - aimed at creating an uncomfortable "hostile environment" to all those in attendance.

              I don't even think it's about the effectiveness of the technology per se, it seems somewhat political, having that whiff of indirect racial slurs the use of such technology creates. (Theresa May {the gammon generation} way of saying their views without directly saying them)

              It's headlined/transcribed to the media and designed to be unpleasant in its approach.

              It's not one bit subtle as regards racial slurs on sections of society.

              It creates tension and undermines the community links that have been created over many years.

              The approach and use of this facial recognition technology by the MET is utterly wrong.

              1. Arthur the cat Silver badge
                Devil

                It can hardly be race agnostic if the sampling area is of a particular race/culture (due to historical links to their culture) i.e. Notting Hill.

                Let's try it at this year's Wimbledon, with the results superimposed on the live TV coverage.

            2. Teiwaz

              The alternative is real human Police presence, but that's expensive and moderately risky (more in the insurance risk area, than lives, because it's money that's at stake). /jaded cynic

              Facial recognition tech is just one of those wet dreams they couldn't and can't pass up, no matter how many stat reports say it doesn't work.

              The 'a.i.' may be agnostic, but it's also apparently just as unintelligent as the people trying to apply it.

              Nothing particularly agnostic about where it's been applied, nor the blind faith by which it's been repeatedly applied.

            3. Chris G

              @ Oddlegs

              "This system's not perfect. In fact it's pretty poor but still better than nothing."

              Nothing is better than this. The system flags 98% wrong; that's 98% of the time that officers acting as the human intervention are wasting their time. How can that be beneficial? Using an ill-conceived, poorly developed system can hardly be called good policing.

            4. JohnFen

              "but still better than nothing"

              I disagree. I think it's the exact opposite of that.

            5. Mark 85

              This system's not perfect. In fact it's pretty poor but still better than nothing.

              Actually "nothing" would be better. As it is, there's a price for this system being used and not an acceptable one. Would you still be defending it if one of your loved ones was gunned down because the system said they were "bad"? There's an old quote that needs to be remembered (and why it should be) about letting 10 guilty men go free rather than convicting one innocent. The system is imperfect, as are humans, and therefore we should tread lightly.

              1. Adam 52 Silver badge

                "There's an old quote that needs to be remembered (and why it should be) about letting 10 guilty men go free rather than convicting one innocent."

                Except it's not that happening here. Suppose you know that, for example, the English Defense League is planning to throw a bottle of bleach over the crowd at Notting Hill. You upload your 500 known or suspected EDL activists and scan the crowd. The computer flags 100 people. You check them. You have potentially saved, roughly, 20 people from life changing injuries for the cost of some minor inconvenience and a couple of PCs. In other words doing nothing is not a zero harm option.

                To recalibrate your perspective there were 5 Police shootings in 2017, 4 of them terrorists in the process of carving people up with machetes and one a man threatening people with a hand gun.

                And 800 acid attacks.

                1. Anonymous Coward
                  Anonymous Coward

                  "Suppose you know that, for example, the English Defense League is planning to throw a bottle of bleach over the crowd at Notting Hill... To recalibrate your perspective there were 5 Police shootings in 2017, 4 of them terrorists in the process of carving people up with machetes and one a man threatening people with a hand gun.

                  And 800 acid attacks."

                  None of them, as far as I'm aware, carried out by members of the EDL, but many - I think, most - of them carried out by members of a different sector of society.

                  If it walks like a duck and quacks like a duck...

                  ...it's either a duck, or something *pretending* to be a duck, in which case it shouldn't be surprised if it's mistaken for a duck.

            6. Anonymous Coward
              Anonymous Coward

              No it's not

              >> This system's not perfect. In fact it's pretty poor but still better than nothing.

              No it's not, you are misunderstanding what has been said, however you do need the full set of figures to really understand what has been done.

              If it has 'matched' 100 people and it has a false positive rate of 98%, then 98 of those matches are wrong. No ifs, no buts. That is rubbish; it's not better than nothing, as you've wasted time on 98 people who you think are a match because the computer goes ping. And it's not just the 98 people, it's all the police time that's been wasted as well.

              We need to know the False Negative Rate as well, as that is NOT simply the reverse of the FPR. It could well be that the system failed to identify 98% of the people it should have done. You would be better off sticking a pin in a photograph and arresting that person.

              The last time I looked at this technology was about 10 years ago, and we were getting a fraction under an 80% recognition rate on reasonable-quality passport-sized photos of adults between 18 and 65. Pointless doing it on kids unless the photo is very recent, and even then, not good. Often pointless on older people too, as their faces sag with age, making face detection tricky.

              I'm a little surprised that the Met is getting such rubbish results, as the technology has moved on a lot. I suspect that they paid peanuts and got monkey technology. However, that's just a guess.
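The FPR/FNR distinction the poster draws can be sketched in a few lines of Python. The counts below are invented purely for illustration, not the Met's actual figures:

```python
# Illustrative confusion-matrix rates -- all counts here are made up.
def rates(tp, fp, tn, fn):
    fpr = fp / (fp + tn)   # false positive rate: share of innocents flagged
    fnr = fn / (fn + tp)   # false negative rate: share of wanted people missed
    fdr = fp / (fp + tp)   # false detection rate: share of flags that are wrong
    return fpr, fnr, fdr

# e.g. 1 true match, 35 wrong flags, 10,000 innocents passed, 4 wanted missed:
fpr, fnr, fdr = rates(tp=1, fp=35, tn=10_000, fn=4)
print(f"FPR={fpr:.4f}  FNR={fnr:.2f}  FDR={fdr:.3f}")
```

Note that the FPR comes out tiny even when nearly every flag is wrong and most wanted people are missed, which is exactly why the full set of figures matters.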

          2. Oddlegs

            There are serious privacy implications of this technology which don't sit well with me but that's not what this article's about. This article seems more interested in making the point: 'look at the stupid police using a system which is wrong 98% of the time'

          3. Wandering Reader

            "Anyone would think the MET/Home Office had an agenda against a certain section of our society."

            Well, quite. Instead of all that police effort at the carnival, send the police round to our street, where some dog walkers don't scoop.

            As for facial recognition - the Guardian periodically castigates Google because AI facial recognition isn't good at identifying people with dark skin. Whether it is a programming issue or an AI training issue, it's probably better to try the software out in Notting Hill rather than Inverness, isn't it?

            1. Anonymous Coward
              Anonymous Coward

              "As for facial recognition - the Guardian periodically castigates Google because AI facial recognition isn't good at identifying people with dark skin."

              Neither is my camera. Is yours? Identifying a black face is hard because of the lack of consistent tonal contrast within the face, compared with pale complexions. My camera won't focus on black faces, probably because black faces with their low inherent contrast essentially lack the 3-point triangle simple face recognition systems use to identify faces in the first place.

              And Japanese camera manufacturers, who sell the bulk of their cameras to Japanese, obviously optimise their face recognition systems for Japanese faces, which are easy (perhaps the easiest) to recognise from three points.

              Just saying.

          4. bigbill

            Have you actually read the report? The Met Police also used it on Remembrance Day.

          5. Anonymous Coward
            Anonymous Coward

            "Anyone would think the MET/Home Office had an agenda against a certain section of our society."

            The most criminal by percentage? I would certainly hope so.

          6. Anonymous Coward
            Anonymous Coward

            "Well, if we're going to stereotype. Oddlegs, you sound like you have a white gammon face, with little to nothing to worry about in terms of false positives, as the public events you'll be attending won't have facial recognition. Chelsea Flower Show? O/C Boat Race by chance?"

            Gosh, yes. What an agreeable life we 'gammon' types lead. You should try it; it costs no more to stand on the bank of the Thames watching the Boat Race than it does to attend the NHC. Pimms is not more expensive than Skunk.

            Or would that be Cultural Appropriation?

        2. Anonymous Coward
          Anonymous Coward

          Well even 5 minutes time spent on this would be wasted at 98% false positive and two identifications of people who weren't actually wanted. And a real live police officer might perhaps deter criminality before it happens?

        3. Anonymous Coward
          Anonymous Coward

          "Why is it a massive waste of police time? What's the alternative? Hundreds of real police officers scanning crowds looking for known trouble makers? That would cost an awful lot of money and I'm not even sure they'd be more effective than a 2% positive match either."

          The sensible alternative is normal numbers of police who arrest anyone who commits a crime, whether they are 'known troublemakers' or not.

          For that matter, there is no justification for harassing 'known troublemakers' unless they actually make trouble, unless your goal is to convince them that they might as well make trouble if they are going to be proactively blamed for it anyway.

  8. }{amis}{
    Trollface

    The Maths

    Given that these figures are probably low-balling the reality of the situation, the Met seem to have found a new branch of maths where it is possible to have a greater than 100% screw-up rate.

  9. Sheepykins

    I can see how this would lead to a lot of false positives.

    We may be a nation besieged by CCTV, but it's terrible-quality CCTV, with convictions that hardly ever stick without facial recognition.

    Honestly, as long as it's not in my home, I'm all up for high-def cameras everywhere.

    My neighbours car was just stolen from beside mine last week, in our private car park, in broad daylight.

    They bust in and recoded his keyless entry in about 3 minutes.

    1. Pen-y-gors

      1) stick to cheap old cars

      2) move

      1. Antron Argaiv Silver badge

        3) bodge in a fuel shutoff valve

        1. Danny 14

          reminds me of my old Maestro MG. Not so much a fuel shut-off valve as a fuel pump shut-off relay. It was wired to the front fog light (which was in turn deactivated, as I never used them).

    2. Adair Silver badge

      4) Leave the planet

      5) Failing that move into a local prison.

      1. Rich 11

        5) Failing that move into a local prison.

        You just know that somewhere out there is a property developer who is salivating at the thought of the government finally getting its act together and replacing the last of our old Victorian prisons with new ones fit for purpose.

        The developer no doubt thinks that HMP Pentonville et al can be converted to make high-density affordable housing with which to offset the planning conditions for his adjacent glass-and-steel high-rise. And it's not like any of the owners of the luxury flats will care about looking down onto a set of old prison blocks, because the Russian oligarchs will never even visit their expensive new properties.

  10. Anonymous Coward
    Anonymous Coward

    nevertheless

    We continue to pursue the policy of staying on the cutting edge of technology in order to protect the general public, their children, their children's children and their childrens' children's children, from the terrorists, porn-viewers, dissident perverts, perverted dissidents and other related and unrelated menace threatening the harmonious fabric of our thoroughly homogeneous and versatile society. In short: we set the rules, you follow them and if you don't like it, you can fuck off to Russia, that'll teach you to appreciate what you once had.

    1. Christoph

      Re: nevertheless

      And the Deviated Preverts!

    2. Intractable Potsherd

      Re: nevertheless

      @AC: "... we set the rules ..." Who is this "we"? I'm a British citizen, and no one asked me about giving the police so many powers to try out potentially invasive technology without proper oversight. If they did ask, I'd tell them that we left Peel's idea of policing by consent three decades ago, and it is time we went back to it. I'd also tell them that we need to go back to the levels of police officers we had three decades ago, and that ACPO needs to be declared an illegal organisation. I'd also say "... if you don't like it, you can fuck off to Russia".

  11. Christian Berger

    Face recognition only works

    when your subject is cooperating. Even then it has failure rates far above anything useful.

  12. ibmalone

    Surely they can't really mean 98% false positives? That would mean given 100 people not on the list they're trying to identify, on average 98 of them would get flagged. In that case you may as well have a slightly dodgy light bulb that comes on when you point the camera at someone, you could still achieve 98% true positives.

    1. MonkeyCee

      Numbers

      "Surely they can't really mean 98% false positives?"

      I believe they do, since for the one example given:

      - 1 person correctly identified* - true positive

      - 35 people incorrectly identified and stopped - false positive

      Multiply that up to 100 in total and you get 97.2:2.8, which is pretty close to the 98:2 rate.

      And yes, it is almost completely useless.

      * although the data was out of date, which isn't really the fault of the software
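The arithmetic above is just the ratio of wrong stops to total stops, using the two counts from the report:

```python
# False detection rate from the counts in the comment above:
true_pos, false_pos = 1, 35
fdr = false_pos / (true_pos + false_pos)        # 35 of 36 stops were wrong
print(f"{100 * fdr:.1f}% of matches were wrong")  # prints "97.2% of matches were wrong"
```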

      1. ibmalone

        Re: Numbers

        That would be false detection rate, and a 98% FDR is usually somewhat better than 98% false positive rate. To avoid writing it all out again, wikipedia lists the possible permutations: https://en.wikipedia.org/wiki/Sensitivity_and_specificity

        98% false positive rate is awful; 98% false detection rate may be acceptable, depending on the application. False positive rate (or alternatively, specificity) and sensitivity are the easy parameters to measure if you've got a ground truth. All the others (false detection rate, accuracy, etc.) depend on the proportion of true positives and negatives in the sample you're applying them to, and when looking for needles in haystacks they can produce very different numbers. I'm not sure what they were actually doing: applying this to crowds from CCTV and pulling in possible matches (in which case the actual false positive rate would seem quite low if there were only 35 matches), or taking snaps of people they stopped anyway and running them through the search (in which case you need to know how many people were checked).

        And all the above is based on yes/no classification. If you've got multiple possible classifications then as others have mentioned your false positives for one match might also be false negatives for others.
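The needle-in-a-haystack point can be made concrete with a quick sketch. The crowd size, target count, and rates below are illustrative assumptions, not figures from the report:

```python
# Why a low false positive *rate* can still mean mostly-wrong flags
# when targets are rare. All figures are illustrative assumptions.
def fdr_from_fpr(fpr, sensitivity, n_wanted, n_innocent):
    tp = sensitivity * n_wanted   # wanted people correctly flagged
    fp = fpr * n_innocent         # innocents wrongly flagged
    return fp / (fp + tp)         # false detection rate

# A seemingly tiny 0.07% FPR over a 100,000-strong crowd with 50 targets
# still means roughly two-thirds of the flags are wrong:
print(fdr_from_fpr(fpr=0.0007, sensitivity=0.8, n_wanted=50, n_innocent=99_950))
```

The same classifier applied to a sample where targets are common would report a far lower false detection rate, which is exactly why the two metrics diverge.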

        1. doublelayer Silver badge

          Re: Numbers

          Even if we go with a 98% FDR, the time wasted trying to keep up with the useless flags will be rather irritating. If, for every person we want to find, about 49 people are incorrectly flagged, and it takes five minutes to track that person, accost them, and question them enough to realize they aren't someone you're interested in, then over four person-hours are needed for each person you want to track just to deal with the mistakes from the system.
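That back-of-envelope figure checks out, taking the comment's own assumptions of 49 false flags per target and five minutes per stop:

```python
# Time cost of false flags, using the assumptions in the comment above.
false_flags_per_target = 49   # wrong flags per wanted person at a 98% FDR
minutes_per_stop = 5          # assumed time to stop and clear each person
hours_wasted = false_flags_per_target * minutes_per_stop / 60
print(f"{hours_wasted:.2f} person-hours per wanted person")  # prints "4.08 person-hours per wanted person"
```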

          Now, we have to ask if five minutes is really going to give them enough time to track and exonerate each person. I'm assuming the tracking process can take a while, especially if they are running this in crowds, and that questioning them isn't as basic as saying "Aren't you that guy we're supposed to arrest?" I have no statistics on this, but I'm assuming that they have to verify documentation of identity and question the person as to their links to the person they want to find. I would also hope that paperwork is required to explain what the officer was doing at that time, which would take further time. I really think it will be longer than five minutes.

          By the way, to those who say that many of the false detections can be removed just by having a police officer look at the pictures, that is almost certainly not the case. If the system matches the faces, the picture is either similar enough or of poor-enough quality as to make detection difficult. If we could just use humans to compare faces from indirect and low-resolution video with better accuracy than this system, we probably wouldn't even need this system.

          1. Adam 52 Silver badge

            Re: Numbers

            "then over four person-hours are needed for each person you want to track just to deal with the mistakes from the system"

            Which is fine. You've consumed, worst case, 200 man-hours. The alternative is to search everyone, which is an impractically large problem. So you get stabbings at Notting Hill.

            At football matches you deploy human spotters, but they are rare, highly trained and able to go to choke points. Human spotters have done much to reduce football violence but aren't a scalable option.

            So now consider a Royal Wedding. You've got half a million people lining the route, all within striking range of the couple. Every terrorist, Fathers For Justice campaigner, republican or jilted lover is going to want to have their moment of fame. How do you stop that?

            1. Intractable Potsherd

              Re: Numbers

              Anyone who classes Fathers For Justice with terrorists does not deserve any attention. You are a copper, aren't you?

  13. rmason

    Minor point

    I know this is a ridiculously minor point to comment on, but comment I shall;

    "The group had little patience with this, stating in the report the government should provide funding for administrative staff to deal with this problem – one person per force employed for a full year at £35,000 would be a total of £1.5m, it said."

    £35k a year, to sift through photographs.

    Yep, criminal.

    Nope, not a criminal,

    Nope, not a criminal.

    Wherever they find the people to staff these watchdogs etc, they seemingly never select them from this planet. They'd fill those roles (not that they will ever exist) at £14-16k a year. Many, many, many times over.

    1. Teiwaz

      Re: Minor point

      £35k a year, to sift through photographs.

      Yep, criminal.

      Nope, not a criminal,

      Nope, not a criminal.

      They'll be 'hopefully' resurrecting the 'science' of Physiognomy.

      Nope, not at all setting up an online course... £800 for a month, exciting new career beckons, ....followed by hanging from a tree Pink Gabriel when the 'horse' bolts....

    2. Anonymous Coward
      Anonymous Coward

      Re: Minor point

      You're forgetting about the cost of the headcount above them, mouthpieces of utter conjecture don't come cheap, as Amber Rudd has repeatedly shown.

  14. sebt27
    FAIL

    On the arrest photos...

    "Despite a 2012 High Court ruling that said keeping images of presumed innocent people on file was unlawful, the government has said it isn't possible to automate removal."

    DELETE FROM p
    FROM Photos p
    LEFT JOIN Convictions c ON p.PersonID = c.PersonID
    WHERE c.PersonID IS NULL

    There. Done that for you. Where do I pick up my £1.5m?

  15. Haku

    China is doing it better (or worse, depending on your point of view)

    VICE recently reported on the facial recognition tech in China, and it's a little scary, especially when you discover one company chose the name Skynet for its premier piece of technology...

    Imagine using a pedestrian crossing when the green man isn't showing and suddenly you spot your face and name on a big electronic billboard to shame you into not breaking the law - that's happening right now in China and it's just the tip of the future iceberg.

    (skip to 16:44) dailymotion.com/video/x6htdf1

    1. Clarecats

      Re: China is doing it better (or worse, depending on your point of view)

      Thanks for the video.

  16. JohnFen

    It doesn't work anywhere

    I am unaware of any similar use of face recognition anywhere that does better. In terms of law enforcement/public safety, face recognition is currently worthless. All it does is to ensure that more innocent people get harassed and/or detained by the cops.

  17. steviebuk Silver badge

    But that IS a false positive

    "Officers can quickly establish if the person has been correctly or incorrectly matched by traditional policing methods, either by looking at the person or through a brief conversation," a spokesperson said.

    "If an incorrect match has been made, officers will explain to the individual what has happened and invite them to see the equipment along with providing them with a Fair Processing Notice.”

    The very fact you have to tell the system it was wrong is a false positive. You've stopped a person and had a "chat", at which point you realise the system is wrong. So that IS a false positive and should be recorded as such, because you've wasted that person's time.

    And what government body pays an admin £35k a year to do that job?! The most they'd pay is £21k so that's them lying to justify not having to pay for someone to do it.

    1. Anonymous Coward
      Anonymous Coward

      Re: But that IS a false positive

      "And what government body pays an admin £35k a year to do that job?! The most they'd pay is £21k so that's them lying to justify not having to pay for someone to do it."

      They only need one of those drinking bird toys to press the delete key.

      (Vent radioactive gas? Y E S) - Homer Simpson

  18. Cynic_999

    Obviously it will improve over time

    Every one of the innocent people incorrectly flagged will be arrested. Which means they will be photographed, fingerprinted, DNA swabbed and entered in the PNC as a suspect. Then after a few hours (if lucky) they will be released. But their records will be kept.

    Over time, more and more innocent people will be entered in the database. Eventually everyone will have their mugshot on record, so it will be completely impossible for the facial recognition system to flag a false positive. Foreign terrorists on their first visit to the UK will be the only people not flagged.

  19. Anonymous Coward
    Anonymous Coward

    Just use iPhone Xs

    They seem to work pretty well. Don’t use anything by Samsung, that’ll suck monkey balls.

  20. Anonymous Coward
    Anonymous Coward

    Of course we paid for the project!

    "It can't be that hard, NCIS have been using it for years."

    "It's proven technology."

  21. 89724105618719278590214I9405670349743096734346773478647852349863592355648544996313855148583659264921

    It will work perfectly someday: all that stored data will reveal everybody's movements.

    They'll hang onto it, prairie-dogging till they can sh&t on you for repeated-but-forgotten-about littering in a couple of decades; a sure-fire future revenue stream.

    1. Christian Berger

      Surveillance has different needs

      If you just want to track people, that's the far simpler scenario. You can essentially track blobs of movement most of the time, and should you get a clear shot of their faces, you can use that. It doesn't even matter if those systems are unreliable, as you can use other data to work around that; for example, behavioural patterns.

  22. Anonymous Coward
    Anonymous Coward

    You lot are too cynical. This technology clearly has merit. A new filter to the facial recognition software can be added that ensures :

    No banker ever gets prosecuted

    No member of the Conservative party is caught on the fiddle

    No member of the Royal family is caught speeding

    Black people continue to be wrongfully deported keeping racist tabloid readers happy.

    Thank you for your patience and understanding. Now, serfs, get back to work. Your overlords know what they're doing.

    1. 89724105618719278590214I9405670349743096734346773478647852349863592355648544996313855148583659264921
  23. bish

    Modern Policing

    I'm just waking up so I might be misreading, but:

    "If an incorrect match has been made, officers will explain to the individual what has happened and invite them to see the equipment along with providing them with a Fair Processing Notice.”

    If fifteen arrests at the football represented 0.5% of matches, that means police were giving an explanation, a Fair Processing Notice, and the (oh so tempting) offer of a guided tour around the equipment ("There's the camera that recorded you doing nothing wrong, and this is the laptop that passed your photo to a central server, where our highly advanced software matched your face with that of a completely different person. No sir, I'm afraid you can't see who that is for legal reasons, but I will say she's a looker.") to two thousand, nine hundred and eighty five pissed-up football fans.

    Can someone arrest the system (and/or its devs, and/or the numpties who commissioned it) for wasting police time?

  24. Anonymous Coward
    Anonymous Coward

    I'm losing faith in the reg readership. You're all secretly stereotypical hacker types that want anarchy and nuclear war, but when it happens will complain that Starbucks isn't open any more.

    1. Jake Maverick

      dats almost a poem.....

  25. GruntyMcPugh Silver badge

    "or through a brief conversation"

    .. except we're under no obligation to answer Police questions, identify ourselves, show ID, or any of that;

    "Stop and question: police powers

    A police officer might stop you and ask:

    what your name is

    what you’re doing in the area

    where you’re going

    You don’t have to stop or answer any questions. If you don’t and there’s no other reason to suspect you, then this alone can’t be used as a reason to search or arrest you."

    (https://www.gov.uk/police-powers-to-stop-and-search-your-rights)

    Now, a crappy facial recognition system with a 98% false positive rate I wouldn't count as a good reason to play ball with the Police here. Of course, the Police aren't going to explain that they don't have the power to randomly interrogate the populace based on false intel, but the takeaway here really should be that we are under no obligation to co-operate. It's time to assert our rights as citizens and push back, 'cos we're certainly getting pushed.

    1. Jake Maverick

      Re: "or through a brief conversation"

      you ever tried doing that in practice? there is no rule.....they'll just kick the shit out of you and violently abduct you anyway....for many it's reason enough to have your ass raped in mental prison! it's SOP; just because the media won't report on it doesn't mean it's not been happening.....! those who refuse to co-operate with these anonymous psychopaths who rape, torture, steal and murder with immunity are all mentally ill....didn't you know?

  26. tallenglish

    Ask Facebook

    They should just ask Facebook: they have the monopoly (and dataset) to match against, and we already know it works (no matter how creepy it is).

  27. Jtom

    Arguing rates - false positives, false negatives - is rather moot. Unless you believe that no criminals were scanned out of thousands, the system is useless, since it found none of them.

  28. strum

    Aren't we focusing too much on this 98% figure? While results are this poor, the fuzz aren't going to trust it; they'll check and double-check, using their own eyes.

    We really need to start to worry when the failure rate drops below 50/40/30%. That's when the fuzz start trusting it - not bothering with an eyeball check (and then blaming the computer when the guy sues).

  29. kirk_augustin@yahoo.com

    It is not just that it is so inaccurate as to be useless and abusive, but that even if it did work, it would violate the basic right of privacy. You could never ensure government only used it for legal purposes. If used to identify protesters, organization members, etc., it would be illegal and very dangerous.

  30. Anonymous Coward
    Anonymous Coward

    Maybe there were just no criminals around that day...

    Weren't they lucky ;)

  31. rtb61

    False Positive?!?

    What is missing is the response to false positives and exactly how many there were. Not empty percentages, but real numbers and records of real actions. How many people were grabbed, thrown to the ground, sexually assaulted (groping counts), illegally kidnapped (false positive), and how many were threatened at gunpoint? Exactly what were those false positives, what were the outcomes, and how severe have the penalties been for those false positives, false arrests, false assaults, false kidnappings - ie, what the fuck were the total number of hours of illegal detention and imprisonment?

  32. W.S.Gosset
  33. Jake Maverick

    I've asked to be removed, repeatedly- it just provokes more violence from them.

    I also suspect they're lying...the technology was much better than that even in the late 90s, as I was working on it then....technology can only have improved since!

    They're actually using the term "fixated individual" now....? you know when you have been FTAC'd :-( it's code for 'political dissident' or activist or campaigner......who are routinely targeted by the state and have their entire lives destroyed so they can't campaign, talk to people, be part of the 'debate' etc....that is the kinda cuntry this is now and has been for well over a decade. You don't see it until it happens to you.

    I can't even go to a train station without being punched in the throat by some yob in a mask. I certainly don't go to 'public events' anymore... :-( There's thousands of us, this cuntry at least.
