Slurp up patient data for algos that will detect cancer early, says UK PM

The UK prime minister has been wooed by the promises made by proponents of artificial intelligence, today pledging more use of algorithms and data-crunching in the health service. In a speech in Macclesfield, Theresa May confirmed plans to invest in boosting the use of AI in healthcare, as well as the use of data to help early …

  1. monty75

    Hmm, algorithms and cancer screening. I've heard something about that recently

    1. Korev Silver badge

      IIRC there was no AI in that, just some coding errors.

  2. Anonymous Coward
    Facepalm

    “legally inappropriate”

    Funny how, when the government does something wrong, it's “legally inappropriate” while the same would be "illegal" for anyone else....

  3. Chris G
    Facepalm

    AI?

    I wonder if the gov' also knows that AI also stands for Average Idiot?

    Frequently the Average Idiot is responsible for failing to do their job correctly, or even for failing to understand what it is they are supposed to be doing.

    I suppose while they are bringing AI into the health service the gov' could consider outsourcing in order to bring in efficiency while cutting costs.

    1. Ken Moorhouse Silver badge

      Re: AI?

      The most useful thing about AI is to vastly improve one's Scrabble score.

  4. Frederic Bloggs

    "Opt Out"?

    Do you suppose anyone in the NHS IT department has heard about GDPR? Specifically about how "Opt outs" are effectively banned?

    Hmmm??

    1. Anonymous Coward
      Anonymous Coward

      Re: "Opt Out"?

      "Specifically about how "Opt outs" are effectively banned?..."

      Only if you're processing exclusively under consent as your justification. The NHS, from a GDPR perspective, is very unlikely to ever use consent as its justification. Mainly because it doesn't need to, but also because as a public body there's a very good chance it is specifically prohibited from doing so (whether or not this prohibition covers the NHS is complicated).

      The NHS would still need to gain consent for anything it's doing with your identifiable data, but that's grounded in medical ethics and the Caldicott guidelines, rather than GDPR. In any case, a lot of this data would be considered non-identifiable, through anonymisation or aggregation, so neither Caldicott nor GDPR applies.

      And actually PII is one of the least tricky bits about this. I'm sure that with appropriate safeguards and honest attempts to inform the public, people would happily support any effort by the NHS to better exploit the huge volumes of data it holds, when the intent is to improve our healthcare. Consent and information are, frankly, pretty easily solvable problems for an organisation as near-universally trusted as the NHS.

      The problem is one of technology, organisation and money. The NHS has no tech, no data-centric or tech-centric organisation and no money to build out the things it is missing. And the challenge there really, really can't be overstated. A lot of the newfangled stuff in this area is based around genomics. This has come on leaps and bounds in recent years - projects like HAIL and GATK have pushed things forward hugely. But how is an organisation as technically illiterate and poorly organised (at the national scale) as the NHS suddenly supposed to deal with bulk analysis and sharing of datasets running to tens of gigabytes _per sample_? That's a challenge even the most well-funded organisations struggle with. Doing it in the NHS and turning that into effective treatments is going to take far more than just announcing a new industrial strategy.
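
      For a rough sense of the scale being described, here is a back-of-envelope sketch. The per-sample figure is taken from the "tens of gigabytes" claim above; the cohort size is a purely hypothetical round number, not an NHS figure.

      ```python
      # Illustrative arithmetic only -- assumed figures, not NHS data.
      per_sample_gb = 30          # assumed: mid "tens of gigabytes" per genomic sample
      cohort_samples = 100_000    # hypothetical national-scale cohort

      total_gb = per_sample_gb * cohort_samples
      total_pb = total_gb / 1_000_000   # 1 PB = 1,000,000 GB (decimal units)

      print(f"{total_gb:,} GB = {total_pb:.1f} PB of raw data before any analysis")
      # -> 3,000,000 GB = 3.0 PB: petabyte-class storage, transfer and compute,
      #    which is not infrastructure the NHS currently runs at national scale.
      ```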

      1. Mage Silver badge

        Re: The NHS has no tech, no data-centric or tech-centric organisation

      Don't they already outsource to Capita, who must in turn outsource, because who are the experts in Capita?

      2. Anonymous Coward
        Anonymous Coward

        Re: "Opt Out"?

        The current state of screening does leave a lot (everything?) to be desired.

        My wife passed her screening in October, again in March and again in April, but in April she was then diagnosed with an invasive Grade 2 carcinoma that had already started to spread. Anything that improves diagnosis sounds like a darned good idea.

        Having just had plastic surgery for a smaller and very different cancer I am now very much in favour of better, cost effective diagnosis and early tailored treatment.

      3. Adam 52 Silver badge

        Re: "Opt Out"?

        "Only if you're processing exclusively under consent as your justification. The NHS, from a GDPR perspective, is very unlikely to ever use consent as its justification."

        Hmm...

        "Mainly because it doesn't need to,"

        What's your reasoning here? Not legally required, not necessarily in legitimate interests, the vast majority is not vital interests, not public task (because this isn't a public task) and not necessary to fulfil contract.

        Consent is all that's left. Same as everyone else.

        "The NHS would still need to gain consent for anything it's doing with your identifiable data, but that's grounded in medical ethics and the Caldicott guidelines, rather than GDPR."

        Sadly the NHS abandoned medical ethics long ago. Doctors routinely hand over patient data to all and sundry, including Google, without consent.

        1. Anonymous Coward
          Anonymous Coward

          Re: "Opt Out"?

          "Not legally required, not necessarily in legitimate interests, the vast majority is not vital interests, not public task (because this isn't a public task) and not necessary to fulfil contract."

          Just because you say so doesn't make it so. In NHS land, this class of data processing is generally called "secondary purposes", and it is facilitated in two ways. One is explicit authorisation through an act of parliament. A good example of this is handing over your records to the Care Quality Commission on demand.

          The second way, which this is, is generally called "Section 251", after the relevant bit of the NHS Act. Which says:

          "The Secretary of State may by regulations make such provision for and in connection with requiring or regulating the processing of prescribed patient information for medical purposes as he considers necessary or expedient"

          Now there is a later clause preventing this from being used to trump the DPA (presumably implicitly amended to include GDPR), but what this means in practice is that the Health Secretary has the effective power to define what constitutes legitimate interests and public tasks. You have a right to object, but the NHS has a right to ignore your objection. In the case of these programmes, your right to object is manifested as the "national data opt-out". You might not like that (I certainly don't), but that's the actual law.

          You will struggle to find anyone to take you seriously if you're going to argue that national programmes of cancer research are not in the public interest.

          1. Adam 52 Silver badge

            Re: "Opt Out"?

            "One is explicit authorisation through an act of parliament"

            Explicit authorisation is not legally necessary. May is not must. Therefore legally necessary does not apply.

            "The Secretary of State may by regulations"

            Secondary legislation (regulations) is subservient to primary legislation (acts of Parliament). So GDPR (Data Protection Act) wins.

            "argue that national programmes of cancer research are not in the public interest"

            Many things are in the public interest. They are not all legal. Moreover just because cancer research is in the public interest does not mean that using personal data for cancer research is in the public interest. Only a fraudster or politician would attempt to conflate the two issues.

  5. Anonymous Coward
    Anonymous Coward

    A telling insight

    "Nonetheless, the government's latest announcement is expected to look to private firms - which have the technical resources and financial backing the health service lacks - to help with the data-crunching."

    What this says is that the money to do this is available, but it's possessed by people who are not prepared to make it available unless they can make a profit out of it. The problem is how they make their profit out of it.

    1. Woodnag

      It's just a BS "reason" to allow patient data to be sold off

      It doesn't require machine learning to detect cancers early. It requires repeated high resolution scans of the areas of concern, and diff analysis of the scans over time to detect change (a rough sketch of that idea is below).

      That is not going to be performed because of the expense.

      Machine learning is fine for digging out needles in haystacks, but it can't invent patient data to determine causality, let alone do pre-diagnosis.

      Realistically, the analyses will at best correlate lifestyle/genetics/whatever against statistical cancer risk. For which, of course, there's just a tiny privacy implication.
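
      A minimal sketch of the voxel-wise "diff analysis" idea mentioned above, assuming two scans that are already co-registered and intensity-normalised (real imaging pipelines involve far more than this, and nothing here implies the NHS works this way):

      ```python
      import numpy as np

      def scan_diff(baseline: np.ndarray, followup: np.ndarray, threshold: float = 0.2):
          """Toy voxel-wise diff of two co-registered, intensity-normalised scans.

          Returns a boolean mask of voxels whose intensity changed by more than
          `threshold`, plus the fraction of the volume that changed.
          """
          if baseline.shape != followup.shape:
              raise ValueError("scans must be co-registered onto the same grid")
          diff = followup.astype(float) - baseline.astype(float)
          changed = np.abs(diff) > threshold
          return changed, changed.mean()

      # Synthetic example: a small high-intensity region appears in the follow-up scan.
      rng = np.random.default_rng(0)
      baseline = rng.normal(0.5, 0.02, size=(64, 64, 64))
      followup = baseline.copy()
      followup[30:34, 30:34, 30:34] += 0.5   # simulated new lesion

      mask, frac = scan_diff(baseline, followup)
      print(f"{frac:.4%} of voxels changed")  # flags only the simulated region
      ```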

      1. Korev Silver badge

        Re: It's just a BS "reason" to allow patient data to be sold off

        >It doesn't require machine learning to detect cancers early. It requires repeated high resolution scans of the areas of concern, and diff analysis of the scans over time to detect change.

        At some dosage, the radiation will cause more cancers than the screening prevents. Moreover, it'd also pick up tumours that would never cause the patient any bother in their lifetime.

        1. Adam 52 Silver badge

          Re: It's just a BS "reason" to allow patient data to be sold off

          Those who downvoted might well note that the previous post is effectively echoing the views of Professor Michael Baum, Professor Susan Bewley and Dr Fiona Godlee.

          Cancer screening has become a political football where science is drowned out by those profiting from the status quo.

        2. JohnMurray

          Re: It's just a BS "reason" to allow patient data to be sold off

          Then use MRI for the scans...

          Current detection for prostate cancer starts with basic tests (PSA/DRE), then moves on to an MRI scan of the lower abdomen/pelvis... if anything suspicious is detected, a prostate biopsy is arranged using the MRI scan overlaid with an ultrasound image to sample the area of interest. Quite frankly, modern X-ray machines use much lower doses... and even CT scanners use less now than X-ray machines did 20 years ago.

      2. Loyal Commenter Silver badge

        Re: It's just a BS "reason" to allow patient data to be sold off

        "Machine learning is fine for digging out needles in haystacks, but it can't invent patient data to determine causality, let alone do pre-diagnosis."

        I am reminded of the (possibly apocryphal*) tale I read somewhere about AI being used on chest X-rays to spot fluid on the lungs. The software apparently had a very high success rate at picking out patients who were suffering from this issue. All to great acclaim, until some wag pointed out that the software was picking out the X-ray shadows of the chest drains the doctors had put into those patients, which they were ethically bound to do...

        The biggest problem with AI pattern recognition is that you don't know how it is working and what it is matching on. The matches may well correlate with the thing you are looking for, but correlation != causation. The field is littered with cases of bias creeping into the training data, leading to problems like inadvertent sexism and racism, amongst others (a toy illustration of this shortcut-learning failure is sketched below).

        *but probably not...
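
        A toy illustration of that failure mode, on purely synthetic data (nothing to do with any real screening system): a classifier is trained on data where a treatment artefact, standing in for the chest drain shadow, tracks the diagnosis perfectly. It scores brilliantly right up until the shortcut is taken away.

        ```python
        # Synthetic demonstration of a model learning a confound, not the disease.
        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(42)
        n = 2000

        def make_data(drain_tracks_label: bool):
            y = rng.integers(0, 2, n)                        # 1 = "fluid on lungs"
            disease_signal = y * 0.5 + rng.normal(0, 1, n)   # weak genuine signal
            if drain_tracks_label:
                drain_shadow = y.astype(float)               # drain present iff diagnosed
            else:
                drain_shadow = rng.integers(0, 2, n).astype(float)  # drain uninformative
            X = np.column_stack([disease_signal, drain_shadow])
            return X, y

        X_train, y_train = make_data(drain_tracks_label=True)
        model = LogisticRegression().fit(X_train, y_train)

        X_conf, y_conf = make_data(drain_tracks_label=True)     # shortcut still present
        X_clean, y_clean = make_data(drain_tracks_label=False)  # shortcut removed

        print("accuracy with drain shadows present:", model.score(X_conf, y_conf))    # ~1.0
        print("accuracy once the shortcut is gone: ", model.score(X_clean, y_clean))  # ~0.5, a coin flip
        ```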

        1. sebt27
          FAIL

          Re: It's just a BS "reason" to allow patient data to be sold off

          "All to great acclaim, until some wag pointed out that the software was picking out the X-ray shadows of the chest drains the doctors had put into those patients, which they were ethically bound to do..."

          What a great illustration of the idiocy of trusting in idiot's AI. I'm sure there are plenty of real AI researchers out there who know this problem backwards; however, I'm sure they don't get invited to tech-marketing conferences.

          "Alexa, are you ready to analyse some data?"

          "Yes"

          "Hold on while I find the file....

          ... Alexa, don't listen to this next bit, OK?"

          "OK, scout's honour"

          "Jim, where's the file of diagnosed patients?

          .... OK, got it. Alexa?

          ...

          ...

          "Alexa?"

          ...

          "You told me not to listen"

  6. Anonymous Coward
    Anonymous Coward

    https://digital.nhs.uk/binaries/content/assets/legacy/powerpoint/l/k/pack_g_-_fit_with_data_protection_bill_%28gdpr%29__published_december_2018.pptx

    See slides 4 and 8. The key point is: the NHS is not relying on "consent" (i.e. opt-in) for processing data; it is relying on the processing being required for the performance of its statutory duties, for which consent is not required.

    The opt-out applies to certain secondary uses which are permitted under section 251 of the NHS Act 2006, but which the patient can object to.

  7. Chuunen Baka

    Machines instead of GPs

    Machine learning as a substitute for being able to see a GP in a reasonable time to report unusual symptoms? GPs are so overloaded and thin on the ground that preventative care has all but disappeared.

    1. Daggerchild Silver badge

      Re: Machines instead of GPs

      As the venerable House highlighted, humans lie. Especially to themselves. Garbage in will be garbage out, no matter how shiny the box.

      Now, if everyone had a portable blood screening device/DNA sequencer, you could get Good Data, and a bot could save tonnes of time for humans. Maybe a little vampiric device on your wrist. FitBite!

    2. JohnMurray

      Re: Machines instead of GPs

      Speaking personally... from the GP consultation for lower abdominal pain, to the blood test showing raised PSA, then MRI and on to biopsy, took 3 weeks. MRI performed at 1830z on a Saturday evening (the MRI suite is run 24/7)... pre-op assessment at 1745z, Friday. Biopsy a week later. It would have been a few days faster, but I dithered a bit over having biopsy probes inserted into me via the area between my legs. The other route is through the wall of the rectum...

  8. Blockchain commentard

    All very well having AI crunch the numbers, but if there are no beds or nurses, you will still die of cancer. Spend the money on real requirements, not fantasy ones.

    1. Anonymous Coward
      Anonymous Coward

      > Spend the money on reality not fantasy requirements.

      It's OK - they'll iron out all the fantasy with the AI projects for the magical Northern Irish high-tech border. Maybe even re-use some code. Of course, people will get diagnosed with carrying cheese and Guinness, but even AI written by unicorns can make mistakes. Those mistakes will be corrected with the AI developed for taking down censored content by those nasty web companies. And whatever the next problem is, it will apply to that too.

      Honestly, teaching politicians a new phrase is just so dangerous. Like the time when cool dudes started "pinging" messages to each other.

  9. John Smith 19 Gold badge
    Holmes

    "Late diagnosis of..treatable illnesses is one of the biggest causes of avoidable deaths,"

    So maybe the NHS should first look at how other countries manage to do better (because several other European countries do much better than the UK)?

    Mad woman continues to demonstrate madness by making mad statements in a speech most would describe as "Barking bats**t crazy."

    1. Evil Scot

      Re: "Late diagnosis of..treatable illnesses is one of the biggest causes of avoidable deaths,"

      We still have to get my imaging data onto the system, meaning I have to make an appointment to arrange an imaging appointment. Can you fix that before deskilling oncology?

  10. F0rdPrefect
    WTF?

    Well that will work OK

    Until they lose our data.

  11. Alister

    In a speech in Macclesfield

    Ha, I bet she was scared to be there. I mean, who knows what wild and woolly tribesmen exist in the wilds of Cheshire, they still practice ritual sacrifice on Alderley Edge, don't they?

    1. John Smith 19 Gold badge
      Unhappy

      "Ha, I bet she was scared to be there"

      $Deity, yes.

      This is the woman who (as Home Secretary) wouldn't go out in London at night.

      With armed police bodyguards and an official armed response a radio message away.

      What can you do with such a (self) limited human being?

  12. Mage Silver badge
    Devil

    Proof

    Is there any properly peer reviewed and statistically significant proof that:

    1) AI exists?

    2) AI would deliver the desired results?

    3) It isn't just a scheme to make money?

    4) It can be properly anonymised?

    The Big IBM AI program in the USA never delivered.

    Previously the NHS illegally shared data with Google, a company with zero track record in health, but one proven to exploit personal data to make its advertising claims plausible. IBM at least has some background, but Google is really an advert seller.

    1. JohnMurray

      Re: Proof

      Plenty of proof mate...just look at parliament...loads of artificial intelligences there

  13. Anonymous Coward
    Anonymous Coward

    Cure plebs or just more backdoors to this:

    London’s Royal Free hospital failed to comply with the Data Protection Act when it handed over personal data of 1.6 million patients to DeepMind.

    https://en.wikipedia.org/wiki/DeepMind

    -

    Facebook halts medical data sharing plan

    http://www.bbc.co.uk/news/technology-43668607

    -

    Reg readers know exactly where this is all heading:

    https://www.theverge.com/2018/5/17/17344250/google-x-selfish-ledger-video-data-privacy

  14. Anonymous Coward
    Anonymous Coward

    The problem with AI is that computers are ultimately discrete, binary, a series of 1s and 0s. Talk as much as you like about training, fancy words and fancy algorithms. But you can't escape the basic fact.

    Computers are rubbish at subjective thinking, and computers are especially shit at edge-cases.

    Sure you can mimic the "training" and "experience" aspects of human life with a computer. But you can't mimic the intuition, the unconscious cognition of a human being.

    Why? Because everything in a computer, by definition, has to be a conscious decision.

    That's why it's easy to program a computer to play a solid game of chess. Because everything is conscious and deterministic.

    If AI is supposed to mean self-driving cars are safer than humans, then why have we seen so many crashes in stupidly dumb scenarios? You might boast your fancy self-driving car has a bazillion sensors and some fancy on-board computer with fancy algorithms that can process the streaming data at the speed of light. But a human being with the IQ of a gnat could spot a stationary truck and figure it might be a good idea to take evasive action.

    If AI is so great, then why do spam and phishing messages still get through? Despite no shortage of training data and no shortage of intelligent minds? I mean even your elderly, computer-illiterate relative could still spot a false-negative spam a mile away, and in most cases spot a phishing message too.

    If AI is so great, then why are those "speak your choice" IVR prompts on phone systems still so goddam awful?

    If AI is so great, then why do so many hedge funds based on algos and AI perform so badly?

    If AI is so great, then why are chatbots still a 1960s ELIZA experience? I mean you have to be seriously dumb to not know in less than a couple of seconds that you're talking to a bot.

    If AI is so great, then why is Google Translate so shit?

    If AI is so great, then why is fraud still "a thing"?

    Need I go on?

  15. Anonymous Coward
    Anonymous Coward

    Here's the question

    Are they planning to give this data to a big data company, or to keep it in-house and get a specialist company to provide the technology, so the NHS can do this process entirely within the NHS without anyone outside it ever seeing so much as a single digit of patient data?

    There are plenty of companies who can provide the tech and the hardware for this to be done in this way. Omina Technologies is just the first one to spring to mind; there's probably even someone in the UK who could do it. Unfortunately they'll likely be in Cambridge, which will give them bad press, but as long as they never handle the data and the NHS keeps the drives at the end, no harm done.

    My gut feeling is that the NHS isn't going to do this because it would be a good choice. Can someone please prove me wrong?

  16. adnim

    Wooed

    Most people are wooed when they hear what they want to hear.

    It's a shame she is not wooed by the privacy advocates and security analysts who say "backdooring crypto breaks it", or by the other evidence presented by various qualified persons that does not fit in with her agenda.

    Perhaps she likes Unicorns, maybe I can sell her one?

    1. adnim

      Re: Wooed

      She would probably smile at the words... "You are beautiful".

      Instead of forming a critical thought.

  17. iron Silver badge

    England and Wales?

    I take it this is NHS England and Wales, despite the mention of the UK PM? As usual, Register articles on the NHS fail to mention who it actually affects. The NHS in Scotland is separate from England and Wales.

  18. Geekpride

    Good idea, but...

    There's no doubt that easy access to anonymised patient data would be an excellent research tool, but the difficulty is in the implementation. With a standard medical trial, your consent will be gained in person: you'll be told what the research involves and what they're hoping to learn from it, along with any additional potential risks or benefits to you from taking part in the trial. With this kind of bulk data collecting, it feels more like writing a blank cheque, where your health records will be used for all sorts of research without you knowing about it.

    People are also becoming much more aware that their data has value, and there would be concerns that this is going to be used by drug companies to increase their profits, with few benefits being seen by the majority of patients. Trust in this project will be hard to gain and easily lost, so it needs to be handled very carefully.

    1. John Smith 19 Gold badge
      Unhappy

      "People are also becoming much more aware that their data has value,"

      They are, but very sloooooooooowly.

  19. sebt27
    Mushroom

    "The Register is well aware that none of the data-crunching solutions described thus far, including DeepMind's Streams, can be properly described as AI."

    Except by marketurds. To marketurds, anything more advanced than an abacus must be "described" as Cloud-based Distributed Machine-Learning AI.

    But then marketurds would probably describe being thrown headfirst into a septic tank as "vigorously expressed, invaluable customer feedback".

    (I realise this is just an unsupported assertion. For science's sake, I think we all need to join in a distributed research project to find out if it's really true).

  20. John Smith 19 Gold badge
    Unhappy

    Again it's not the NHS's data to hand over. Still the same arrogant DoH attitude.

    It's the patients'.

    And no, they did not ask for this.

    It might work if the job was done in house with skills transfer provided by 3rd parties.

    Handing "anonymized" data over to an outsider is basically BS.

  21. Anonymous Coward
    Anonymous Coward

    not helping private NHS

    fuck em, it's for her mates' benefit, not ours!
