AI eggheads: Our cancer-spotting code rivals dermatologists

An algorithm that promises to diagnose skin cancer as accurately as dermatologists may work with mobile phone cameras in the future, according to a paper published in Nature. The recent obsession with machine learning and AI in the tech world has boosted the ability of computers to analyze streams of data and classify images. …

  1. Your alien overlord - fear me

    Based on images alone

    Proper doctors perform physical tests before irradiating/cutting up their patients.

    1. Anonymous Coward

      Re: Based on images alone

      "This image looks like cancer. Would you like to buy GENERIC CANADIAN ANTICANCER PILLS?"

    2. Gene Cash Silver badge

      Re: Based on images alone

      American doctors just tell you that you can't afford the surgery.

    3. Yet Another Anonymous coward Silver badge

      Re: Based on images alone

      "Proper doctors" glance at 0.1% of the area of a smear sample under an uncalibrated 30-year-old microscope for 1-2 minutes, in a single polarisation using the wavelength range of their eyes, and make an estimation of cancer cells which is marginally better than tossing a coin.

      Every couple of years you then discover the process was done by someone who lacked the correct certification for that state, and you recall thousands of women to be re-scanned. Ironically, this is the nearest we get to a study of accuracy.

      1. Count Ludwig

        Better to eliminate true all-clears, rather than detect cancers

        Funnily enough, it's possible to kill more people by using a cancer-detecting machine than by not using one. This is because the machine finds the obvious positives, and leaves highly qualified humans with the mind-numbing job of trying to find the remaining hard-to-spot false negatives in a load of true negatives. The humans simply cannot, so the false-negative cases remain undiagnosed and those patients are more likely to die.

        If instead the machine eliminates a percentage of true negatives ("completely normal, nothing to see here") then humans are left with a more interesting job - find the true positives in a smaller, "richer" selection of samples that the machine has flagged as "not quite normal". Incremental improvements in the machine should be in the direction of extending the definition of "absolutely normal" leaving humans with an even more interesting job.

        Statisticians and systems bods might get this but tech-dazzled doctors often do not. If you really must have a headline-grabbing cancer detector then run it over the samples after humans have looked at them, not before.

        1. Anonymous Blowhard

          Re: Better to eliminate true all-clears, rather than detect cancers

          "This is because the machine finds the obvious positives, and leaves highly qualified humans with the mind-numbing job of trying to find the remaining hard-to-spot false negatives in a load of true negatives.....If instead the machine eliminates a percentage of true negatives"

          So we need a machine that can prove a negative, without ever giving a false positive? Then we should scan the remaining pool using "qualified humans"?

          This assumes that everyone who didn't get an "all clear" from the "not cancer detector" goes to see a "qualified human" so we would need a lot more "qualified humans" than we have currently.

          The real statistic is that without a simple, cheap, test most victims will never even see a "qualified human" so more people will die as a result.
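The enrichment argument debated above can be sketched with invented numbers (a back-of-the-envelope illustration only, not figures from the paper, and assuming for simplicity a machine that never mislabels a real cancer):

```python
# Invented numbers: 10,000 samples, 1% truly cancerous.
total = 10_000
positives = 100                  # true cancers
negatives = total - positives    # 9,900 benign samples

# Strategy A: the machine flags the obvious cancers (say 80% of them)
# and nothing else; humans must find the remaining needles.
found_by_machine = int(0.80 * positives)
needles_a = positives - found_by_machine
pool_a = needles_a + negatives           # 20 cancers hidden among 9,900
rate_a = needles_a / pool_a

# Strategy B: the machine instead clears 90% of the true negatives as
# "completely normal" (assumed never to clear a real cancer); humans
# review a smaller, richer pool that still contains every cancer.
cleared = int(0.90 * negatives)
pool_b = (negatives - cleared) + positives
rate_b = positives / pool_b

print(f"A: hunt {needles_a} cancers at {rate_a:.2%} prevalence in {pool_a} samples")
print(f"B: hunt {positives} cancers at {rate_b:.2%} prevalence in {pool_b} samples")
```

With these toy figures the reviewers' hit rate rises from about 0.2% to about 9%, which is the "more interesting job" the comment describes.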

    4. Ian Michael Gumby

      @ fearless alien ... Re: Based on images alone

      The idea of this is to let a person say "Hey, here's an interesting shaped brown patch on my skin. Could it be cancer? "

      And if the AI says yes, the person goes to a doctor and says... "Hey, I think I have something you need to look at to see if it's cancer..." Then the doctor looks at it, and if it's something that warrants further investigation, they will biopsy it.

      Most biopsies come back benign, and of course this is still hype and not real until it goes through an FDA study.

    5. LionelB Silver badge

      Re: Based on images alone

      Proper doctors perform physical tests before irradiating/cutting up their patients.

      Of course. Presumably if an algorithm flagged up a risk, the same procedure would be followed as if it had been flagged up by a human specialist. Not sure whether the article intended to imply otherwise. I admit to finding the sentence: "Each person was asked whether they would refer a patient for a biopsy, for a treatment, or reassure them that their skin lesion wasn't cancerous, based on images alone" a little hard to scrute.

    6. Alan Brown Silver badge

      Re: Based on images alone

      Before they get to the physical tests, they look at it. If this acts as a filter and weeds out the obviously non-cancerous stuff it frees up docs to work on the problem ones. Even at 91%, it can be useful (anything remotely fuzzy, "see a human")

      Automating this step has been something that cancer specialists have been wanting for a long time. It's not going to take any jobs away.

  2. Bob Rocket
    Boffin

    Aren't Boffins Brilliant ?

    New algorithm can detect skin cancer as well as dermatologists and in the future will also detect mobile phones.

    I'm impressed by the dermatologist detector part of the algorithm but who thought of combining all three ?

    Wow, the future is so bright (if a little mad)

    1. Robert Helpmann??
      Childcatcher

      Re: Aren't Boffins Brilliant ?

      These folks are creating the tricorder in real life. Brilliant, indeed!

  3. Anonymous Coward

    Nude selfies

    Now people will have an excuse if they have them on their phone - I was checking for skin cancer!

    1. Anonymous Coward
      Joke

      Re: Nude selfies

      Ear cancer to be precise.

      So now these hellish devices can both induce and detect cancer. Convenient.

      1. Dave 126 Silver badge

        Re: Nude selfies

        I came here to seriously propose that folk take nude selfies periodically. If a doctor (or machine) can see that you have a mole that is bigger than it was last year / month, they can investigate further.

  4. Voland's right hand Silver badge

    If the algorithm can be easily used on mobile phones,

    Last time I checked, TensorFlow used Python. So unless you are running a BlackBerry (they did have a Python runtime), the answer is no.

    1. Anonymous Coward

      Re: If the algorithm can be easily used on mobile phones,

      Could we just take the picture and send it for processing on a server or is this not IoTish enough?

    2. Dave 126 Silver badge

      Re: If the algorithm can be easily used on mobile phones,

      >Last time I checked, TensorFlow used Python. So unless you are running a BlackBerry (they did have a Python runtime), the answer is no.

      Um...?

      TensorFlow was designed with mobile and embedded platforms in mind. We have sample code and build support you can try now for these platforms: Android, iOS, Raspberry Pi

      - https://www.tensorflow.org/mobile/

  5. Anonymous Coward

    Interesting to compare with other image analysis methods

    Wonder how well the AI approach works compared to other methods?

    I know various image processing techniques are used medically to flag images of concern - not a replacement for experts, as an aid.

    Just that the algorithms I have seen used medically have typically been developed by coders (liaising with experts to find out what the significant things to look for are) rather than by auto-processing a large number of images.

    Essentially clinicians have said what features should be flagged up & image processing code tweaked to flag those.

    I assume the Google machine learning (if better than more "classically" developed software) would be used in same way - as an aid not replacement for experts.

    Though some of the classical systems used for assessing (potential) cancer biopsies can outperform clinicians working alone (computer systems do not suffer eye strain, general fatigue / concentration issues) they are never relied on as the only analysis method.

    AC as I have worked on medical image analysis in the past

  6. Whitter
    Boffin

    91% sounds good but...

    Whether 91% accuracy is a good result or not depends very much on the split between false positives, false negatives and the underlying base rate (see https://en.wikipedia.org/wiki/Base_rate_fallacy).

    For automated technical solutions, it is almost always best to avoid trying to diagnose and to screen instead: bias the technology against false negatives so it acts as a gatekeeper, reducing the number of cases a genuine doctor has to see.
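The base-rate point can be made concrete with a quick Bayes' rule calculation. The numbers below are invented for illustration (the Nature paper does not report sensitivity/specificity this way): assume the classifier is "91% accurate" in the strong sense of 91% sensitivity and 91% specificity, and that 2% of photographed lesions are actually malignant.

```python
# Toy numbers, not from the paper.
prevalence = 0.02
sensitivity = 0.91   # P(flagged | cancer)
specificity = 0.91   # P(cleared | no cancer)

# Bayes' rule: what fraction of flagged lesions are actually cancer?
p_flagged = sensitivity * prevalence + (1 - specificity) * (1 - prevalence)
ppv = sensitivity * prevalence / p_flagged

# ...and what fraction of "all clear" results are genuinely clear?
p_cleared = specificity * (1 - prevalence) + (1 - sensitivity) * prevalence
npv = specificity * (1 - prevalence) / p_cleared

print(f"P(cancer  | flagged) = {ppv:.1%}")   # ~17%: most flags are false alarms
print(f"P(healthy | cleared) = {npv:.2%}")   # ~99.8%: all-clears are trustworthy
```

At a 2% base rate, "91% accurate" means roughly five out of six flagged lesions are false alarms, while an all-clear is right about 99.8% of the time, which is exactly why such a tool works better as a screen than as a diagnosis.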

  7. Medical Cynic

    Where is the responsibility here?

    Can the patient whose MM was flagged as 'doesn't need further Ix' sue the programmers?

    If so, the programmers of the app, or the programmers of the AI software that developed it?

    Or just Google as an entity?

    1. Oengus
      Devil

      Or just Google as an entity?

      Of course you sue Google... They have the money. Why would a lawyer go after an app developer who has very little in assets or capital, can't afford to pay a decent settlement that would make the lawyer's cut worthwhile, and can't afford to pay for "billable hours" to defend?

      Icon for lawyers...

      1. Whitter
        Devil

        Why Google?

        Low hanging fruit perhaps? Google have lawyers on tap while the app designer likely can't afford the bus ticket to court.

  8. allthecoolshortnamesweretaken

    "It is projected that 6.3 billion smartphone subscriptions will exist by the year 2021..."

    Huh?

  9. the Jim bloke
    Mushroom

    I get an annual skin checkup

    .. as an Aussie (world skin cancer champions) who spends a lot of time outdoors. I won't dispute the image recognition technology here; what I doubt is the ability of an untrained amateur user (i.e. almost everyone) to do a thorough scan of their entire risk-exposed surface (instead of taking dick-pics), especially areas not easily accessible from in front.

    Assistance from a partner would help, but not something I would rely on.

    I see this technology as a useful diagnostic aid to help professionals, but as a consumer-level app it's slightly more useful than a banner-ad-sourced credit rating check.
