UK NHS-backed health apps 'riddled with security flaws'

As if striking junior doctors weren’t enough, the UK's NHS also has technology worries, according to a study by app security firm Arxan. All of the NHS-approved apps Arxan audited lacked binary protection against code tampering, and most also lacked adequate protection in the transport layer. Flaws also emerged in FDA-approved …
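
On the transport-layer point: one widely used mitigation is certificate (public-key) pinning, where the app refuses to talk to a server whose key digest doesn't match one shipped inside the app. Below is a minimal, hypothetical sketch in Java of the digest comparison at the heart of it; the "sha256/" base64 pin format and the freshly generated key (standing in for a real server certificate) are illustrative assumptions, not details from the Arxan study.

```java
import java.security.KeyPair;
import java.security.KeyPairGenerator;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;
import java.util.Base64;
import java.util.Set;

public class PinCheck {
    // Compare the SHA-256 digest of a DER-encoded public key against a
    // pinned set of known-good digests. The "sha256/<base64>" format and
    // pin values here are made up for illustration.
    static boolean isPinned(byte[] spkiDer, Set<String> pins) {
        try {
            MessageDigest md = MessageDigest.getInstance("SHA-256");
            String pin = "sha256/" + Base64.getEncoder().encodeToString(md.digest(spkiDer));
            return pins.contains(pin);
        } catch (NoSuchAlgorithmException e) {
            throw new IllegalStateException(e); // SHA-256 is mandatory in the JDK
        }
    }

    public static void main(String[] args) throws Exception {
        // Simulate a server key: in a real app these bytes would come from
        // the certificate presented during the TLS handshake.
        KeyPair kp = KeyPairGenerator.getInstance("RSA").generateKeyPair();
        byte[] spki = kp.getPublic().getEncoded();

        MessageDigest md = MessageDigest.getInstance("SHA-256");
        String goodPin = "sha256/" + Base64.getEncoder().encodeToString(md.digest(spki));

        System.out.println(isPinned(spki, Set.of(goodPin)));            // true
        System.out.println(isPinned(spki, Set.of("sha256/bogus+pin"))); // false
    }
}
```

Pinning narrows trust from "any CA the OS trusts" to "exactly these keys", which is why its absence is frequently flagged in assessments of health apps handling sensitive data.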

  1. Voland's right hand Silver badge

    Official approvals are given by marketing and PR

    Official approvals have nothing to do with any or all of the following nowadays:

    1. Legal and specifically data protection and consumer protection

    2. Security

    3. Technical merit.

    It is just marketing and PR.

    1. anonoomouse

      Re: Official approvals are given by marketing and PR

      > Official approvals are given by marketing and PR

      No. Not with reference to software for use in the NHS, which is kind of the point here.

  2. James 51

    Someone should tell the torygraph.

    http://www.telegraph.co.uk/news/nhs/12093232/Strike-all-you-like-doctors-technology-will-soon-take-away-your-power.html

    1. Bogle
      FAIL

      Bile today, bile tomorrow

      Ah, the torygraph. Average reporting dipped in bile. They apply the skewed thinking of the small-minded and come up with "doctors will resist the advance of science as their patients will be healthier"!

  3. Halfmad

    As any NHS Info sec staff member will tell you..

    We manage risk, we have no budget, and if you mention pen testing you get a dirty look.

    I'm fantastic at pointing out the glaring issues with the software we develop in-house, and I'm not even a coder. I'd like the applications externally tested, but that costs money, and you know: money = operations not carried out, and the public wouldn't stand for that.

    It's apparently not a great use of resources to protect our data.. yet.

  4. Anonymous Coward
    Anonymous Coward

    Do users and businesses really care?

    This is pretty shocking, but not surprising, news.

    I've come to the conclusion that in this world, where everything is shared on social media, millennials in particular (the non-tech-professional ones) really don't give a shit about their data until they might be financially impacted.

    Then there are the tech-professional millennials who build much of the stuff. Some of them do give a toss about security, but too many have attention deficits so major and adrenalin needs so high that they just skip from one project to another as fast as possible.

    Then, other and older tech professionals who do know and do give a fuck are increasingly sidelined and replaced through outsourcery.

    Then, outsourcerers tend to have major knowledge gaps or don't give a fuck.

    Ultimately, most business leaders now just focus on the short term and the quarterly profits, which gives rise to all the problems above. It's a vicious circle for which they and their financial backers are ultimately responsible.

    In other words: we're fucked.

    1. Anonymous Coward
      Anonymous Coward

      Re: Do users and businesses really care?

      As someone who has to investigate staff for posting inappropriate information on social media, I'd say that generally the public don't give a toss until it's not their choice whether that information was shared, or until they have a bone to pick with a clinician.

      The more worrying trend I'm seeing is patient relatives recording in wards and sharing on social media, so far in the past 12 months I've seen:

      1. An elderly, wheelchair-bound patient with dementia recorded in a ward by a relative of a DIFFERENT patient because she was mumbling. This was posted on Facebook without consent because it was "funny".

      2. Patients secretly recording consultations (this isn't illegal, btw, if for personal use only) and then posting them on Facebook to prove staff members said X, Y or Z. The problem with that is that it's no longer personal use.

      3. Skype being used during a consultation so the patient's husband could listen in; the problem here is that it was in a shared area with 4 beds, so hardly private.

      Honestly, this sort of crap goes on every day in every hospital in the UK. Staff are surprisingly open to going along with it, and so are patients, until it's not their choice that something is shared, such as a relative posting a photo of them in the hospital bed. You'd be surprised how often patients then decide it's an invasion of privacy, when they did it themselves five minutes earlier with a fish pout.

  5. tiggity Silver badge

    Binary Protection

    The binary protection one is arguable in some cases.

    Making some of the usual efforts could be a problem on Android, as they would essentially just disable the app on devices that are (arguably) more secure.

    e.g. do things like detecting root, or looking for dev builds / unofficial ROMs. But arguably a rooted Android device, or one running one of the many unofficial ROMs, is more secure than vanilla Android, as the user can actually exert a decent degree of control over what software runs / is installed, instead of being stuck with whatever apps Samsung, HTC, Huawei, LG or whoever add and will not let the average user uninstall, plus various privacy-suspect Google services / apps that, again, a standard user cannot do much about.

    And even so, whatever is done on "binary protection", bad actors can defeat it; it just ramps up the difficulty or encourages alternative attack vectors, e.g. "fake" access points, DNS poisoning, etc. There are lots of ways to tell an Android app it needs an update, and bingo, it updates with your malware version.
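
For context on what "binary protection" typically amounts to, here is a minimal, hypothetical sketch of one such measure: an integrity self-check comparing the app's bytes against a digest baked in at build time. The payload and expected digest are illustrative, and the sketch demonstrates the point above: the check lives in the same binary the attacker is patching, so it only raises the bar.

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;

public class BinaryCheck {
    // Hex-encode a byte array.
    static String hex(byte[] bytes) {
        StringBuilder sb = new StringBuilder();
        for (byte b : bytes) sb.append(String.format("%02x", b));
        return sb.toString();
    }

    // Naive integrity self-check: does the payload's SHA-256 match the digest
    // the developer baked in at build time? An attacker who can patch the
    // payload can just as easily patch out (or re-target) this check.
    static boolean matchesExpected(byte[] payload, String expectedHex) {
        try {
            MessageDigest md = MessageDigest.getInstance("SHA-256");
            return hex(md.digest(payload)).equals(expectedHex);
        } catch (NoSuchAlgorithmException e) {
            throw new IllegalStateException(e); // SHA-256 is mandatory in the JDK
        }
    }

    public static void main(String[] args) {
        // Well-known SHA-256 digest of the string "hello".
        String expected = "2cf24dba5fb0a30e26e83b2ac5b9e29e1b161e5c1fa7425e73043362938b9824";
        System.out.println(matchesExpected("hello".getBytes(StandardCharsets.UTF_8), expected)); // true
        System.out.println(matchesExpected("hellp".getBytes(StandardCharsets.UTF_8), expected)); // false
    }
}
```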

  6. PassiveSmoking

    Don't be too hard on them, they're just making sure their software is Snooper's Charter Compliant.

    1. 0laf

      No that would require a degree of competency. See HalfMad's post above which certainly backs up my experience.

      The NHS infosec folk I have met have all been decent switched on people managing a barbarian horde of managers and clinicians all demanding their particular favoured toy is sorted out yesterday. Pretty much the same as everywhere in the public sector.

      No one likes their product or toy to be pen tested because 1) it costs money 2) when it finds problems (and it always finds problems) it's in writing and something needs to be done.

      They'll all have fun when the new Data Regulations come into force (assuming we're all still in the EU in 2017).

      1. Anonymous Coward
        Anonymous Coward

        NHS infosec bod here, can confirm all this ^^

      2. Mark 85

        No that would require a degree of competency.

        Actually, PassiveSmoking is pretty much spot on... Prior to the focus on security brought about by Snowden and a lot of high-profile intrusions, both the spooks and the miscreants were happy: either by design or by ineptitude, they had access. Thanks to the ineptitude of the management types (profit motive, paperwork, etc.), the spooks still get what they want, as do the miscreants.

        What we have here is a classic: the lawmaking authorities demand privacy, etc. to prevent the miscreants getting what they want, and yet these same lawmakers don't want strong security, so that the agencies can make us safer (for some value of "safer"). Given the maths involved, you can't have it both ways.

  7. SecretSonOfHG

    checklist security is useless

    This study looks like yet another checklist applied to a bunch of apps without considering what each app actually does. Checklist security is easy: there are a multitude of ready-written scripts that anyone can execute and then call themselves a "security consultant", without even knowing what they are checking, much less what their scripted discoveries mean or how to exploit them.

    I'm not saying that the Arxan guys are one of these; I don't know them or have any experience with them. But any "security assessment" that does not take into account what the app actually does is useless. Yes, "privacy violations, theft of personal health information, and tampering" can happen. And yes, best practice means that all these vulns should be addressed. But there's a big difference between hacking a mapping app that helps you locate your nearest doctor and being able to see someone's complete medical records. And it does not make any sense at all to give the same priority or budget to the security of both.

    1. John Smith 19 Gold badge
      Unhappy

      "Checklist security is easy, "

      Except what they are saying is that every tested app violated at least two basic secure-coding rules.

      The question is why they can't be written right in the first place.

      1. Jimmy2Cows Silver badge

        Re: "Checklist security is easy, "

        Always comes back to: good, fast, cheap... pick two.

        And the two picked are always fast and cheap. What happens from there is the same shit it always is, and should be no surprise to anyone.

  8. x 7

    meaningless without details of the affected apps

  9. patrickstar

    Wait, are they really suggesting obfuscating executables/protocols as a security measure? The site and paper are all marketing security-theatre gobbledygook, but "Such vulnerabilities could allow the apps to be tampered and reverse-engineered" seems to suggest that this is indeed the case.

    Dumbest thing I've heard in ages, and that says a lot since dumb security-related ideas aren't exactly uncommon...

    Shame crap like this takes away attention from ACTUAL security issues.

  10. This post has been deleted by its author
