UK boffins steal smartmobe PINs with motion sensors

The World Wide Web Consortium might want to take another look at its habit of exposing too much stuff to application interfaces: a UK researcher has demonstrated that a JavaScript app can spy on smartphone sensors to guess the codes users employ to unlock their devices. The attack, published in the International Journal of …

  1. John Smith 19 Gold badge
    Unhappy

"W3C specifications do not specify any policy and do not discuss any risks associated with this"

Perhaps because they feel that manufacturers should

    a) Be aware of the risks, because otherwise why should customers buy from them.

    b) Be free to implement whatever view of privacy they think fit.

    Unfortunately, so far it seems most manufacturers' privacy policy is not to bother giving the customer any.

    Of course, that may be because IRL phone manufacturers sell to networks, not end users, so feel the network is their customer.

Except for that new UK one El Reg reviewed a little while ago, which seems to have quite a good policy for stopping apps asking for stupid amounts of data in exchange for the (very) dubious privilege of running their (usually) shoddily written code.

    1. This post has been deleted by its author

    2. Anonymous Coward
      Anonymous Coward

Re: "W3C specifications do not specify any policy and do not discuss any risks associated with this"

      Just no.

Firstly, most specs do analyse risk, and often have a section specifically about the 'fingerprinting risk' of a new feature. A lot of work goes into keeping the web inherently sandboxed and secure. Or at least it used to.

      The phone-ification of the web is both dumbing it down, and chasing apps in a race to the bottom of privacy. This is not an accident. Google moles such as Domenic Denicola are driving the process. If there's a shitty privacy invading spec, his name is never far away.

  2. Fred Flintstone Gold badge

    And yet, it's so very, very easy to fix..

    What I find baffling is that people find this a difficult problem to solve when you have a touch screen.

All you need to do is randomise the position of the digits on each PIN request. Since it's only ten digits, it's not too much of a hunt to find the right button (it wouldn't be a good idea with alphanumerics), and it defeats this kind of detection. (Location sensing also comes into play when watching someone enter a PIN on a door pad: you can assign probabilities from the way the arm moves.)

    Of course, you make such an idea OPTIONAL - for the user who values convenience over security or who has reduced sight or mobility.

Also, as the video shows a fingerprint-reader-equipped phone, let's make it possible to access such a device with a combination of PIN and fingerprint - that is, IMHO, better for a device that carries payment information.
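The randomised-keypad idea above is simple enough to sketch; a minimal, illustrative Fisher-Yates shuffle of the ten digits (not any vendor's actual implementation):

```javascript
// Return the digits 0-9 in a random order, suitable for laying out
// a PIN pad whose button positions change on every unlock prompt.
function shuffledKeypad() {
  const digits = [0, 1, 2, 3, 4, 5, 6, 7, 8, 9];
  // Fisher-Yates shuffle: unbiased, O(n)
  for (let i = digits.length - 1; i > 0; i--) {
    const j = Math.floor(Math.random() * (i + 1));
    [digits[i], digits[j]] = [digits[j], digits[i]];
  }
  return digits;
}
```

With a fresh layout per prompt, the handset motion recorded for a given tap position no longer maps to a fixed digit.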

    1. Sleep deprived

      Re: And yet, it's so very, very easy to fix..

I've been wondering about this since the first day I used an Android phone: why must the pattern always be drawn at the same spot on screen? It could probably be found just from the smear.

      1. Anonymous Coward
        Anonymous Coward

        Re: And yet, it's so very, very easy to fix..

Those pattern unlocks are no more secure than having no security at all. You just need to watch someone unlock their phone once and you have it. And even if you don't, there aren't many possible patterns in such a small grid, and it doesn't lock you out if you try them all.

        So I really don't think a smear showing their pattern makes any difference.

        1. John H Woods Silver badge

          Re: And yet, it's so very, very easy to fix..

There are 985,824 patterns - very nearly as many as for a 6-digit PIN. With a 30s delay every 5 wrong attempts, there's no significant difference in guess resistance.

The main problem for oily types like me is smearing, so I tend to use reverses (e.g. middle, left, right) to confuse the issue. But giving the screen a good buff is a good option... increases security and makes it look a bit less gross.
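For what it's worth, the 985,824 figure corresponds to counting ordered sequences of four to nine distinct dots on the 3x3 grid, ignoring Android's rule that a swipe passes through any in-line dot it crosses (so the true count of valid patterns is lower). A quick check of that arithmetic:

```javascript
// Sum of permutations P(9, k) for sequences of k = 4..9 distinct dots
// drawn from a 3x3 grid of 9 dots. This ignores Android's line-crossing
// constraint, so it is an upper bound on the real pattern count.
function patternUpperBound() {
  let total = 0;
  for (let k = 4; k <= 9; k++) {
    let perms = 1;
    for (let d = 0; d < k; d++) perms *= (9 - d);
    total += perms;
  }
  return total;
}

console.log(patternUpperBound()); // 985824
```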

  3. Notas Badoff
    Boffin

    Simples

    Enable the Javascript API only when the browser is active and the particular tab is active. Just like the browsers are now throttling Javascript execution within hidden tabs to save battery. APIs are implemented within the browsers, so the browsers must add on the needed security considerations.
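The suggestion above maps naturally onto the Page Visibility API that browsers already use for the tab-throttling mentioned; a rough sketch of the gating, with the decision pulled out into a plain function (the wiring is illustrative, not any browser's actual implementation):

```javascript
// Decide whether motion-sensor events should be delivered, given the
// page's visibility state. Hidden/background tabs get nothing.
function shouldDeliverMotion(visibilityState) {
  return visibilityState === 'visible';
}

// Illustrative wiring: only forward devicemotion events while the
// tab is actually in the foreground.
function guardMotionListener(handler) {
  return function (event) {
    if (shouldDeliverMotion(document.visibilityState)) handler(event);
  };
}
```

Usage in a page would look like `window.addEventListener('devicemotion', guardMotionListener(myHandler));` - though as the comment says, the fix really belongs inside the browsers, not in pages.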

  4. Lord Elpuss Silver badge

    There is a real-world risk here

    Easy peasy 733T hack to weaponise this.

    - Malicious webpage running motion-detect JS displays a screenshot of the OS's PIN entry panel. No active elements needed because the image itself does nothing.

- Because it's an inactive image, the mobile browser deems it safe & allows it to display fullscreen - obscuring telltale OS elements such as the top/bottom status bars.

    - A percentage of users will believe that their device has somehow locked itself, and will tap the image to re-enter their PIN.

    - Motion-detect JS in the underlying (still active) webpage captures PIN.

    Most El Reg readers wouldn't fall for this - but out in the real world it becomes a numbers game. And some numbers are more gullible than others.

    1. Brewster's Angle Grinder Silver badge

      Re: There is a real-world risk here

      Why do you need to motion detect? Who's going to block the "active elements" and yet leave a script running? Come to that, which filter will recognise an image that looks like the user's pin code?

      1. Lord Elpuss Silver badge

        Re: There is a real-world risk here

        @Brewster's Angle Grinder

        Motion detect is the whole freaking point of the article. That a miscreant could use the accelerometer to determine specific handset motion and thereby derive the PIN.

        And mobile browsers don't tend to allow active elements in true fullscreen because people could simulate all kinds of nefarious goings on in a browser; for example a home screen image with an authentic-looking dialogue box asking for your iTunes password. The safety net is that the actual phone UI still shows, so a malicious image looks weird. Using a fullscreen (inactive) image gets around this, and because it's inactive it can't do any harm - except it now apparently can if you can analyse phone motion while it's being displayed. Which was my point.

And why is it relevant whether a filter recognises an image that looks like the user's PIN code? That has nothing to do with anything.

Fuck's sake. Did you even read the article?

  5. LuDo

    congratulations, it is a girl!

Dr Maryam Mehrnezhad is a she, not a he.

    1. Wyrdness

      Re: congratulations, it is a girl!

      You're right. Bad Reg should have checked this.

  6. Nick Kew

    APIs not to blame

    An API is just an official way to do something, with a (more-or-less) guarantee it won't fall to pieces next system update. Snarfing users' PINs without an API is a hack; using an API is just regular programming.

    The APIs used in this case aren't the vulnerability, they just expose it and make it (too) easy. And, erm, make it difficult to fix without breaking a stability promise made to app developers in general.

    1. The Mole

      Re: APIs not to blame

      "The APIs used in this case aren't the vulnerability, they just expose it"

No, they expose the attack vector; the process of exposing that attack vector is the vulnerability, and in this case the API implementation within the browser is the vulnerability.

      1. Brewster's Angle Grinder Silver badge

        Re: APIs not to blame

        "No they expose the attack vector, the process of exposing that attack vector is the vulnerability and in this case the API implementation within the browser is the vulnerability."

        Before reading this, I read the report on using traffic analysis to determine which Netflix show you're watching. By your logic, the APIs that allow you to monitor network traffic are the vulnerability and we should ban them to fix the problem.

In reality, you don't deal with side-channel attacks by trying to block off the side channel. You stop them by making sure the side channel carries no information. And if that's done (e.g. by randomising the layout of the keypad), it doesn't just prevent web pages from sniffing your keycode - it prevents any malicious app on the phone from doing it.

        1. Charles 9

          Re: APIs not to blame

          But the "passive image" attack could STILL defeat that trick by simply using a random but known layout. The attacker would know the pattern and STILL discern the real PIN based on it.

          The big problem with side channels is that, most of the time, they can't help but convey information because that's their intended purpose, like how CRIME depends on the very fact that compressed data streams are smaller (their intended purpose). Which leaves you with two choices: cut it off completely or live with it. Like how compression is now generally discouraged for encrypted data streams.

  7. Brewster's Angle Grinder Silver badge

    While Javascript makes this available to casual web pages, any app on the phone could do this.

  8. really_adf

"While Javascript makes this available to casual web pages, any app on the phone could do this."

    Good point; although the article was talking about the JS API, as in your other post, it's a general side-channel attack.

    Given the apparent accuracy of the technique for a 4-digit PIN, I wonder if it could also reduce alphanumeric passwords to a small number of possibilities with a high probability that one of them is correct.

    Extending the solution of randomly arranging digits to the entire keyboard doesn't seem workable, so it sounds to me like the only option is to prevent access to the motion sensor data during any sensitive data entry.
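That "prevent access during sensitive entry" option can be sketched in the same spirit; a hypothetical helper (the list of field types is my assumption for illustration, not from any spec) that suppresses motion events while a sensitive field has focus:

```javascript
// Field types during whose entry motion data should be suppressed.
// This list is an assumption for illustration, not from any spec.
const SENSITIVE_TYPES = new Set(['password', 'tel', 'number']);

function isSensitiveField(inputType) {
  return SENSITIVE_TYPES.has(inputType);
}

// Wrap a devicemotion handler so it goes quiet while a sensitive field
// is focused. The focused field's type is supplied via a callback so
// the gate can be exercised outside a browser.
function motionGate(handler, getActiveFieldType) {
  return function (event) {
    if (!isSensitiveField(getActiveFieldType())) handler(event);
  };
}
```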

  9. Charles 9

    It's times like this that make me think the Web is going off the rails. How do people feel about a campaign to take the Web back to its passive roots?

  10. Anonymous Coward
    Anonymous Coward

    Apple already addresses that last year

    Umm, it may be worth reading CVE-2016-1780 from about a year ago.

    Apple addressed this issue early last year. Motion data has been inaccessible during login from as far back as iOS 9.3, so next time we get such a report it would be nice to be a tad more precise about which platforms this issue applies to. The desire for media coverage should not override accuracy.

    1. Charles 9

      Re: Apple already addresses that last year

You should read the article further, as well as these comments. That protection can be worked around by pretending to be the lock screen at a moment when the accelerometer IS allowed, and tricking the user into entering his/her PIN on that screen. No amount of lock-screen alteration will cut it, as the fake screen can simply simulate any such changes, such as randomised pads (which have the added detriment of tripping up people who rely on muscle memory to remember their PINs).

      1. Brennan Young

        Re: Apple already addresses that last year

Why would an attacker need to use the accelerometer exploit if they've already gone to the trouble of making a fake login screen where the PIN can be captured?

        1. Charles 9

          Re: Apple already addresses that last year

          Because they may not have access to the touchscreen inputs? Thus they need the accelerometer in order to capture the PIN?
