Microsoft, Google: We've found a fourth data-leaking Meltdown-Spectre CPU hole

A fourth variant of the data-leaking Meltdown-Spectre security flaws in modern processors has been found by Microsoft and Google researchers. These speculative-execution design blunders can be potentially exploited by malicious software running on a vulnerable device or computer, or a miscreant logged into the system, to …

  1. Christopher Reeve's Horse
    Holmes

    Well well well well...

    ...well well well well well then.

  2. razorfishsl

    Anyone else get the idea this is a fuck feast, where intel and Co are out to find more spectre flaws to muddy the waters & stave off any lawsuits against intel

    1. Anonymous Coward
      Anonymous Coward

      re: intel and Co are out ... to muddy the waters

      You might very well think that. I couldn't possibly comment.

      Be careful out there.

    2. This post has been deleted by its author

    3. snifferdog_the_second

      @razorfishsl: No

  3. Waseem Alkurdi

    I bet my five dollars/euros/pounds/$currency that we're going to count 13, no more, no less variants by December.

    1. YetAnotherJoeBlow

      Spectre

      I'm aware of a group selling a Spectre vuln. They won't disclose the source, as that would be giving it away for free. One has to buy on faith. The government would buy it that way - who's going to con the NSA? The price is in the stratosphere. The government price is too low.

      13 by year's end? Easily, but we will never know how many there were, will we? Best wait for new dies.

      1. hplasm
        Coat

        Re: Spectre

        "I'm aware of a group selling a Spectre vuln.

        SPECTRE?

  4. Boris the Cockroach Silver badge
    FAIL

    Its quite depressing really

    we have 1000's of people with the smarts in CPU design.... and things like this pop out.

    And not only pop out in Intel chips.. but everything else out there too.

    Almost as if they are all copying each other.......

    1. Brian Miller

      Re: Its quite depressing really

      They aren't copying each other, it's just that there's only so many ways to make something execute more instructions faster. And yes, speed is freaking important.

      There are a lot of timing attacks and other side channels that yield information. One of the important points of all of this is that too many applications don't encrypt sensitive data, even with minimal encryption.

      1. jaduncan

        Re: Its quite depressing really

        In this context encryption outside of the CPU doesn't really matter; the compromised processor is the thing that must touch decrypted data to, well, process it.

        1. Anonymous Coward
          Anonymous Coward

          Re: Maths

          Mathematics seems to suggest it is impossible to prevent all these types of side channels. Wherever execution is time- or resource-limited, you can correlate that to data.

          1. anonymous boring coward Silver badge

            Re: Maths

            "Mathematics seems to suggest it is impossible to prevent all these types of side channels."

            Perhaps, but by limiting the timing accuracy available to the attacker, I think we should be able to make this sort of attack infeasible.
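            The point above can be sketched in a few lines. This is a toy model - the "cycle counts" are invented, not real hardware timings - but it shows why coarsening the attacker's clock blurs the hit/miss signal (reportedly the same logic behind browsers reducing timer resolution after Spectre):

            ```python
            # Toy model: a cache hit is fast, a miss is slow (numbers invented).
            # With a fine-grained timer the attacker separates the two trivially;
            # quantising the timer to a coarse tick destroys the signal.

            HIT, MISS = 40, 300   # pretend cycle counts for hit vs miss

            def quantize(t, resolution):
                """What a timer with the given tick size would report."""
                return (t // resolution) * resolution

            fine = (quantize(HIT, 1), quantize(MISS, 1))          # distinguishable
            coarse = (quantize(HIT, 1000), quantize(MISS, 1000))  # both report 0
            print(fine, coarse)  # (40, 300) (0, 0)
            ```

            With a 1000-"cycle" tick, hit and miss both round to the same reading, so the attacker learns nothing from a single measurement (though averaging many measurements can still recover some signal, which is why this is a mitigation rather than a fix).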

        2. Michael Wojcik Silver badge

          Re: Its quite depressing really

          In this context encryption outside of the CPU doesn't really matter; the compromised processor is the thing that must touch decrypted data to, well, process it.

          Not necessarily true - that is, if the data is being loaded prior to decryption (for example, if decryption is being done by the core being probed), then encryption in memory would prune the Spectre attack tree somewhat. It's not a perfect defense by any means, but it narrows the scope for usefully probing that particular data.

          This is simply a specific case of the more general observation that a Spectre probe sequence will reveal much low-value data, possibly in addition to some high-value data. Encrypted data (which the attacker cannot economically decrypt) is low-value.

          Of course, the attacker may be able to find the key by probing elsewhere. It's a very partial measure.

      2. Anonymous Coward
        Anonymous Coward

        Re: Its quite depressing really

        Half right, half wrong:

        "They aren't copying each other, it's just that there's only so many ways to make something execute more instructions faster. And yes, speed is freaking important."

        Right. Someone comes up with an idea that solves a perceived problem, and it gradually becomes standard practice, unless someone else comes up with a better solution. That's just the way a successful society makes progress.

        "One of the important points of all of this is that too many applications don't encrypt sensitive data, even with minimal encryption."

        Wrong. In general, things have to be in comprehensible form for processing. There are a few ways of doing certain limited operations on encrypted data, but this is orders of magnitude slower than operating on the unencrypted data. Better to just scrap speculative execution, as it is a much lower performance hit.
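        To illustrate the "limited operations on encrypted data" point: the toy sketch below (emphatically NOT a secure scheme - it is purely to show the shape of the idea) encodes a value m as g^m mod p, so multiplying encodings adds the underlying plaintexts. Recovering the result requires a discrete log, which hints at why such schemes run orders of magnitude slower than plain arithmetic. The constants are arbitrary demo choices:

        ```python
        # Toy "homomorphic addition": enc(m) = g^m mod p, so
        # enc(a) * enc(b) = g^(a+b) -- an operation on "encrypted" data.
        # Decryption needs a discrete log, only feasible here because
        # the demo values are tiny.

        P, G = 1_000_003, 2   # arbitrary small prime and base for the demo

        def enc(m):
            return pow(G, m, P)

        def dec(c, max_m=10_000):
            # Brute-force discrete log: try g^0, g^1, ... until we hit c.
            acc = 1
            for m in range(max_m + 1):
                if acc == c:
                    return m
                acc = (acc * G) % P
            return None

        a, b = 1234, 4321
        c = (enc(a) * enc(b)) % P   # add the plaintexts without seeing them
        print(dec(c))               # 5555
        ```

        Note the asymmetry: the homomorphic "addition" is one multiplication, but getting the answer back out costs thousands of operations - and real schemes pay comparable overheads at every step, which is the commenter's point about processing needing comprehensible data.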

    2. chuckufarley Silver badge

      Re: Its quite depressing really

      Well, think of it like this:

      Every modern CPU that suffers from these vulnerabilities has literally billions of transistors. Your higher end CPUs (and GPUs) have more transistors per chip than there will be people on the Earth tomorrow or twenty years from now. It's amazing that we don't have more of these flaws to deal with and that they are not worse. Perhaps there will be more that come to light soon, or in the next decade. What matters is that we find the flaws and learn how to fix them. It's a case of not being able to make progress until we fail and learn from our mistakes.

    3. anonymous boring coward Silver badge

      Re: Its quite depressing really

      Computing is still in its infancy.

      We have only (relatively) recently started accepting that malicious stuff will run on our computers (invited in by making the web able to run stuff locally). We used to think it was the exception that malicious code ran, whereas now it's the norm.

      Most security still stems from only running stuff from trusted sources. The main security holes are "run everything" platforms such as browsers and Flash.

      1. Anonymous Coward
        Anonymous Coward

        Re: Its *very* depressing really

        "We have only (relatively) recently started accepting that malicious stuff will run on our computers (invited in by making the web able to run stuff locally)."

        Are you serious? And, by the by, who is this "we"?

        Long before the era of ubiquitous web access, it was quite popular for someone to send someone else this week's weekly spreadsheet (or whatever), and it has a macro in it that does the equivalent of "format c:". I'm thinking that goes back to the 1990s, when MS and IT departments had discovered email but the unprotected web hadn't become ubiquitous.

        And because the underlying commodity software people and commodity sysadmin people typically had no systematic concept of protecting their important resources (files and filesystems, for example) against inappropriate access - stuff visible that shouldn't be, stuff writable that shouldn't be - such matters being beneath them, outdated relics of a forgotten era where security and access controls *had* to be considered as part of a bigger picture, the rest of us end up two decades later with an industry literally subject to Meltdown.

        Maybe the IT crowd should switch it off and switch it on again and see if it works better afterwards. It seems to be the industry standard approach.

        1. Anonymous Coward
          Anonymous Coward

          Re: Its *very* depressing really

          "Maybe the IT crowd should switch it off and switch it on again and see if it works better afterwards. It seems to be the industry standard approach."

          It is a standard approach because resetting a system in an unknown state to a consistent starting configuration is a logical and efficient way to start.

          As the complexity and interconnectedness, both obvious and invisible, of computing environments increases, and the costs and impact of extended downtime to our lives soars, fast solutions or at least fast diagnosis becomes ever more logical.

          1. Anonymous Coward
            Anonymous Coward

            Re: Its *very* depressing really

            "As the complexity and interconnectedness, both obvious and invisible, of computing environments increases, and the costs and impact of extended downtime to our lives soars, fast solutions or at least fast diagnosis becomes ever more logical."

            These "costs of extended downtime" you mention.

            Who's picking up the costs? The system (hardware, software, etc) suppliers, the end users, the magic money tree?

            E.g. Do readers think the TSB IT people, as part of the diagnosis of the recent and ongoing issues, or the IBM people who were ordered in, by the CEO or whoever, might have tried "switching it off and on again"? Does the process seem to have helped resolve the issues?

            Have readers (some of whom must be TSB customers or TSB staff) been asked whether they care about complexity and interconnectedness, or whether they just might perhaps prefer to get at their money again so they can move it somewhere safer (e.g. shoe box under the bed)?

            Complexity is not a valid excuse.

            "on-off-on provides fast recovery to a known state" might be admissible as a plea for leniency in certain very restricted circumstances.

          2. anonymous boring coward Silver badge

            Re: Its *very* depressing really

            "It is a standard approach because resetting a system in an unknown state to a consistent starting configuration is a logical and efficient way to start."

            It's the standard conditioned approach we use since we have been forced to use fragile systems where various components are allowed to affect each other in unpredictable ways.

            It's a sad state of affairs, that I mainly blame "ctrl-alt-del MicroSoft" for.

        2. anonymous boring coward Silver badge

          Re: Its *very* depressing really

          "Are you serious? And, by the by, who is this "we"?"

          It was a generalisation, of course.

          Obviously people have been tricked into running bad stuff on their machines for a very long time. Thanks, MS, for helping to facilitate this... Why not just run emailed stuff when people click on it? Brilliant!

          With "we" I meant your average home PC user, who, by the way, wouldn't even have had email facilities back in the pre-WWW era. (Yes, I know that _some_ would have had that.)

          It's a fact that the WWW has opened up the possibilities for trojans and viruses massively.

    4. anonymous boring coward Silver badge

      Re: Its quite depressing really

      "Almost as if they are all copying each other......."

      CPU design has been openly discussed in fora since day one.

      Most performance enhancement methods are very well known, and subject of research at universities etc.

      Developers get poached between companies.

      All currently used mainstream CPUs follow the same basic design pattern.

      Performance improvements are in the details of implementation, more than overall architecture.

      Pressure to make the fastest processors would lead to designers doing similar things, perhaps ignoring some obscure and unlikely to be exploited side effects (if they even considered them in the first place).

    5. steviebuk Silver badge

      Re: Its quite depressing really

      Might be because, as an engineer said to me the other day, "Maybe I don't see these things as I'm not criminally minded". This was as I was talking about potential exploits in some software we were using.

      Maybe the engineers that design the CPUs think the same. They just want to design the fastest chip possible and not have to think about the security of it.

      In my mind, as an engineer these days you, unfortunately, do need to think criminally in your work - but only in order to protect yourself from what you think criminals might exploit.

      1. anonymous boring coward Silver badge

        Re: Its quite depressing really

        A better word might be that we have to be paranoid.

        We used to look at things from above. Privilege etc. Now we need to become paranoid and see what ways people can get to us from all sorts of obscure angles.

        1. Anonymous Coward
          Anonymous Coward

          Re: Its quite depressing really

          "A better word might be that we have to be paranoid"

          No, paranoia is believing things that are not true. Paranoids don't generally worry about real risks.

          Criminal minded is the way to go. Criminals look on everything as a potential opportunity for theft. They seek advantage, not fear.

          1. anonymous boring coward Silver badge

            Re: Its quite depressing really

            "No, paranoia is believing things that are not true. Paranoids don't generally worry about real risks."

            Well, before you have found the security flaw, you are indeed worrying about things that may or may not be true.

      2. Anonymous Coward
        Anonymous Coward

        Re: Its quite depressing really

        "the fastest chip possible and not have to think about the security of it."

        That's probably a reasonable starting point for designing a system to run the DOS version of Crysis - no need to consider data security or data integrity, no need for access controls like real computers used to have, just make that frame rate the fastest you can.

        For anything more realistic, there may be other fundamental considerations, along the lines of "should this instruction in this process with these access rights be permitted this kind of access to this kind of object".

        I'm struggling with some of the published descriptions of "rolling back" the consequences of mispredicted speculative execution.

        As far as I understood it, one of the fundamentals of getting speculative execution to work right in the real world (it's not easy, but it's not impossible either given sufficiently clear thinking) is that the results cannot become visible 'elsewhere' (e.g. to other applications), directly or indirectly, until the speculation up to that point is fully confirmed as correct. Hence multiple 'shadow' register sets and reservation stations and other such well documented and (I thought) well understood stuff.

        Shadow register sets provide multiple virtual (God, I hate that word) copies of the real internal processor registers for speculative instructions to play with. Once it's determined which instruction stream gets to execute to completion, all the now-irrelevant copies aren't "rolled back", they're marked as outdated, and only the successful values are allowed to be used for further work. In any case, speculative values *must not* be used for anything that becomes visible in the outside world, e.g. a speculative load that pulls data into the real cache - such an operation cannot be "thrown away", and in the right circumstances it potentially becomes a route for data leakage.

        Part of this is about processor architecture, part of it is about OS security. All of it requires clear thinking, not just a focus on 'how do we make this code sequence run faster' while forgetting the bigger picture - e.g. should this code sequence be permitted to execute at all.

        There used to be people who understood these things.

        1. Michael Wojcik Silver badge

          Re: Its quite depressing really

          There used to be people who understood these things.

          There still are. This is not a problem of understanding. It's a problem of economics.

          Things will change if and when a group of people representing a sufficient concentration of market power come to value particular security measures more highly than other attributes of whatever they're buying.

          And that's how things have always worked. A Honeywell running Multics was a hell of a lot more secure, under many reasonable threat models, than an Apple II. That didn't stop people from buying an Apple II to do their financial analysis with - because security was not an overwhelming economic advantage.

      3. Claptrap314 Silver badge

        Re: Its quite depressing really

        "Maybe I don't see these things as I'm not criminally minded".

        THIS. This is the mentality that made my time in microprocessor validation so...fruitful. This is the same mentality I tried to beat out of my calculus students. It's not lack of criminality, it's lack of rigor.

        I don't know how engineers are trained, but the important part of a mathematician's training is to find the edge cases that you missed the first time around. And the second.

      4. Daniel 18

        Re: Its quite depressing really

        "Maybe the engineers that design the CPUs think the same. They just want to design the fastest chip possible and not have to think about the security of it."

        In part, it's a matter of metrics. Engineers are not particularly rewarded for producing theoretically secure chips, they are rewarded for producing faster chips on time for the sales types to hype them as faster than the competition.

        In part, it's because a few engineers have months or years to design incredibly complicated chips, while many, many attackers - some lavishly supported by nation states, some by criminal organizations, some in a quiet basement somewhere - have decades to find the small flaws that can be exploited.

    6. Tom 7

      Re: Its quite depressing really

      Federico Faggin designed the Z80 in 1974. It was, I'd bet, the last non-RISC CPU that one person could get their head round. Since then people have designed bits of CPUs, but how the whole thing works, along with the none too simple problem of the operating system running on it, is beyond one person's ability to fully understand. If you look at the way these things are being hacked you have to give some kudos to the people doing the hacking - just before you seriously deform their nasal passages.

      I would imagine, now these mechanisms have been uncovered they will be added to a long list of things to check for in future designs.

      Having said that, I can easily see a bright engineer at Intel having spotted this already, but the bean counters decided performance figures were more important than a hopefully sufficiently obscure security flaw.

      1. Daniel 18

        Re: Its quite depressing really

        "Federico Faggin designed the Z80 in 1974. It was, I'd bet, the last non-RISC CPU that one person could get their head round. Since then people have designed bits of CPUs, but how the whole thing works, along with the none too simple problem of the operating system running on it, is beyond one person's ability to fully understand."

        Offhand I don't know the exact date or chip generation, but it's been decades since CPUs were designed directly by humans rather than by human-guided design tools. That has to translate to a lessened understanding of what is going on 'under the hood' in detail... not that humans could do all the circuit analysis the tools do, even in a lifetime, for a chip with billions of transistors, data paths, etc.

    7. Michael Wojcik Silver badge

      Re: Its quite depressing really

      And not only pop out in Intel chips.. but everything else out there too.

      This is not at all surprising if you understand the basic concepts of information thermodynamics.

      A system that dissipates energy, where that dissipation is not a completely unbiased random function, is leaking information. In other words, it has side channels.

      If 1) any of those channels are detectable within the system, and 2) the system contains components with different security domains, then you have a potential violation of security boundaries.

      1 & 2 are true of essentially all general-purpose computing, and much embedded (dedicated-purpose) computing, today. The Spectre class has focused specifically on the side channels created by speculative execution, but that's simply because there are a number of ways in which those channels are detectable from within the system.

      Also, again, and contra Chris: These are not "blunders". They are deliberate design trade-offs. Arguably "oversights" is valid; those trade-offs were made based on incomplete risk analysis. But they were deliberate, and made to achieve the explicit goals of the project.

  5. John Brown (no body) Silver badge

    off by default...

    ...might also mean the fixes slow things down even more, and the risk is low enough that they don't want people complaining of slow(er) systems. Until an attack happens; then another update will make it on by default.

  6. chuckufarley Silver badge

    So who else...

    ...runs NoScript and is glad that they do?

    1. bombastic bob Silver badge
      Unhappy

      Re: So who else...

      "The fourth variant can be potentially exploited by script files running within a program"

      and I too, run 'NoScript' for reasons that now include THAT --^^^

  7. Anonymous Coward
    Anonymous Coward

    At this point I think Meltdown-Spectre is like herpes

    There is no permanent cure. All you can do is manage and limit damage.

  8. Adrian Harvey
    Go

    Analogy in video incomplete

    It would have been nice if the redhat video had extended the quite nice analogy of how speculative execution works to how this vulnerability exploits it. It kind of felt like it leapt from a helpful, high-level analogy - useful for explaining an obscure subject - to "and bad people could exploit this.." It would have been helpful to have an expanded analogy that explained how the speculatively produced bill could lead to another customer receiving your order (or something)

    I can't immediately think of a good way though - anyone else want to have a crack at stretching the analogy to its limits?

  9. Anonymous Coward
    Anonymous Coward

    Anyone wanna buy an abacus??

    In stock and ready to dispatch.

    1. Anonymous Coward
      Anonymous Coward

      Re: Anyone wanna buy an abacus??

      I hope you have paid my license fees on that. I have patent 000,000,001 that covers...

      "Adding or subtracting by movement of device(s) attached to another device(s)."

    2. Tom 7

      Re: Anyone wanna buy an abacus??

      Unfortunately using an abacus in a mobile computing environment is a little insecure.

    3. Rusty 1

      Re: Anyone wanna buy an abacus??

      Yeah, well I have a table made of logs. #0000000002.

      1. GrumpenKraut
        Coat

        Re: Anyone wanna buy an abacus??

        Number 2: logs. Check.

        The one that is somewhat smelly --------->

  10. anonymous boring coward Silver badge

    "Also, to exploit these flaws, malware has to be running on a device"

    Unfortunately, just visiting a website starts all sorts of cra*p running. Draining the battery, flashing useless ads, and other oh-sooo important stuff going on.

    But, yes, these information leak bugs aren't exactly low hanging fruit. Much easier to just fool a gullible user to do something stupid.

    BTW, was it just me who found the explanatory RedHat video explanation not very useful? (I can't quite map waiters running around with how a CPU works..)

    1. Anonymous Coward
      Anonymous Coward

      For we know the breed.

      There's a whole criminal industry around persuading suckers to download and run malware, but the crooks aren't that clever. They can be traced. But nobody seems to bother, we never hear of anyone even getting to court.

      Is it an investigation failure or a reporting failure?

    2. Alan Brown Silver badge

      "I can't quite map waiters running around with how a CPU works.."

      Bistromathics

  11. Anonymous Coward
    FAIL

    Utter speculation to....

    presume that there is some security in Intel chips at all.

    Show me where !?

    The lack of a password at the front door perhaps, is about all.

  12. Jamie Jones Silver badge
    Boffin

    I'm confused..

    I still don't see how this valuable secret data that is now in the cache can be accessed by a third party. Even if it's based on a timing attack, if someone attempts to access data they aren't allowed to read, I'd have thought the cache wouldn't affect the speed of response, because the cached data would be unavailable anyway.

    Also, generally, they say virtual servers are likely to be badly affected, but I'd assume most of the hosts of these servers are not going to be idling, so the CPU shouldn't ever do 'idle-time' speculation - and just to be sure, wouldn't running something like SETI or crack in idle time solve that?

    Which leads me to another thought... CPU idle speculation must have an impact on kernel process scheduling, imagine:

    Case 1: A heavy job runs - not much else running on system. CPU speculates during brief idle times.

    Case 2: A heavy job runs. SETI etc. set to run at idleprio only - so shouldn't ever impact on the heavy job. However, in this case, the heavy job now loses the potential CPU speculative advantage, as the CPU is no longer idling as much.

    Argggh, too much to handle, and I've not had my coffee yet...

    Kittens.

    I like kittens.

    Furry, purry, cuddly kittens...

    Ahhh. Much better!

    1. anonymous boring coward Silver badge

      Re: I'm confused..

      "I'd assume most of the hosts of these servers are not going to be idling, so the CPU shouldn't ever do 'idle-time' speculation"

      It's not that sort of idle time (on a very macro scale). It's running some instructions while waiting for data for some other instructions: out-of-order execution. Making your system busier at the process level won't make any difference.
