AMD, Intel hate Nvidia so much they're building a laptop chip to spite it

Arch-rivals AMD and Intel have put aside their animosity toward each other, and united against a common foe: Nvidia. On Monday the pair revealed they are working on a chip family that will combine an Intel Core x86 processor, a customized AMD Radeon GPU, and HBM2 – high-bandwidth memory – all in one package. The silicon will, …

  1. Roo
    Windows

    Intel's half of the memory architecture slides appears to have been drawn up by the Underpants Gnomes.

    The slides assert that Intel has engineered a teleporting mechanism that magically delivers memory traffic directly to the cores. Poor old AMD, not having any magic to fall back on. :)

    1. Anonymous Coward
      Anonymous Coward

      The "How to deceive with statistics" course, has a second part, "How to deceive with slide decks". I, too, am curious about Intel's teleportation technology.....

  2. Anonymous Coward
    Anonymous Coward

    Gaming on laptops? Rubbish.

    Have they tried to play World of Warcraft on a touchpad? Completely unusable; I kept getting killed in every raid.

    1. Throatwarbler Mangrove Silver badge
      Facepalm

      So . . . I hear there are these external devices, "mice" I believe they are called, which can be connected either via a cable or wirelessly. Also, there are other kinds of games . . . my current laptop is utterly unable to run Civilization VI, for example, which it might be able to do if it had a decent graphics chipset (i.e. not Intel's garbage GPU). Of course, I went for portability rather than performance, which is on me.

      1. Anonymous Coward
        Anonymous Coward

        I have an external device known as "sarcasm"

        1. Mark York 3 Silver badge
          Pint

          As do I... I am known as "The Oncoming Scorn"

  3. Voland's right hand Silver badge

    Frenemies

    Frenemies, we love them...

  4. Anonymous Coward
    Anonymous Coward

    > world of warcraft

    > gaming

    lol. you tried Myst?

    1. Anonymous Coward
      Anonymous Coward

      No? I tried fog once but couldn't see anything.

    2. Mark 110

      "> world of warcraft

      > gaming

      lol. you tried Myst?"

      Two things:

      1. WoW was the first big team sports game played by pro teams trying to beat it. Belittling it because it got a bit shit and fell behind (which I assume is your take) is just being silly.

      2. If you need a graphics card (a gaming rig really, as it's the whole package) to really perform, then an MMORPG is where you test it. Blizzard cut raids from 40 people to 25 because very few people's hardware could handle it, and my experience of taking part in a city raid, where about 500 of us Horde hit an Alliance city, gave me frame rates of about 1ps on very high end hardware.

      AFAIK Myst doesn't demand that the card draw the actions of 500+ people on screen all at the same time.

      1. Curtis

        WoW and Resources

        Remember the Gates of Ahn'Qiraj? That little "event" wiped out everyone on the server because there weren't enough resources on ANYONE's PC to render 1000+ players plus NPCs.

      2. ChrisC Silver badge

        "frame rates of about 1ps on very high end hardware"

        Yes, achieving frame rates of 1 picosecond would require some pretty high end hardware, I guess...

  5. zbmwzm3

    haters gonna hate

    The enemy of my enemy is my friend. However, the fact that Intel is basically feeding AMD just enough to stay alive is concerning. Nvidia also makes the best graphics right now, and that's just a fact.

    1. KegRaider

      Re: haters gonna hate

      Except when it comes to crypto mining. AMD kick Nv's ring on that.

    2. Anonymous Coward
      Anonymous Coward

      Re: haters gonna hate

      unless they take Maxim #29... but more than a few of these are suddenly applicable: 19, 20, 54, 59, etc

      I hope they all keep at it for a good while. Someday I'll be able to afford a petaflop!

    3. FIA Silver badge

      Re: haters gonna hate

      The enemy of my enemy is my friend. However, the fact that Intel is basically feeding AMD just enough to stay alive is concerning.

      That's basically been the case since the 80s/90s, and as long as x86 makes up a significant portion of global CPU sales, that's the way it'll stay. It keeps the anti-trust people at bay. (AMD is the only other company licensed to build x86-compatible chips these days, I believe.)

    4. Halfmad

      Re: haters gonna hate

      Without AMD, Nvidia would chuck out even less optimised cards/drivers each year. They need the competition to keep them pushing, lethargic as they are.

      If AMD folded, you'd have one major supplier of gaming cards and the tech would barely move in the next 10 years except to make them cheaper to manufacture (without passing on the savings).

  6. Oh Homer
    Paris Hilton

    "Nvidia's dominance"?

    Eh? Last time I checked, Intel had ~70% of the total graphics market.

    If Intel also wants to get into discrete graphics, like its fair-weather friend AMD, then maybe it should think about, erm, you know, actually bothering to develop some. For instance.

    Or is the mighty Intel saying that it's too incompetent to compete, on a technical if not financial level, with a comparatively tiny rival, so it has to jump into bed with another comparatively tiny rival for support?

    And anyway, what is Intel even aiming for here? It already owns both the CPU and GPU market, or at least the high end of one and the low end of the other. Is it going full-throttle Gordon Gekko and trying to suck up the last few per cent too?

    Hmm, or maybe Intel cast its gaze over the ho-ryzen®, saw the sun setting on its market dominance, and hit the Panic Button.

    1. Throatwarbler Mangrove Silver badge
      FAIL

      Re: "Nvidia's dominance"?

      Intel may own graphics in terms of "most shipped," but anyone wanting any kind of performance will buy nVidia or possibly AMD. Intel's graphics chips are garbage, managing to be simultaneously poorly-performing, power-hungry, and hot.

      1. Oh Homer
        Holmes

        Re: 'in terms of "most shipped"'

        I'm looking at this from Intel's perspective, not the end user's. Who has the "most shipped" is really all Intel cares about. Intel already owns 70% of the GPU market, so I really don't understand why it's bothering to chase the thin end of the wedge. It's either predatory or stupid, or possibly both.

        You could argue that if the end result is better graphics for 70% of the market, then who cares what Intel's ulterior motive is, but remember we're talking about the 70% who obviously don't give a damn about graphics, otherwise they wouldn't be using Intel in the first place.

        The 70% wouldn't benefit anyway, as their use case could probably still work on graphics hardware from the 1990s. Meanwhile games developers will continue to target Nvidia, as ever, so it's not like Intel users would get a better gaming experience even with improved Intel hardware, since that hardware is and will continue to be completely ignored by developers.

        So who exactly does Intel think it's targeting with this nonsense, and why?

    2. zbmwzm3

      Re: "Nvidia's dominance"?

      What, you don't remember Intel Larrabee? Hold on, just threw up in my mouth a little bit...

      1. Gordan

        Re: "Nvidia's dominance"?

        "What you don't remember Intel Larrabee?"

        Larrabee is what is now known as Xeon Phi.

        1. zbmwzm3

          Re: "Nvidia's dominance"?

          Thanks, thanks a lot, now I've totally thrown up and ruined a perfectly good pair of jeans.

        2. Anonymous Coward
          Anonymous Coward

          Re: "Nvidia's dominance"?

          Larrabee is what is now known as Xeon Phi.

          And coincidentally, Xeon Phi is an anagram of another recently launched, obscenely over-priced piece of tech tat. Although the makers of that hope to turn a pretty penny purely on the Pavlovian instincts of their well trained customers.

    3. John Brown (no body) Silver badge

      Re: "Nvidia's dominance"?

      "Eh? Last time I checked, Intel had ~70% of the total graphics market."

      Well, yeah, since it seems most if not all Intel based PCs are sold with onboard graphics these days. The vast majority of people either can't upgrade (corporate purchases) or don't know it's an option. Why would Joe Average spend extra on a GFX card when their shiny new PC came with one built in with DisplayPort/HDMI/DVI sockets already in place? The only people buying GFX cards are those who have a specific need.

      It's the Microsoft business model.

      1. KegRaider

        Re: "Nvidia's dominance"?

        Exactly!

        My neighbour asked if his computer could run the HTC Vive after I demoed my toy to him. People get rather insulted by the hysterical laughter, and by having it pointed out that their $2000 AIO machine is just a glorified tablet. They don't understand that a $1000 graphics card (alone) is what's required to play games at any sort of resolution and framerate.

        1. Anonymous Coward
          Anonymous Coward

          Re: "Nvidia's dominance"?

          @Kegraider....

          Why do I think you were a little like this?

          https://www.youtube.com/watch?v=vviz-63TMNI

      2. Maventi

        Re: "Nvidia's dominance"?

        > The vast majority of people either can't upgrade (corporate purchases) or don't know it's an option.

        Then there are those like myself who specifically purchase desktops with Intel graphics because their Linux support is second to none. That's especially important when deploying desktops by the hundreds.

        1. Richard 12 Silver badge

          Re: "Nvidia's dominance"?

          I'm glad Intel support Linux, because they don't support Windows.

          So many horrific driver bugs. Take a look at the Chromium blacklist...

        2. phuzz Silver badge
          Stop

          Re: "Nvidia's dominance"?

          "Intel graphics [...] their Linux support is second to none"

          Really?

          Sure, as long as you're using a bang-up-to-date kernel the latest CPUs work OK, but if you're using an LTS release then get ready for basically no graphics support beyond VGA resolutions, or for finding a way to run a bleeding-edge kernel without breaking everything else.

          And don't get me started on the complete non-existence of drivers for Intel "PowerVR" graphics. Especially the stuff integrated into the lower end chips, which would normally be just the sort of hardware you'd want to use for a Linux desktop.

          1. Maventi

            Re: "Nvidia's dominance"?

            @phuzz Agreed, the PowerVR stuff is a complete joke, but if you stick to mainstream desktop CPUs (i5/i7) the situation is much better.

    4. razorfishsl

      Re: "Nvidia's dominance"?

      Because it is integrated into the CPU core.

      So even if it is not used ... it is still "shipped"

    5. naive

      Re: "Nvidia's dominance"?

      >Eh? Last time I checked, Intel had ~70% of the total graphics market

      It would be interesting to find out which "graphics market" that is. As far as I am aware, Intel does not produce any serious graphics hardware. At least going by a Futuremark ranking of #165, with a rating of 3% of an NVIDIA 1080Ti, it seems Intel has some catching up to do.

  7. Throatwarbler Mangrove Silver badge
    WTF?

    Article picture

    So . . . why does the picture accompanying this article feature Mormons with crazy eyes?

    1. Anonymous Coward
      Anonymous Coward

      Re: Article picture

      why does the picture accompanying this article feature Mormons with crazy eyes?

      I don't know. Are there other sorts of Mormons you'd prefer as the pic?

      1. Anonymous Coward
        Anonymous Coward

        Re: Article picture

        I don't know. Are there other sorts of Mormons you'd prefer as the pic?

        Man, you are on a roll today - first the anagram, now this :-)

        1. Anonymous Coward
          Anonymous Coward

          Re: Article picture

          Man, you are on a roll today

          Thank you, thank you. And in the words of a music hall comedian, "throw money, not flowers!"

    2. This post has been deleted by its author

    3. NateGee

      Re: Article picture

      It must be getting late in the day. It took me two attempts and a back-and-forth after reading that as Moomins!

    4. wayne 8

      Re: Article picture

      Not Mormons. Hipsters with a guru.

  8. W. Anderson

    Screwed Again?

    Just when AMD appeared to be getting its act "almost" together - licking Microsoft's anus at the expense of full Linux compatibility and operation being the blockbuster impediment - they move to get screwed by Intel AGAIN!!

    Once Intel has a complete understanding, full technical specifications and grasp of AMD's Radeon GPU technology, the partnership will dissolve - in about 12-15 months - and Intel will announce and produce its own Radeon killer that may or may not compete fully with Nvidia.

    In any case, such development will surely put AMD in last place AGAIN.

    1. Anonymous Coward
      Anonymous Coward

      Re: Screwed Again?

      In any case, such development will surely put AMD in last place AGAIN.

      Intel is a company that can't even produce a working cable modem chipset. I can't see that AMD should be too worried about Intel stealing their graphics lunch, even if they run off with the plans.

    2. wertyporp

      Re: Screwed Again?

      Not a chance. The only way Intel can develop its own GPU is to license IP from either AMD or Nvidia; between them, AMD and Nvidia own basically all of the major graphics IP, right down to the fundamental building blocks. For this deal AMD supplies the whole silicon rather than licensing its IP, and Intel hired Raja to write the software that programs the AMD GPU for Intel, because Raja knows everything about Vega. Besides, Raja signed a non-disclosure agreement when he left. Even if Intel knows everything about AMD's graphics chips, they can't make one, simply due to patents.

  9. Anonymous Coward
    Anonymous Coward

    'a series of slides that trashed AMD'

    Could make some sense strategy-wise, if you view it as an opening salvo aimed at the 'hottie' you're actually trying to 'seduce' later on in the night...

    1. Swarthy
      WTF?

      Re: 'a series of slides that trashed AMD'

      Corporate negging?!

  10. Anonymous Coward
    Anonymous Coward

    So is this about graphics...

    Or getting in on Currency Mining?

  11. Anonymous Coward
    Anonymous Coward

    Rumor has it

    This product exists because of Apple. Specifically, that Apple might switch from Intel x86 to AMD x86 to get a higher-performance integrated GPU than Intel can provide. It's less clear why AMD would go along with this, though, if there was a chance they could sell Apple an all-AMD solution.

    1. Dan 55 Silver badge

      Re: Rumor has it

      A year ago it was announced Apple was working with nVidia. What changed?

    2. Blotto Silver badge

      Re: Rumor has it

      @DougS

      I reckon the AMD mobile chip with Radeon graphics will outperform the Intel equivalent at a cheaper price, providing a viable alternative for the last Apple x86 machines before they go ARM with their own silicon.

      1. Anonymous Coward
        Anonymous Coward

        Re: Rumor has it

        If Apple was going to go with their own SoC they would have done so already. The A11 has performance on par with the latest MacBook according to Geekbench (not that that's a particularly great benchmark, but it is the best we have for cross-platform stuff that runs on both iOS and OS X), so what are they waiting for?

        The only chance left for it is if Microsoft follows through on their promise to introduce full Windows on ARM, including built in x86 emulation. There is probably a large enough audience of Mac owners who either sometimes boot into Windows or run Windows applications via Fusion that going ARM in a way that abandoned them would not be smart.

        Anyway, I still think the best way for Apple to support running OS X on ARM is to provide for installing the full OS X API and GUI as an "app" on their iPhone. Plug it into a monitor, provide a keyboard and mouse, and there you go. Maybe down the road you go ARM on Mac, once ARM has become the dominant platform for OS X applications because of all the people using their phone as an "occasional desktop", but first things first.

  12. Anonymous Coward
    Mushroom

    Wha??

    Is it just me, or has the world gone completely bonkers?

    1. Mark York 3 Silver badge
      Thumb Up

      Re: Wha??

      'Did I do anything wrong today or has the world always been like this and I've been too wrapped up in myself to notice?'

      I think it's taken a turn for the worse, but that's probably just me getting old.

  13. the Jim bloke
    Big Brother

    Orwellian Alliances

    Oceania together forever with its eternal brother Eastasia/Eurasia against its implacable archenemy Eurasia/Eastasia...

  14. inmypjs Silver badge

    It will need Win 10...

    which makes it a worthless piece of crap regardless.

  15. LaeMing
    Boffin

    One assumes...

    AMD is doing this because they analysed the numbers and the results came up positive.

    1. Richard 12 Silver badge

      Re: One assumes...

      AMD are pretty desperate to make some money, but they need to be very, very careful.

      Intel have chewed up and spat out a lot of other companies.

      1. Anonymous Coward
        Anonymous Coward

        Re: One assumes...

        It's the ATI drivers I have issue with.

        I have never encountered stable ATI gfx drivers.

  16. cat_mara

    Both Chipzilla and mini-Chipzilla are keeping quiet...

    Chipzilla and Chipzuki, shurely?

  17. John Savard

    Not long for this world?

    I would have thought that a Xeon Phi would be vastly simpler to program than a GPU from either NVIDIA or AMD/ATI if what one wants to do isn't graphics, but general number crunching. Although GPUs are less stereotyped in the kinds of calculations they can perform than they once were, they're still quite limited.

    The advantage of the GPU is that despite being harder to program - and maybe not even being helpful at all, depending on your problem - it offers, for the same price, an awful lot more floating-point operations per second.

    A possible sweet spot is a vector processor, like the upcoming Aurora from NEC. That has a limitation, in that it requires arrays rather than individual numbers to work on for full efficiency, but is otherwise as flexible as a regular CPU.
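
    For what it's worth, here's a rough, made-up sketch of the "harder to program" point (my own toy example, nothing official from Intel, Nvidia or NEC): summing an array on a conventional core, or a Phi-style many-core part, is just an ordinary loop, while even a naive GPU version needs its own kernel, explicitly managed device memory, and some thought about how the threads combine their partial results.

    #include <cstdio>
    #include <cuda_runtime.h>

    // What the CPU / Xeon Phi style version boils down to: an ordinary loop.
    float sum_cpu(const float* x, int n) {
        float s = 0.0f;
        for (int i = 0; i < n; ++i) s += x[i];
        return s;
    }

    // Naive GPU version: every thread adds its element into one shared total.
    // (Real code would use a tree reduction; atomicAdd just keeps the sketch short.)
    __global__ void sum_gpu(const float* x, int n, float* total) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n) atomicAdd(total, x[i]);
    }

    int main() {
        const int n = 1 << 16;                      // arbitrary size
        float *x, *total;
        cudaMallocManaged(&x, n * sizeof(float));   // unified memory to skip explicit copies
        cudaMallocManaged(&total, sizeof(float));
        for (int i = 0; i < n; ++i) x[i] = 1.0f;
        *total = 0.0f;

        sum_gpu<<<(n + 255) / 256, 256>>>(x, n, total);
        cudaDeviceSynchronize();

        printf("cpu: %.0f  gpu: %.0f\n", sum_cpu(x, n), *total);  // both print 65536
        cudaFree(x);
        cudaFree(total);
        return 0;
    }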

  18. Phil W

    Not that again....

    "It's quite possible the Core processor could keep its built-in Intel GPU, activating the beefier AMD Radeon part as needed"

    Let's hope not; Optimus was shit. Lots of games would detect both cards and insist you pick one, but picking the Nvidia resulted in no display. Or they simply couldn't detect the Nvidia at all.

    Even when it did work and the Nvidia chip did the graphics work, it did it poorly compared to systems with just the Nvidia graphics chip without Optimus.

    1. Dave 126 Silver badge

      Re: Not that again....

      Game engines, graphics drivers and OSs are all a few generations on from when Optimus was first introduced.

  19. mark l 2 Silver badge

    I am still not sure how the world's largest chip maker can be so bad at making graphics chips that they have to go to their rivals, who manage to outperform them with a much smaller R&D budget.

    I have always been a fan of AMD; my first self-build PC had an AMD K6 CPU and an ATI Rage 2 graphics card.

    1. imanidiot Silver badge

      Because designing graphics chips means optimising for processing lots and lots of data in parallel at the same time. A CPU is a mostly serial device, doing only a few calculations at a time, but doing them with much better precision, specific fault handling, etc, etc.

      It's like asking why Lamborghini don't make TGVs. Surely they both go really fast on wheels? Can't be that different.
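
      To make that contrast concrete, here's a minimal sketch (my own illustration with made-up names and sizes, not anything from Intel or AMD): the same element-wise operation written once as a serial CPU loop and once as a CUDA kernel that throws one lightweight thread at every element.

      #include <cstdio>
      #include <vector>
      #include <cuda_runtime.h>

      // CPU version: one core walks the array one element at a time.
      void scale_cpu(float* data, int n, float k) {
          for (int i = 0; i < n; ++i) data[i] *= k;
      }

      // GPU version: thousands of threads each handle a single element in parallel.
      __global__ void scale_gpu(float* data, int n, float k) {
          int i = blockIdx.x * blockDim.x + threadIdx.x;
          if (i < n) data[i] *= k;
      }

      int main() {
          const int n = 1 << 20;                    // ~1M floats, arbitrary size
          std::vector<float> host(n, 1.0f);
          scale_cpu(host.data(), n, 2.0f);          // serial: one core, n iterations

          float* dev = nullptr;
          cudaMallocManaged(&dev, n * sizeof(float));
          for (int i = 0; i < n; ++i) dev[i] = 1.0f;
          scale_gpu<<<(n + 255) / 256, 256>>>(dev, n, 2.0f);   // parallel: ~n threads
          cudaDeviceSynchronize();

          printf("cpu: %.1f  gpu: %.1f\n", host[0], dev[0]);   // both print 2.0
          cudaFree(dev);
          return 0;
      }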

  20. R3sistance

    Good old Tegra

    This is because the Tegra chipset is so awesome, right? Right now Intel might be well in front, but as the PC market continues to grow and ARM continues to try to break into the server market, that position could come under threat, more so with the Tegra core, which would give both CPU and GPU in a single package.

    Heck, the AMD monopoly on the games console market was broken by Nvidia with the Nintendo Switch, which uses a Tegra. I am honestly not aware of any directly competing product that rivals the Tegra right now. On top of that, Nvidia is also pretty much leading self-driving car technology and has a strong presence in supercomputers. Certainly in the future Nvidia might overtake Intel, but that just makes AMD's move here even stranger, since Intel is bringing in something like five times the income of Nvidia and AMD combined. I don't see this ending well for AMD.

  21. x 7

    Intel have done this before... there was a period when they were shipping ATI chipsets as Intel-badged parts for the bottom-end market. Can't remember exactly when - replacements for BX/ZX chipsets maybe? But it marked a massive drop in sales of AMD-based systems not long after.

  22. Anonymous Coward
    Anonymous Coward

    Dear AMD, you really should read the fable of The Scorpion and The Frog.

  23. 2cent

    1 + 2 = 3?

    What is not being said is where the HBM2 is coming from. If it's made by Intel, no more shortages. If it is not, the price will come down. If there is a hardware cross-platform agreement between Intel and AMD on this technology, we will see HBM3 a lot sooner and a lot faster. In any case, Nvidia will have to (A) make their own, (B) buy it from current suppliers - Samsung/SK Hynix - or (C) buy it from Intel.

    It is unlikely that GDDR5 will save NVidia.

  24. eclairz

    Nvidia got the best console in the Nintendo Switch

    In the end Nvidia was able to quickly take some of the console market away from AMD, who were in all the consoles, while Nvidia were focusing on portable gaming with their Shield and Nintendo's Switch, which probably have higher margins. AMD are currently used by Xbox and PlayStation, so it makes sense for AMD to fight them with APUs, but I guess Intel wanted some action too, since they missed out on the console market due to their GPUs sucking. Maybe AMD needed the money and Intel the expertise.

  25. ntevanza

    Molestation, but is it sexual?

    The molester is the one with the power. I bet this would never have come to light if it hadn't been for Harvey Weinstein.

    AMD will wake up in the morning feeling dirty and alone.
