Intel gives the world a Core i9 desktop CPU to play with

Intel's Core range of CPUs now comes in a new "family" and has a new upper limit. Unveiled at the Computex trade show in Taiwan, the new family is dubbed the “Intel Core X-series processors family”. One member of the family is the Core i9 Extreme Edition, which takes the Core range's upper limit from a “7” to a “9”. The …

  1. redpawn

    At 140 Watts...

    Could it be used to keep a supply of water hot for tea? Seems a shame not to use the considerable heat.

    1. Flocke Kroes Silver badge

      Re: At 140 Watts...

      The bad news is that you need Tcase for your Intel tea maker, not Tjunction. A reasonable guess is about 70°C, which is sufficient for green tea. Heating time for a well-insulated kettle is 4200*(Thot-Tcold)*mass/Power, where 4200 J/(kg·°C) is the specific heat of water. Cold water from the tap is about 10°C, so a 140W processor makes 1kg of hot water every half hour, or one cup every 7.5 minutes. Green tea is usually served in smaller cups, so you can have a fresh cup every 4.5 minutes.

      It looks like Intel have found a killer app that totally trashes the Raspberry Pi, which would take over two hours to make a cup of green tea.
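
      For anyone who wants to check the kettle maths, here is a quick sketch in Python (the ~4W Raspberry Pi figure and the cup sizes are assumptions, and it charitably assumes every watt of CPU heat ends up in the water):

        # Back-of-the-envelope tea arithmetic from the comment above.
        SPECIFIC_HEAT_WATER = 4200  # J per kg per degree C

        def heating_time_s(power_w, mass_kg, t_cold_c=10.0, t_hot_c=70.0):
            """Seconds to heat mass_kg of water from t_cold_c to t_hot_c,
            assuming all of the chip's heat reaches the water (no losses)."""
            return SPECIFIC_HEAT_WATER * (t_hot_c - t_cold_c) * mass_kg / power_w

        print(heating_time_s(140, 1.0) / 60)     # ~30 min for 1 kg on a 140 W chip
        print(heating_time_s(140, 0.25) / 60)    # ~7.5 min per 250 ml mug
        print(heating_time_s(140, 0.15) / 60)    # ~4.5 min per small green-tea cup
        print(heating_time_s(4, 0.15) / 3600)    # ~2.6 hours on a ~4 W Raspberry Pi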

      1. Brian Miller

        Re: At 140 Watts...

        Yeah, but then every half hour you'd have to shut the computer down to put on another kettle.

        Oh, wait, ... Windows ... right ...

        1. Anonymous Coward
          Anonymous Coward

          Re: At 140 Watts...

          "Oh, wait, ... Windows ... right ..."

          Presumably. Can't see anyone wasting this sort of hardware on Linux. DirectX 12 ftw....

    2. macjules
      Coat

      Re: At 140 Watts...

      It's just in case you need ... like .. that extra push you know.

    3. Horridbloke
      Devil

      Re: At 140 Watts...

      I once set up my PC to vent its hot air into a growing enclosure to help the chillies along. That was a Core 2 Quad Q6600, IIRC.

      1. tedleaf
        Happy

        Re: At 140 Watts...

        Me too, except it was a pair of old Dell Precision 4500s (if I remember right) from the skips at my work site, old SCSI drives etc. Mine wasn't chillies, but it did help me chill out!!!

    4. BinkyTheMagicPaperclip Silver badge

      Re: At 140 Watts...

      It's not actually that bad; the worst recently were AMD's 9xxx series at 220W. These chips fit a lot of cores into a 140W thermal envelope, and restrict the clock speed based on that.

      You might like my new system, dual E5-2690 (v1, i.e. old (2012), quite fast, cheap-ish, lots of cores) - 135W each, with two GTX 480s flashed to Quadro 6000s. Those are 250W GPUs, so if everything is at full chat, it'll be using over a kW of power.

      Yes, I did buy them before Ryzen was out, in case anyone asks. Ryzen isn't as good as the Intel alternative, but for half the price it delivers far more than half the performance (unless you're using VME, which is currently broken on it).

    5. ChrisC Silver badge

      Re: At 140 Watts...

      A really nice hot cup of tea, excellent. Now, where did I put my Bambleweeny 57...

  2. Old Used Programmer

    Where have I seen this idea before...?

    Compute Card internal to a monitor? Sounds familiar... Oh, yeah... NEC last October. Professional grade monitors that can have a Raspberry Pi Compute Module internally.

  3. Youngdog

    But I don't want more cores!

    I want that 4.3GHz clock speed of the 7740X with the memory support and PCIe capacity of the other two.

    Can someone who knows more than I obviously do about CPU/Chipset architecture please explain why Intel won't let me have my cake and eat it?

    1. ArrZarr Silver badge

      Re: But I don't want more cores!

      It's due to yield. AFAIK, when a chip is created, not all the PCIe lanes pass validation. If too many fail, that chip is bumped down a notch.

      There will probably be another version of the 7740X, perhaps called the 7760X or some such, with the same core count and clock speed but full-fat PCIe.

      For a similar historical situation, have a look at the 5820k and the 5930k - identical chips but the former had failed validation on more lanes so had a reduced PCIe capability.

    2. naive

      Re: But I don't want more cores!

      This is spot on. My game uses 1 (ONE) core of my expensive i7. The rest are idling around.

      All these cores for desktops are nonsense. They would do better to use the transistor space on the CPU die to make a super-fast 2-core with huge L1/L2/L3 caches so memory access can be minimized.

      Strange that nobody is thinking about this; they keep making CPUs suitable for parallel processing that no software maker can do anything useful with.

      1. hmv

        Re: But I don't want more cores!

        Sigh.

        Just because you only run games doesn't mean the rest of us do. I do use cores.

        1. J. Cook Silver badge

          Re: But I don't want more cores!

          As do I; Handbrake will use every drop of processor you can feed it when transcoding a 25/50 GB Blu-ray rip down to something manageable in size and in a reasonable amount of time.

          My current machine (i7 6700) manages it in ~20 minutes, whereas the last machine (an old Precision 5500) was more or less real time (2 hours plus).

          I figure I should get ~8 years' service out of it, which is what I got from the last set of computing hardware I bought.

      2. Anonymous Coward
        Anonymous Coward

        Re: But I don't want more cores!

        "This is spot on. My game uses 1 (ONE) core of my expensive I7. The rest is idling around."

        Some games use all cores; have a look at Hitman (2016), which can potentially bottleneck at the CPU level, even with a very good GPU.

        And this is going to become more and more frequent, with more and more games going multi-threaded now that the GPU APIs allow it (DX12, Vulkan).

      3. ChrisC Silver badge

        Re: But I don't want more cores!

        Start up your PC and let it boot into the OS. Now, without manually starting up *any* apps, games or whatever else you might use your PC for, open up the task manager and see just how much stuff is already running in the background...

        Your favourite game might be so badly coded that it genuinely can only use a single core, but even then your gaming experience will be enhanced by having additional cores available to handle all the other crap that a modern PC will want to be running at the same time. Oh sure, for each specific workload there'll always be a question over whether x cores at y GHz vs (n*x) cores at (y / m) GHz gives the best performance, but the long term trend seems to be heading straight down the road signposted "More Cores Please".

        Personally speaking, I can't wait to see these multi-core beasts hit the market, so long as the renewed level of competition between Intel and AMD keeps prices at a sane level - I could really do with refreshing my desktop system at some point in the next year...
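
        One way to put rough numbers on that "x cores at y GHz vs (n*x) cores at (y/m) GHz" question is an Amdahl's-law-style estimate. A minimal sketch, assuming illustrative clocks (the 4.3 GHz figure is the 7740X mentioned above; the 2.6 GHz 18-core clock is just an assumption) and a parallel fraction p that varies wildly per workload:

          # Rough Amdahl's-law comparison: fewer fast cores vs many slower cores.
          # p is the assumed fraction of the work that scales across cores.
          def relative_throughput(cores, clock_ghz, p):
              serial_time = (1.0 - p) / clock_ghz
              parallel_time = p / (clock_ghz * cores)
              return 1.0 / (serial_time + parallel_time)

          for p in (0.5, 0.9, 0.99):
              few_fast = relative_throughput(4, 4.3, p)    # e.g. a 7740X-style part
              many_slow = relative_throughput(18, 2.6, p)  # assumed 18-core clock
              print(f"p={p}: 4x4.3GHz -> {few_fast:.1f}, 18x2.6GHz -> {many_slow:.1f}")

        At p=0.5 the four fast cores win; by p=0.9 the many slower cores pull ahead, which is roughly the "More Cores Please" trend described above.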

      4. This post has been deleted by its author

      5. TheVogon

        Re: But I don't want more cores!

        "This is spot on. My game uses 1 (ONE) core of my expensive I7. The rest is idling around."

        Quite - this seems to be mostly overkill. Scorpio's custom 8-core AMD CPU only runs at 2.3 GHz and can max out a 6 TFLOPS GPU (that's roughly equivalent to an Nvidia GTX 1070).

        1. Anonymous Coward
          Anonymous Coward

          Re: But I don't want more cores!

          Haven't benchmarked Scorpio yet, but a 6-core XB1 build roughly matched the same game running single-threaded on an FX-8370, without the benefit of DX12 parallelism. I'm expecting Scorpio to compete with 2 FX cores. So huge core counts aren't needed for game players.

          Building games, though, I much prefer 5-hour Ryzen clean build times with 8 cores/16 threads to 9 hours with 4 core pairs/8 threads on FX at a much higher clock and power drain. A 16-core Threadripper will probably be the sweet spot before diminishing returns for my workloads.

          1. TheVogon

            Re: But I don't want more cores!

            "Haven't benchmarked Scorpio yet but a 6 core XB1 build "

            Microsoft have described Scorpio as a "full custom CPU design", so I would assume it's likely a fair bit faster than the 31% clock speed uplift might indicate... Sony, meanwhile, went with non-customised Jaguars on the PS4 Pro.

    3. Nimby

      Re: But I don't want more cores!

      Here here! I would also be perfectly happy with a higher-GHz 4-core (8-thread) CPU with more PCIe lanes! 44 lanes? I can fill that easily. I would love to RAID5 four M.2 cards as my primary drive and RAID10 four SATA spinning-rust drives for document/data storage. Add another SATA port for a Blu-ray player. Add two GPUs in SLI with a full 16 lanes each. Of course you have all of those shiny new USB3 controllers, right? Ethernet. Audio. And... Gah! Never enough lanes.
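
      A minimal sketch of that lane budget, assuming the usual x4 per NVMe M.2 drive and x16 per GPU (exact widths depend on the board):

        # Rough CPU-side PCIe lane tally for the wishlist above (assumed widths).
        wishlist = {
            "2 x GPU in SLI at x16":       2 * 16,
            "4 x M.2 NVMe in RAID5 at x4": 4 * 4,
        }
        cpu_lanes_wanted = sum(wishlist.values())
        print(cpu_lanes_wanted, "CPU lanes wanted vs 44 available")  # 48 > 44 already

        # The SATA drives, Blu-ray, USB3, Ethernet and audio typically hang off the
        # chipset instead, but they all share one x4-equivalent link back to the
        # CPU, so the squeeze moves rather than disappears.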

      Oh, sure, there are a lot of services running in the background and blah blah blah, but most things don't really use much CPU. I have plenty of processing power for everything I do. A "standard" gaming rig is bad enough, but hard drives are depressingly slow, and as soon as you try to counteract that, splat. Faceplant into the old PCIe wall.

      For too long I have been designing systems around how I want to suffer. For once I'd love to build a system where everything I want works the way it always should have. No compromises.

      1. Anonymous Coward
        Anonymous Coward

        Re: But I don't want more cores!

        "Here here!"

        Where where?

  4. Shadow Systems

    Ahh, I love the smell of competition.

    Intel trickle-feeds us a few megahertz here, a few there, a couple of cores here, a couple there, & charges handsomely for the "privilege".

    AMD tosses us a CPU with more cores & arguably better performance at a far lower price, & suddenly Intel scrambles to offer "something better".

    Coincidence? I think not.

    I love the fact that the big-name computer vendors either currently have, or "plan on offering soon", AMD-based desktops & laptops for customer choice. I wonder what price differences we can expect to see in such systems, given Intel's propensity to make our megahertz mega-hurts?

    (I'm not sure about that pun, but I'll leave it because it's poking fun at chipzilla.)

    Competition: it's the lube the customers use to get corporations to get up off their ass & start offering REAL value for our money!

    1. TReko

      Re: Ahh, I love the smell of competition.

      Very well put.

      The main reason desktop CPUs have stagnated in the last few years is the lack of competition for Intel.

      I hope AMD is not destroyed by this.

    2. James 51

      Re: Ahh, I love the smell of competition.

      Every time I have looked at a laptop with AMD in it, it has been underpowered and overpriced, and the same goes for desktops. That is not AMD's fault, but if manufacturers and resellers fumble the ball again it will be difficult for AMD to get the market share their products deserve.

      1. Naselus

        Re: Ahh, I love the smell of competition.

        "That is not AMD's fault but if manufactures and resellers fumble the ball again it will be difficult for AMD to get the market share their products deserve."

        Tbh, it really was AMD's fault. Prior to Ryzen, their most recent competitive offering in the desktop space was probably the Athlon 64 from 2003. They ended up in marginal desktops for the last decade or so because desktop Opteron and onwards genuinely were marginal processors, running years behind Intel's Core series. Performance-wise, a high-end machine had to have an Intel chip.

        It's just the pattern in the processor market, really; Intel overwhelmingly dominant at almost all times but with AMD injecting a tiny bit of competition every 15 years or so.

        1. Tannin

          Re: Ahh, I love the smell of competition.

          Well, not entirely. AMD have had several brief tilts at superiority, just as you say, but they also had one very long period of clear superiority on almost every metric. This was back when clock speeds were moving through (roughly) the 800MHz to 3000MHz range. Intel's Pentium III was reasonably competitive but way too dear; the Pentium 4 was hopelessly outclassed for its entire market life, and as for Intel's wrong-headed fetish for the disaster called Rambus, the less said the better.

          Intel's then-new Core chips leveled the playing field, and the Athlon replacements were pretty sad efforts. As you were.

        2. James 51

          Re: Ahh, I love the smell of competition.

          @Naselus This laptop retailed for £399 when it was new:

          http://www.currys.co.uk/gbuk/computing/laptops/laptops/hp-14-an060sa-14-laptop-silver-10156921-pdt.html

          and if you look at notebookcheck:

          https://www.notebookcheck.net/AMD-E-Series-E2-7110-Notebook-Processor.144996.0.html

          That CPU is only fit for a netbook. I've seen A4 laptops for £350 and £399 as well, which is just crazy money.

          1. Naselus

            Re: Ahh, I love the smell of competition.

            @James 51 - not sure what your point is? That AMD were able to churn out shit processors for shit notebook computers while Intel were dominating all the actually profitable spaces quite totally? You can pick up an i3 laptop for £400 which will comfortably out-perform the E-series on more or less every metric.

            Don't get me wrong - when AMD do deliver, they produce great equipment at amazingly low prices. But the general rule for probably 25 of the last 30 years has been that an AMD processor line is inferior to an equivalent-spec Intel processor line (with the occasional very honorable exception). They ended up in low-end kit because by definition they cannot be present in high end kit - a laptop with an AMD processor in it is a cheap laptop with low-end hardware because the AMD processor IS cheap low-end hardware.

  5. Anonymous Coward
    Anonymous Coward

    Obligatory

    How well does it run Crysis?

    1. James 51

      Re: Obligatory

      Without an on-die GPU, it won't.

  6. anonymous boring coward Silver badge

    Only $1000 if you buy 1000 units?

    1. Robert Moore

      "Only $1000 if you buy 1000 units?"

      You would be a fool NOT to purchase at that price.

  7. Andy The Hat Silver badge

    I'm too old ...

    My first PC had 1/13 the DRAM of the L3 cache on that mother ... and it was about £50 per MB stick.

    Best I go service my Zimmer frame ...

  8. Anonymous Coward
    Anonymous Coward

    Dangling unpleasantly.

    > Intel thoughtfully supplied a short HDMI cable with the stick, but that just made it dangle unpleasantly.

    Sticky velcro?

    1. Charles 9

      Re: Dangling unpleasantly.

      Not that useful if the back of the unit's covered with ribs and air vents. Anything sticky's likely to melt off.

  9. TooManyChoices

    Well when AMD brought out the Ryzen 7, Intel had to have a bigger number.

    Now I'm waiting for the Ryzen 11, then the Intel 13, then the Ryzen 15 and the Intel 17

    .

    .

    .

    .

    Ryzen 999999 ......

    1. kain preacher

      Nope, it's called Threadripper. It's 16 cores, 44 PCIe lanes. Asus has a prototype board that has 8 DIMM slots and 2 M.2 slots.

      If you really want to go nuts, the EPYC chip that is replacing the Opteron has 128 PCIe 3.0 lanes and 32 cores.

      1. Anonymous Coward
        Anonymous Coward

          Sorry to correct you, but I watched the presentation: Threadripper has 64, yes 64, count 'em, PCIe lanes available to ALL its CPUs. That's right, no limited-per-CPU shenanigans like Intel have pulled.

  10. Arthur the cat Silver badge

    Core 11 and Spinal Tap jokes

    It strikes me that "Core i11" is far too close (depending on font) to "Core ill" and Intel would want to avoid sick jokes. Similarly, "Core i13" is going to trigger too many superstitions to be a good name. If they do introduce a new level, either they're going to have to switch to hex - "Core iB" - or, given the random fits of pointy-headedness that tend to strike marketing departments, they'll have a complete image change and call it something like "Thrasher 42" and baffle everybody.

  11. Cuddles

    Workstations

    "an 18-core, 36-thread beast may well excite some workstation buyers"

    Probably not, given that it's been possible to get 22-core Xeon processors for some time now. It will be nice when Kaby Lake finally comes to Xeon properly, but I can't imagine anyone buying a single-socket-only gaming CPU that supports a maximum of 64GB of non-ECC RAM and has a quarter of the cache, and actually considering it a workstation.

    Speaking of which, I'm pretty sure these are Kaby Lake, not Skylake as stated in the article. Certainly Intel's website suggests that's the case, and I can't imagine why they'd be releasing new Skylake parts at this point.

  12. Anonymous Coward
    Anonymous Coward

    Threadripper..

    Why do I keep reading that as Threadcrapper?

    Why post as anon? Because I have a job that I would like to keep, thank you.

    Can't be too paranoid... It scares me when I see people posting on FB forums that handily show where they work underneath their name...

  13. localzuk Silver badge

    Pfft, cores. Who needs em?

    Minecraft only uses 1 core for most things. I'm still waiting for a 10GHz chip so I can run a modpack properly. :D

    1. Anonymous Coward
      Anonymous Coward

      Re: Pfft, cores. Who needs em?

      "Minecraft only uses 1 core for most things"

      Only if you run the Java crap. The Windows / Xbox .Net versions are fully multi-threaded.

      1. localzuk Silver badge

        Re: Pfft, cores. Who needs em?

        The modding community is still very much Java based.

        1. stephanh

          Re: Pfft, cores. Who needs em?

          I understand Adventure/Colossal Cave can also exploit 1 core max.

          If one of the most seminal computer games doesn't support multi core, what is the point?

  14. Randy Hudson

    Not for gamers

    pitching the range at … gamers

    99% of games see no benefits beyond 4 cores. Once you have 4 cores, you want to increase sustainable clock speed, not core count.

  15. Anonymous Coward
    Anonymous Coward

    another chip ...

    .. another CPU ?.. massive multi-core is the way to go, software needs to keep up though,

    .. wake me up when something interesting happens

  16. conscience

    Considering what this will cost, I'll be sticking with AMD.
