You don't need no STEEENKING GPU, says Intel

Intel is again trying to argue that integrated graphics can trump dedicated graphics processing units, cooking up a new Xeon tuned for video processing. Chipzilla's been here before: its Iris Pro graphics made it into last year's Xeons. This year's Xeon E3-1500 v5 is pitched at those who want to stream 4K video. It's …

  1. Pascal Monett Silver badge

    Intel has always tried to pull that one

    An on-board GPU is good enough for displaying 2D Windows desktops for work or browsing, no doubt there.

    I don't care what Intel tries to push: if I want to game or do heavy video computing, a discrete board from a specialized company is where I will seek the solution.

    1. Charles 9

      Re: Intel has always tried to pull that one

      In which case you're not Intel's target market. Dedicated gamers will build their own rigs or buy high-spec gaming rigs. Intel knows this. And graphics professionals will be using professional workstations, again with dedicated GPUs (only this time in their particular branch). Intel's not targeting them, either.

    2. Crazy Operations Guy

      Re: Intel has always tried to pull that one

      I have a laptop with an i7-4650U, using the built-in GPU (HD 5000). I was able to play BioShock 1 and 2 and Fallout 3 / New Vegas without any issues (1080p at 30-40 FPS). It can't compare to one of the big dedicated cards, but it's not bad enough to just write it off as useless.

  2. Anonymous Coward
    Anonymous Coward

    Bloody 'Ell

    Not again. Please? It doesn't work for serious graphics, and why would one want a Xeon for 4K streaming, pray tell?

    1. P. Lee

      Re: Bloody 'Ell

      >why would one want a Xeon for 4K streaming, pray tell?

      The only reason I can think of is to split a noisy workstation from an ultra-quiet frontend. Two 4K streams suggest a dual-monitor setup and a fairly good LAN connection.
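
      Back-of-envelope sums on that LAN requirement (a minimal Python sketch; the 30 Mbit/s per-stream figure is a typical 4K encoding rate assumed for illustration, not anything Intel quotes):

          # Bandwidth for two 4K60 streams, raw versus encoded
          raw_bps = 3840 * 2160 * 60 * 24   # ~11.9 Gbit/s per uncompressed stream
          print(2 * raw_bps / 1e9)          # ~23.9 Gbit/s raw: hopeless on any LAN
          print(2 * 30e6 / 1e6)             # 60.0 Mbit/s encoded: fine on gigabit Ethernet

      The streams only travel sensibly in compressed form, which is exactly the job the chip's fixed-function video engine is there to do.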

    2. Crazy Operations Guy

      Re: Bloody 'Ell

      I'd think they're targeting the folk on the server end of video streaming. As for business use, 4K resolution = 4x 1080p monitors, so that's at least four thin clients supported per chip without having to fork over $10,000+ to nVidia for one of their VDI-centric video cards (and without the ridiculous charge from the server manufacturer for a system that supports a PCIe x16 slot).
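
      A quick check of that pixel arithmetic (a trivial Python sketch; the thin-client count is the commenter's own inference, not an Intel spec):

          # One UHD 4K frame holds exactly four 1080p frames' worth of pixels
          uhd = 3840 * 2160   # 8,294,400 pixels
          fhd = 1920 * 1080   # 2,073,600 pixels
          print(uhd // fhd)   # -> 4, i.e. one 4K output could feed four 1080p thin clients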

    3. Bob H

      Re: Bloody 'Ell

      This chip really suits IPTV providers, streaming companies and broadcasters. Previously you bought an appliance costing tens of thousands of pounds to do transcodes on ASIC chips; they provided excellent quality, but the cost hurt. Many encoder and transcoder companies have been moving to software and cloud solutions in recent years, which has hit the video ASIC market. With a chip like this providing over a dozen transcodes in a 45W TDP for $450 (rough per-stream sums below), I can see it being very attractive in my line of work.

      The VDI business is interesting, but look beyond that to the encoding space and there will be people jumping on this chipset when it hits the street.
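
      The per-stream economics behind Bob H's point, as a hedged Python sketch (the 30,000 appliance price and 16-channel density are illustrative assumptions, not vendor figures; the Xeon numbers come from the comment above):

          # Rough cost and power per transcode: ASIC appliance vs Xeon E3-1500 v5
          asic_price, asic_channels = 30_000, 16   # assumed appliance cost and density
          xeon_price, xeon_channels = 450, 15      # $450 chip, "over a dozen" streams
          print(asic_price / asic_channels)        # 1875.0 per channel on the appliance
          print(xeon_price / xeon_channels)        # 30.0 per channel (chip cost only)
          print(45 / xeon_channels)                # 3.0 W of TDP per concurrent stream

      Even allowing for the rest of the server around the chip, that is a gap of well over an order of magnitude, which is why the ASIC vendors are hurting.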

  3. Lusty

    AWS

    Given that AWS do VDI, streaming and transcoding, it wouldn't surprise me if Amazon asked for this, what with their close Intel ties and all. The fact that other people might be interested too is a happy coincidence, as Intel will probably sell enough to Amazon alone to make it worthwhile.

  4. Anonymous Coward
    Anonymous Coward

    Or no doubt..

    ...for the same cost as this chip, you could employ a cheaper processor and a decent graphics card with far more power for this sort of job.

    1. Hans 1
      Boffin

      Re: Or no doubt..

      >Or no doubt..

      >...for the same cost of this chip, you employ a cheaper processor and a decent graphics card with far more power for this sort of job.

      You know what TDP stands for, right?

      Now, in your small office, that makes sense ... nobody is gonna complain much ... but in the datacenter, where heat IS a problem, you really do not want dedicated power-hungry, heat-emitting graphics cards, do you?
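
      To put rack-level numbers on Hans 1's point (a sketch; the 10 kW rack budget and the 80 W host CPU plus 250 W discrete GPU are typical figures assumed for illustration):

          # Sockets per rack under a fixed power budget: integrated vs discrete graphics
          budget_w = 10_000           # assumed 10 kW of rack power and cooling
          igpu_w = 45                 # Xeon E3-1500 v5 TDP, graphics included
          dgpu_w = 80 + 250           # assumed host CPU plus discrete GPU card
          print(budget_w // igpu_w)   # 222 integrated parts per rack
          print(budget_w // dgpu_w)   # 30 CPU-plus-GPU pairs per rack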

      1. Anonymous Coward
        Anonymous Coward

        Re: Or no doubt..

        Yes. If you need power-hungry, heat-emitting compute units, then yes. A nice cool data centre that doesn't do what you need? Good idea, Hans.

        1. Anonymous Coward
          Anonymous Coward

          Re: Or no doubt..

          Oh, so you never gave thought to a power-sipping workhorse instead? So you can cram more into the same space, get the same work done, and not blow out the A/C keeping the place from spontaneously combusting?

    2. Bob H

      Re: Or no doubt..

      At $450 this chip costs as much as a decent GPU, and I haven't seen a GPU that can do 15 HD AVC transcodes yet. Certainly there is nothing on the GPU market that can do this at such a low TDP.
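
      For a sense of what one of those 15 transcodes looks like in practice, here's a hedged Python sketch driving a single stream through ffmpeg's Quick Sync H.264 encoder (assumes an ffmpeg build with QSV support; the file names are hypothetical):

          import subprocess

          # One hardware-accelerated AVC transcode via Intel Quick Sync (h264_qsv).
          # The pitch for this chip is running ~15 of these concurrently inside 45 W.
          subprocess.run([
              "ffmpeg", "-i", "input_1080p.ts",   # hypothetical source stream
              "-c:v", "h264_qsv",                 # encode on the Quick Sync engine
              "-b:v", "4M",                       # 4 Mbit/s target bitrate
              "-c:a", "copy",                     # pass the audio through untouched
              "output_1080p.mp4",
          ], check=True)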

  5. BinkyTheMagicPaperclip Silver badge

    Excellent

    This has been an obvious development for ages; VDI makes a lot of sense.

    Know what makes more sense? AMD doing the same with their technology: they're perfectly placed to do so and can offer more performance per watt than Intel. Yet despite every opportunity they don't. What a waste.

    Intel eat them for lunch, again, and remain the only virtualisation game in town.

  6. TeeCee Gold badge
    Facepalm

    Dear Intel.

    If you want people to do anything other than laugh at your CPUs with integrated graphics, try buying nVidia. That way you'd have something worth integrating...

  7. Ryan Kendall

    I hope they update the Intel GPU drivers more

    It's a great idea; however, Intel don't always seem to fix their drivers.

    i3 & i5 chips have some great 'Blank Screen of Death' issues with Windows 10.

    1. Sgt_Oddball
      Holmes

      Re: I hope they update the Intel GPU drivers more

      Yep. It's not quite as bad, since the standard Windows defaults work, but for anything that requires dedicated hardware support, like any DirectX guff or OpenGL, you're stuffed by Intel if the chips are over a couple of years old.

      AMD, by comparison, are supporting cards over seven years old on Win10... and why not, if they still work?

      (Sherlock because it's elementary)

      1. Charles 9

        Re: I hope they update the Intel GPU drivers more

        Well, to be fair, an AMD GPU built seven years ago isn't that much different from one built now; it differs more in degree than in kind. Now, if you were to go TEN years back, before Radeon became Radeon HD, then you're talking a difference in kind, and I don't think AMD's drivers go THAT far back. Same with nVidia, where the legacy split seems to fall when they went from four-digit numbering to three-digit. Last I checked, Intel's GPU design changed rather more significantly over the same period, which is why its legacy stretch is shorter.
