Intel CEO Brian is a man living on the edge

Intel declined to comment on industry whisperings that Qualcomm is mulling ending its Arm-powered server processor efforts. Perhaps it's no wonder: Intel has a monopoly on the compute workhorses used in data centers by Google and Amazon, among others, to provide their increasingly ubiquitous cloud computing services, as well …

  1. Michael H.F. Wilkinson Silver badge

    He's NOT the messiah!

    Sorry, couldn't resist. However, his message is not a universal truth. We still need to throw stuff away, even without GDPR. Many instruments generate such a massive load of data that you cannot possibly store all of it, even with cheap storage. I work on data from both astronomical instruments and nuclear accelerators, and the data rates of the new generation of instruments, like SKA (but even the existing LOFAR), mean we must reduce the data at the source and throw most of the raw data away. This is anathema to scientists, but it has to be done. Most collisions in accelerators yield nothing of interest and can be discarded; it is just the rare events that need to be stored (a rough sketch of this kind of at-source filtering follows below). So he is right that we want more data, but we don't want to store indiscriminately, or we will drown in a sea of uninteresting stuff.

    He is of course right that we need more processing grunt (and storage), but that is pointing out the bleeding obvious.
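
    A minimal sketch of the at-source filtering described above, with an entirely illustrative trigger threshold and a random stand-in for the instrument read-out (neither is taken from the comment itself): events below the threshold are discarded immediately, and only the rare ones above it are stored.

    ```python
    # Illustrative only: discard uninteresting events at the source, keep the rare ones.
    import random

    TRIGGER_THRESHOLD = 0.95  # assumed cut-off; real experiments tune their triggers carefully

    def read_event():
        # Stand-in for an instrument read-out: one measurement per event.
        return random.random()

    def reduce_at_source(n_events):
        # Keep only events whose measurement clears the trigger; the rest are never stored.
        return [e for e in (read_event() for _ in range(n_events)) if e > TRIGGER_THRESHOLD]

    kept = reduce_at_source(1_000_000)
    print(f"stored {len(kept):,} of 1,000,000 events ({len(kept) / 10_000:.2f}%)")
    ```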

    1. Anonymous Coward
      Anonymous Coward

      Re: He's NOT the messiah!

      We want more and better information, and to that end we need more data points. What we do not need is to keep storing more and more noise.

      The example of data reduction in science is a good one. Unfortunately there are a lot of people who have a thoroughly OCD attitude to the stuff being collected, ending up with vast redundancy. It's good for server and database vendors, but it results in terabytes that will never be accessed. I think the main reason is the reluctance to do a proper analysis of what is being collected at the start and to work out how to store it most effectively.

      1. JohnFen

        Re: He's NOT the messiah!

        "We want more and better information"

        Who's "we", and information about what? I want these slurp-happy companies to have less and worse information about me.

  2. Destroy All Monsters Silver badge

    Certainly...

    By 2020, every stone will generate GigaBytes/hour of data, especially if wrung for blood.

    (Also, Intel Architecture ... will it be with us for ever?)

  3. Missing Semicolon Silver badge
    Unhappy

    So, no announcement of the faulty-chip buyback program?

    Silly me....

  4. This post has been deleted by its author

  5. Anonymous Coward
    Anonymous Coward

    Has Mr Krzanich ever heard of the term 'Overfitting'?

    Maybe he and the rest of the goons interested in harvesting all the data should learn about it; it tends to royally screw up otherwise good predictive analysis. Turns out there is definitely such a thing as too much data, and these companies already have too much of it. That is why they can't get anything right.
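
    To put a concrete (and purely hypothetical) face on the overfitting point: fit polynomials of increasing degree to a handful of noisy samples of a simple trend, and the high-degree fit memorises the noise, scoring well on the data it has seen and badly against the underlying trend. The numbers below are illustrative, not from any of the companies mentioned.

    ```python
    # Illustrative overfitting sketch: a more flexible model fitted to noisy data predicts worse.
    import numpy as np

    rng = np.random.default_rng(0)

    x_train = np.linspace(0, 1, 12)
    y_train = 2.0 * x_train + 1.0 + rng.normal(scale=0.5, size=x_train.shape)  # noisy samples of a line
    x_test = np.linspace(0, 1, 50)
    y_test = 2.0 * x_test + 1.0  # the noise-free underlying trend

    for degree in (1, 9):
        coeffs = np.polyfit(x_train, y_train, degree)  # fit a polynomial of the given degree
        train_mse = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
        test_mse = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
        # The degree-9 fit typically drives the training error down while the error
        # against the underlying trend goes up: the model has learned the noise.
        print(f"degree {degree}: train MSE {train_mse:.4f}, test MSE {test_mse:.4f}")
    ```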

  6. Anonymous Coward
    Anonymous Coward

    Intel is going to have to watch their back

    For now, Intel's processors and other chips are general enough in nature and quick enough to be usable in what I call "intelligence engineering and analysis." If you look at how I file and track things, "artificial intelligence", "machine learning" (not quite the same thing) and "big data" are about taking data and tossing it into models and algorithms to derive or infer information. I got my start back in the 1970s, when "big data" involved fork-lifts and pallets. All that has changed since the beginning is the density and volume, oh, and the amount of skull-sweat involved. Well and good.

    Intel's problem is that the current state of the art in the above "disciplines" is mostly tossing darts at the dartboard and trying, desperately, to find what works. As a few others in the field have observed, how and why, not so much. That's one reason the field resembles an art rather than a science. So the engineering is very hit and miss. This is fine with general-purpose hardware if speed and accuracy aren't paramount. Therein lies the rub: when you really, really need that speed and accuracy, you have to look at dedicated designs. Thus we see nVidia, IBM, Google, and even Microsoft designing custom hardware.

    It's going to be an interesting balancing game for Intel. The one thing they've really got an in with is custom designs using their FPGAs, and they certainly don't have a total lock on that; look at recent announcements of advances in that sector of hardware engineering. Even Microsoft has figured that out, and usually they're further behind the curve.

    I mostly use Intel here; I've always had respect for their CPUs, even if a bit pricey. However, when it comes to doing intelligence engineering, I bring in bigger, more specialized hardware that is, unfortunately, pricier by far. On a performance/price basis, though, it's actually cheaper.
