The rise of AI marks an end to CPU dominated computing

Just as Intel, the king of CPUs and the very bloodstream of computing, announced that it is ending its annual Intel Developer Forum (IDF) event, NVIDIA, the king of GPUs and the fuel of Artificial Intelligence, is holding its biggest GPU Technology Conference (GTC) yet this week in San Jose. Coincidence? Nope. …

  1. John Smith 19 Gold badge
    Unhappy

    They've certainly sold a lot of chips.

    Let's see if they actually do anything useful.

    1. Anonymous Coward
      Anonymous Coward

      Re: They've certainly sold a lot of chips.

      Nvidia are ARM licensees - I can see some potential in ARM+Nvidia GPU lash-ups to compete with x86/AMD64 in these HPC workloads.

  2. Anonymous Coward
    Anonymous Coward

    Bloody AI's

    I'm fed up with seeing everything being called an AI. What's the point in striving for true AI when you can make an algorithm that does something a bit fancy and suddenly it's good enough to be called artificial intelligence?

    Am I the only one who thinks of true AI as something that can adapt and do more than perform one linear task?

    1. Anonymous Coward
      Anonymous Coward

      Re: Bloody AI's

      http://smbc-comics.com/index.php?id=4015

      1. John Lilburne

        Re: Bloody AI's

        Frankly, AI is anything that cons a comp-sci geek for longer than their attention span (currently about 10 seconds).

    2. PatientOne
      Joke

      Re: Bloody AI's

      With an AI here,

      and an AI there,

      Here an AI,

      There an AI,

      Everywhere an AI...

      There's a song in there, somewhere, I'm sure...

    3. Anonymous Coward
      Anonymous Coward

      Re: Bloody AI's

      As soon as we succeed in using AI for a problem set, it's no longer AI...

      1. Horridbloke

        Re: Bloody AI's

        I recall a lecturer in college saying that 25 years ago, so bravo Sir or Madam or Thingy!

        1. This post has been deleted by its author

    4. Justthefacts Silver badge

      Re: Bloody AI's

      But that's the *point* you are missing. I can't remember who said it, but it went something like: "Ah, I thought what you did was so clever, but now I see you just had a trick".

      I believe that's the clue to "what is AI" - that the intelligence of a human brain is mostly just dozens of well-evolved tricks. Good AI work may just be stitching together a bunch of tricks to do the heavy lifting.

      E.g. seeing doesn't consist of taking in a whole picture of a scene: we only see the centre of the image and fill in the rest. We don't even see "features" that well, as any artist who tries to draw the room in front of them knows. Mostly we just encode back to representations that are "close enough for government work".

      As an example, it is easy to complain about the technology of self-driving vehicles, and note how much better their performance is when augmented with tricks like using GPS & lidar that humans don't need.

      But humans need "sharp bend ahead" signs, & white lines down the centre of a road, physically augmenting the environment to make the driving problem cognitively tractable.

      A modern autonomous vehicle, in a rally car on an unmade rocky landscape, will easily outperform an average driver. Yes, it had to be trained. But so did *you*, to take your driving test - and they taught you loads of tricks for cognitive offload, which largely don't scale to the rocky landscape.

      I don't think the human adaptivity advantage is as large as you think, nor as special.

  3. Korev Silver badge
    Boffin

    A lot of hype around AI...

    ...but most of our GPUs' workloads are Molecular Dynamics and CryoEM. We did buy a load of GPUs for the latter recently, so I guess purchases like that are counted in Nvidia's sales figures.

  4. Mage Silver badge
    Coffee/keyboard

    rise of AI marks an end to CPU dominated computing

    Wut?

    Someone has invented Asimov's positronic* brain?

    What does that even mean, considering there is no general AI, only pattern matching against big databases? OK, parts of the algorithm might use some sort of GPU, array processor or other specialised co-processor.

    It all still needs a conventional CPU programmed by humans (a minimal sketch follows below).

    [*Word chosen because he thought it sounded more advanced than "electronic"; back then, computers were described by the media as "electronic brains"]
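
    To make the division of labour concrete, here is a minimal, hypothetical CUDA sketch (the names, sizes and toy data are illustrative only, not taken from any real system): the GPU kernel does the brute-force pattern matching against a "big database", while an entirely conventional, human-written CPU program allocates the memory, fills it, launches the kernel and picks the winner.

    #include <cstdio>
    #include <cuda_runtime.h>

    // Device code: each GPU thread scores one stored pattern against the query.
    __global__ void score(const float* db, const float* query, float* out,
                          int nPatterns, int dim) {
        int p = blockIdx.x * blockDim.x + threadIdx.x;
        if (p >= nPatterns) return;
        float s = 0.0f;
        for (int i = 0; i < dim; ++i) {
            float d = db[p * dim + i] - query[i];
            s += d * d;                          // squared distance to pattern p
        }
        out[p] = s;
    }

    int main() {
        const int nPatterns = 1 << 16, dim = 128;

        // Host (CPU) side: ordinary, human-written control flow.
        float *db, *query, *out;
        cudaMallocManaged(&db, size_t(nPatterns) * dim * sizeof(float));
        cudaMallocManaged(&query, dim * sizeof(float));
        cudaMallocManaged(&out, nPatterns * sizeof(float));
        for (int i = 0; i < dim; ++i) query[i] = 0.5f;                     // toy query
        for (int p = 0; p < nPatterns; ++p)                                // toy "database"
            for (int i = 0; i < dim; ++i) db[p * dim + i] = (p % 7) * 0.1f;

        // The only GPU-specific step: launch enough threads to cover every pattern.
        int threads = 256, blocks = (nPatterns + threads - 1) / threads;
        score<<<blocks, threads>>>(db, query, out, nPatterns, dim);
        cudaDeviceSynchronize();

        // Back on the CPU: a plain sequential loop picks the best match.
        int best = 0;
        for (int p = 1; p < nPatterns; ++p)
            if (out[p] < out[best]) best = p;
        printf("closest pattern: %d (score %.3f)\n", best, out[best]);

        cudaFree(db); cudaFree(query); cudaFree(out);
        return 0;
    }

    (Unified "managed" memory is used here purely to keep explicit host/device copies out of the sketch; real HPC codes usually manage their own transfers.)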

    1. Ugotta B. Kiddingme

      Re: rise of AI marks an end to CPU dominated computing

      Mage: "Wut?

      "Someone has invented Asimov's positronic* brain?

      "What does that even mean, considering there is no general AI, only pattern matching against big databases? OK, parts of the algorithm might use some sort of GPU, array processor or other specialised co-processor.

      "It all still needs a conventional CPU programmed by humans."

      Indeed. See also "paperless office", "flying car", etc...

      1. Nolveys
        Unhappy

        Re: rise of AI marks an end to CPU dominated computing

        paperless office

        My office has been mostly paperless since last September. That was when they cut the janitorial budget by 75%.

        1. bombastic bob Silver badge
          Devil

          Re: rise of AI marks an end to CPU dominated computing

          ack - many trend predictions have fallen on their face. I'm still using a desktop computer (albeit an older one), not trying to type on a fondleslab 'fake' keyboard or use a phone for El Reg comments. The PC obviously isn't dead yet, nor will the CPU be any time soon.

          'new shiny' is a FAD, not a trend. Making a long-term decision based on fads is a recipe for FAIL.

          I predict the CPU will be around for a long, long time. That's because there aren't any _REAL_ AI algorithms out there to make use of a "wide GPU farm" architecture. We're barely seeing multi-threaded algorithms that take advantage of multi-core. I mean, really...

  5. DropBear
    Facepalm

    Someone has apparently decided that, come hell or high water, after "Cloud", "IoT" and "VR" the next mega-buzzword shall be "AI", and is now pushing it on all fronts accordingly. That we still haven't seen anything worthy of that description is but a minor technicality and shall not trip up The Plan, whatever it is...

  6. Anonymous Coward
    Anonymous Coward

    Bloody Nvidia

    You can't just download drivers; you have to have an account with them and partake of the "Nvidia experience" crapware if you want to run games. Can't speak for anyone else's Nvidia experience, but it resets my bloody settings with every driver update, and the last time I played GTAV it brought the whole system to a beeping halt, having assigned more video memory than the card actually has.

    1. quxinot

      Re: Bloody Nvidia

      You can uninstall the Experience crap separately from the Nvidia package. While you're at it, the 3D stuff can be removed unless you're one of the three people who actually care. Then you disable the auto-update crap and update when you know there's a new version out that does something better than the old one.

      If you want to be in charge of your computer, you have to take charge of your computer. It's just that it's less easy than it used to be, but those drivers aren't particularly ugly once you get past downloading a bunch of crap that you'll immediately uninstall.

      While we're at it, AI isn't actually here yet in a meaningful way.

      And GPUs and CPUs are different tools for different tasks. Because AI is currently so primitive, I suspect there will be several iterations of "GPU AI KILLED BY <x>", "<x> AI KILLED BY <y>" and so on in El Reg before we actually get a product that does something anyone who isn't a buzzword marketer wants.

    2. Anonymous Coward
      Anonymous Coward

      Re: Bloody Nvidia

      "You can't just download drivers, you have to have an account with them..."

      Doesn't seem to be the case for Linux users - I don't have an account with Nvidia and have no problems downloading drivers.

      1. Anonymous Coward
        Anonymous Coward

        Re: Bloody Nvidia

        "If you want to be in charge of your computer, you have to take charge of your computer. It's just that it's less easy than it used to be, but those drivers aren't particularly ugly once you get past downloading a bunch of crap that you'll immediately uninstall."

        Fair point, but that way I 1) won't be notified when there is a new driver and 2) will be 'punished' by having to download the bloody lot AND find the account password every time I do want to update.

  7. John 104

    Wrong Name

    They got the name wrong. It should be IA: Intelligent Algorithms.

  8. JeffyPoooh
    Pint

    Characteristics of A.I.

    A.I. is hard.

    The keyword 'hard' can be read as nearly impossible.

    A.I. outdoors is *very* hard.

    A.I. outdoors requires senses, else it'll aimlessly wander out in front of a bus, which would be stupid.

    A.I. connected only to a terminal or a network is a very thin A.I. indeed. It's hard to claim 'Strong A.I.' if I can wander over and pull its plug, and it doesn't even know. May I coin the phrase 'Helen Keller A.I.' without incurring too much wrath?

    If you speak of A.I. "self-awareness", then set the self-awareness flag to 'true' and get back to work.

    The human brain has many dedicated, pre-programmed, hardware co-processors; it ain't just learning neural nets.

    Any fears of a future Skynet-style dangerous A.I. are quite stupid. The foundations of all computers still include little batteries on the motherboards, code that needs a 'Press Any Key' just outside the reach of the A.I.'s little arms, etc. It wouldn't last a month.

    1. Meph

      Re: Characteristics of A.I.

      While I don't disagree with the bulk of your comment, I'd only add the caveat that a Skynet-class hostile A.I. is only a pipe dream until some well-meaning engineering type gets a limited A.I. designing and building the next generation of service droids.

      What will ultimately doom humanity is the point where hubris and laziness intersect.

  9. arthoss

    So, in reaction to the article itself: you wish. CPUs are here to stay and dominate, whether in "AI" or elsewhere (like handling the highly complex ships, cars and factories of the future).

  10. sarahhart

    LISP, and lessons from the past

    (apologies if this shows up multiple times, i'm new and unsure why i only see my preview but not the comment anywhere)

    is anyone else reminded of LISP machines? massively parallel, with a special language requirement.. sounds a lot like CUDA (see the sketch at the end of this comment). at least CUDA is good for games, too.

    as for those who are complaining about so many things being called AI.. this is how the field has been for, what, coming on 60 years?

    for so long, "expert" systems seemed to be the answer, the technology that would lead AI out of the dark and into helping everybody make decisions with their life.. but then they found the limits of expert systems.

    of course, it's not like neural networks have no uses! they've been massively helpful in speech recognition, as well as about.. a million other things.

    but now we're entering a period of huge investment in any company that claims to want to do anything at all with "machine/deep learning". these companies clearly cannot all change the world.

    AI winters have always fallen after a period of huge investment, buyouts, etc. investment bubbles must always pop.

    someone above me touched on the idea that once things become commonplace, you don't think of them as "AI" anymore, either.

    when's the last time you felt like "an AI" was in control of your laundry or dishes? fuzzy logic is in most if not all household appliances now, in some way or another.

    and once upon a time, fuzzy logic seemed like the new big thing. finally, computers having nuance in how they think, etc.

    but now it's not considered that special. "AI" can be taken to mean "cutting edge data processing", really.

    naturally, i would love it if we never have another AI winter. but, as i said earlier, all investment bubbles pop. usually, AI fields fall with them for a while. can AI resist this next one?
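
    On the LISP-machine comparison above: the "special language requirement" today is essentially standard C++ plus a few extra keywords and a launch syntax. A minimal, hypothetical SAXPY example (illustrative only, not taken from any particular codebase):

    #include <cstdio>
    #include <cuda_runtime.h>

    // __global__ marks a function that runs on the GPU; blockIdx and threadIdx
    // are the CUDA-specific bits that give each of the many threads its own index.
    __global__ void saxpy(int n, float a, const float* x, float* y) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;   // one thread per element
        if (i < n) y[i] = a * x[i] + y[i];
    }

    int main() {
        const int n = 1 << 20;
        float *x, *y;
        cudaMallocManaged(&x, n * sizeof(float));
        cudaMallocManaged(&y, n * sizeof(float));
        for (int i = 0; i < n; ++i) { x[i] = 1.0f; y[i] = 2.0f; }

        // The <<<blocks, threads>>> launch is the part no ordinary C++ compiler accepts.
        saxpy<<<(n + 255) / 256, 256>>>(n, 3.0f, x, y);
        cudaDeviceSynchronize();

        printf("y[0] = %.1f\n", y[0]);   // expect 5.0
        cudaFree(x); cudaFree(y);
        return 0;
    }

    Everything outside the kernel and the <<< >>> launch is ordinary C++ running on the host CPU, which rather supports the point made elsewhere in this thread that a conventional CPU still drives the show.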
