FPGAs for AI? GPUs and CPUs are the future, shrugs drone biz Insitu

"It's just too hard to maintain all of those threads," eye-in-the-sky drone firm Insitu told The Register, explaining its move away from FPGAs to commercial off-the-shelf compute hardware for its AI and machine learning tech. The firm's chief growth officer, Jon Damush, was answering our questions about the tech underneath the …

  1. Aitor 1

    Not an expert on AI

    As the title says, not an expert on AI.

    That being said, an FPGA, or better still a custom chip, will be an order of magnitude more efficient at crunching the algorithm than a GPU. The problem is that it is fixed in what it does, and if you start making it more general purpose, then it is better to stick with the GPU.

    Once we as humans have stable, decent "AI", then we will see AI coprocessors designed specifically for the task at hand, but probably not before, as the cost of design + tape-out + production is just too great for things not made in big numbers.

    1. Anonymous Coward

      Re: Not an expert on AI

      In certain application domains (signal processing is my area of interest), never count on an FPGA being quicker or more efficient than, say, a CPU. The PowerPC 7400, the first one with AltiVec, could do a floating-point 1k complex FFT quicker than whatever Xilinx were selling at the time (Virtex 2?) could do a fixed-point equivalent.
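      A minimal sketch of timing that sort of transform on a CPU, assuming the FFTW3 library (my choice for illustration; the comparison above doesn't name one):

        /* Time a 1k-point complex float FFT on a CPU.
         * Build with: cc fft_bench.c -lfftw3f -o fft_bench */
        #include <fftw3.h>
        #include <stdio.h>
        #include <time.h>

        #define N    1024
        #define REPS 10000

        int main(void) {
            fftwf_complex *in  = fftwf_malloc(sizeof(fftwf_complex) * N);
            fftwf_complex *out = fftwf_malloc(sizeof(fftwf_complex) * N);

            /* Plan first: FFTW_MEASURE may clobber the arrays while it
             * benchmarks candidate codelets, so fill the input afterwards. */
            fftwf_plan p = fftwf_plan_dft_1d(N, in, out, FFTW_FORWARD, FFTW_MEASURE);
            for (int i = 0; i < N; i++) { in[i][0] = (float)i; in[i][1] = 0.0f; }

            clock_t t0 = clock();
            for (int r = 0; r < REPS; r++)
                fftwf_execute(p);
            double secs = (double)(clock() - t0) / CLOCKS_PER_SEC;

            printf("1k complex FFT: %.2f us per transform\n", 1e6 * secs / REPS);
            fftwf_plan_destroy(p);
            fftwf_free(in);
            fftwf_free(out);
            return 0;
        }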

      Also, depending on the exact nature of the algorithm being implemented, other factors such as memory bandwidth play a big role. If your algorithm needs to chew through several tens of megabytes at a time, there's a good chance that a CPU will be quicker. Modern Intel / AMD CPUs have fantastic memory systems, and to match that in an FPGA is effectively impossible. FPGAs are OK-ish, so long as the data to be processed fits inside the device.
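      If you want to see that memory-bandwidth ceiling for yourself, here's a rough sketch (the buffer size and pass count are arbitrary illustrative choices):

        /* Stream through a buffer far larger than any cache level, so the
         * loop is bounded by memory bandwidth rather than ALU throughput. */
        #include <stdio.h>
        #include <stdlib.h>
        #include <time.h>

        #define BYTES  (64u * 1024u * 1024u)   /* several tens of MB */
        #define PASSES 10

        int main(void) {
            size_t n = BYTES / sizeof(float);
            float *buf = malloc(BYTES);
            if (!buf) return 1;
            for (size_t i = 0; i < n; i++) buf[i] = 1.0f;

            clock_t t0 = clock();
            double sum = 0.0;
            for (int p = 0; p < PASSES; p++)
                for (size_t i = 0; i < n; i++)
                    sum += buf[i];
            double secs = (double)(clock() - t0) / CLOCKS_PER_SEC;

            /* Printing sum stops the compiler eliding the whole loop. */
            printf("sum=%g, read bandwidth ~%.1f GB/s\n",
                   sum, (double)PASSES * BYTES / secs / 1e9);
            free(buf);
            return 0;
        }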

      In my experience (mainly signal processing), FPGAs are worthwhile only in circumstances where one knows exactly what the algorithm needs to be. In FPGA development there's a ton of things that get in the way of progress; it's just so slow to do things such as place & route. If one is developing an algorithm, it's almost certainly better not to use an FPGA.

      I've noticed another trend recently. I've done systems that started off as pure CPU, and then, as time passed and parts of the algorithm became settled, it was worthwhile getting an FPGA involved. However, CPUs have made such tremendous strides over the past 10 years that there's no point involving an FPGA any more, simply because the CPUs now have so much performance. There's no point using an FPGA for the hell of it; it's just a wasted chip. That is an application-domain-specific observation, but it's interesting to see other fields beginning to think along similar lines.

      And let's not forget ARMs. There are some really, really good ARM SoCs nowadays that have pretty good compute performance at surprisingly low power consumption. For a lot of signal processing applications there's plenty of compute in an ARM. Why go to the effort of gluing down an FPGA when a cheap ARM can do the same job, is far easier to develop for, and gets you a Linux environment?

      I know that a lot of FPGAs have ARM cores inside, but they're there to make use of the FPGA logic, not to have any decent math performance of their own. If an ARM SOC has a decent enough SIMD unit, why bother with the FPGA part?
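      As a sketch of what that SIMD unit buys you, here's a dot product using NEON intrinsics (AArch64 only; the details are mine, for illustration):

        /* Dot product using the ARM NEON SIMD unit (AArch64 intrinsics). */
        #include <arm_neon.h>
        #include <stdio.h>

        float dot(const float *a, const float *b, int n) {
            float32x4_t acc = vdupq_n_f32(0.0f);
            int i = 0;
            for (; i + 4 <= n; i += 4)        /* 4 lanes per iteration */
                acc = vfmaq_f32(acc, vld1q_f32(a + i), vld1q_f32(b + i));
            float s = vaddvq_f32(acc);        /* horizontal add, AArch64 only */
            for (; i < n; i++)                /* scalar tail */
                s += a[i] * b[i];
            return s;
        }

        int main(void) {
            float a[8] = {1, 2, 3, 4, 5, 6, 7, 8};
            float b[8] = {1, 1, 1, 1, 1, 1, 1, 1};
            printf("dot = %g\n", dot(a, b, 8));   /* expect 36 */
            return 0;
        }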

      1. EnviableOne

        Re: Not an expert on AI

        For any implementation, an ASIC will be the fastest; depending on the design, an FPGA could be the next fastest; but GPUs can do maths far faster than CPUs.

        In AI, it's normally about number crunching, and if you are doing several specific tasks, being able to switch the silicon to optimise for each in turn, with an FPGA, will give you some advantage.

        The reason they are looking at off-the-shelf silicon is cost and compatibility. If every device has a custom FPGA running custom code, then every function they develop needs to be translated to each individual device; with COTS silicon in everything, they can write once and deploy the function to any device.

      2. DropBear

        Re: Not an expert on AI

        I'd like to see some evidence backing up those so-called tremendous CPU performance jumps in the last ten years. Glossing over the fact that apparently everyone discreetly just stopped graphing FLOPS performance past circa 2010 for some reason, all the material I've seen seems to indicate that progress in single-core performance all but stalled around that time, and "more cores" doesn't seem to have done much either, ever since quad/hex/octa-cores suddenly replaced single cores, also a while ago.

        1. theblackhand

          Re: Not an expert on AI RE: DropBear

          Here's a link showing x86 single-core improvements of around 15%/year at a constant 4GHz clock speed:

          https://m.imgur.com/uuMOToE

          The bit that's missing is that if you add in the CPU clock speed increases up to around 2004 (potentially earlier too, I just haven't seen the data), x86 was at around ~50% performance increase per year, hence the feeling of single-thread performance stagnation.

          For the rest, you answer your own question: who needs FLOPS from single- or multi-core CPUs when you can just provide massively parallel GPUs if you need them?
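          For scale, compounding the two quoted rates over a decade (my arithmetic, not from the linked chart):

          \[
            1.15^{10} \approx 4.0\times
            \qquad \text{vs.} \qquad
            1.50^{10} \approx 57.7\times
          \]

          which is why 15%/year reads as stagnation next to the old trend.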

  2. nematoad

    "...that can be problematic because they don't want their data going anywhere else"

    OK, I get that, but with their increasing use of "the cloud" and off-the-shelf GPUs and processors, how are they going to make sure that the stuff stays safe? They will have to trust in third-party security, and we are all aware that things can and do leak or are just plain broken.

    If I were in the market for such assets, I would want to be certain that all the data was under my control rather than spread all over the internet.

  3. John Smith 19 Gold badge
    Coat

    "the use of secure cloud."

    Oxymoron detected.
