Meh.
This isn't even click-bait. It is more like the comedy wind-up mouse that a cat will half-heartedly swipe at.
AI is finding its way into everyday business and government. The idea of AI is not new, but what is different is that today's hardware and software are bringing the various concepts underpinning AI to a mass market. What's new, too, is the driver: from bots and digital assistants to autonomous vehicles Google, Microsoft, …
Think this was meant to be a joke but logically it's the right thing to do surely. I am not trivialising the massive amount of work to somehow bodge together an AI that is just adequately smart enough to design another AI, but once that was complete just leave it running while you go to the pub.
A crack AI team? That's a team of artificially intelligent cracks then?
That was a long article that was not about artificial intelligence; mentioning ML and neural networks a hundred times each still doesn't make it AI.
They may be steps on the way to true artificial intelligence, or it may turn out there is something else we are so far missing, but I have pigeons outside my window every morning that are smarter than anything currently called AI that I know of.
Of course if somebody wants to offer me 150K a year plus, Oh! That's AI yes indeedy!
To all you folks out there buying into 'AI' just have a look at the stupidest, dumbest person you know, then consider how your AI compares to them in truly functional terms.
That one of the key criteria for team selection isn't real-life experience explains a lot about customer-facing AI systems, where learning is restricted to getting to say No more efficiently.
AI isn't that new, we were messing around with it at Scicon in the 80s for retail replenishment, so there's quite a few of us out there with long experience of "Intelligent" systems. We just aren't 20 years old any more. Most recently looking at it for Blue Light demand predictions.
The main difference is the improvement in processing speeds and tools rather than the basic concepts.
Crack smoking AI would imply that the Singularity has already occurred.
But seriously, where do I get some Movidius USB "thinking" sticks? Can't locate them in stock anywhere, and the next best alternative is a high-end graphics card like the GTX1080TI, but these are over £500.
Anything that supports OpenGL would be handy but alas none of my spares found in Ze Junkbox of DOOM! will work well enough due to flaky interconnects/bad RAM.
Next project: bodge together an X-ray BGA repair rig to inspect soldering and determine which chip is actually bad.
For once this is quite simple: flash emitters are less than £20, and I already have the scintillator screen to go with my modified camera.
1. Immediately put the following question on Stack Overflow: 'can I haz tah codez for da AI I am needing it rightly now?'. Hire anyone who replies. This gets you all the Capita and ATOS top-of-the range off-shore coders right where you want them. They'll be hungry to type shit in.
2. Find a Cloud Architect. Find another. As long as they've spoken in numerous conferences, and never actually delivered anything, hire them both, as you'll need two clouds to hold all the smarts. Obtain first round funding. Immediately move 35% of it somewhere safe. No-one needs to know where. Explain it as unexpected burn due to initial resourcing being super-successful.
3. DevOps. Whatever it is, you need at least 6 months or $1m (whichever is more) of it. Put it in your marketing materials, but don't let your dev team (see point 1) anywhere near it.
4. Your AI solution will retrospectively fill in point 4.
5. Microsoft have repeatedly shown themselves to be both trustworthy and ahead of the game. Same with Oracle. Hire as many consultants from these fine firms as your remaining first round funding will allow (so that's around 15 from Microsoft and 2 from Oracle).
6. You need second round funding, The only way to obtain this is to change your logo, and get your website rebranded. Follow the Deliveroo lead - hire an absurdly expensive Brand Realisation Factory - and turn your perfectly acceptable original logo into some monochromatic fucking pointy Australian animal. Bingo!
7. IBM's Watson (based on HP's Autonomy, itself based on an odd combination of Maven plugins that went rogue) might help. No-one (including itself) knows what it actually does, so this'll keep the press at bay for a while.
8. Hire a call-centre in Bangalore, and claim it's your neural AI net that is still learning. This is what BT does with its broadband support, and they reckon their net will learn what a router is sometime in 2047.
9. Retire to a life of cars, gin and hookers. You've got the Intelligence. Leave the Artificial bit to some other sucker.
The problem I have is not the article, but the generic use of the term AI across the media. Use of it immediately discredits virtually any article, as we're not even close to AI yet. Like endless power from nuclear fusion, it's this year's hype phrase, bandied about by journalists and CEOs without the faintest clue as to what it means or how far we are from it. Just like Big Data, Hadoop, Prince, etc etc before it. It's even on Gartner's 2017 hype curve, far on the left-hand side under General Purpose Machine Intelligence.
Machine learning, data science and predictive analytics to usefully employ as a tool alongside people to make money/cut costs/improve productivity? Yes. Although all of those are still in their infancy in terms of being incorporated into deliverable systems. But AI? No, it's still in the pre-Asimov era; we're not even close, in all but a handful of specialist, well-defined skills.
Just on a technical inaccuracy: you don't need C/C++ coders to take advantage of GPU cards for machine learning. Sure, you can use C++, just as you can use assembler to write your own word processor, if you have a few decades and a bottomless pit of cash. The better alternative is to use Google's Tensorflow or Theano, both high-level, optimised, broad-functionality machine learning libraries with a variety of models for different tasks. These libraries talk to GPU cards via their CUDA drivers, removing the need for C++ people to write tedious low-level code. The hard work has already been done. Python and Tensorflow (with maybe Keras sat between as a high-level framework) is mainly where it's at both commercially and academically, with R and Theano also up there. But not C++, unless you want to hand-craft or optimise the low-level code of your own ML libraries and credibly believe your developers are better at both C++ and machine learning than Google's/MILA's engineers.
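To put some flesh on that point, here is a toy sketch of the kind of low-level numeric loop those libraries bury under optimised CUDA kernels: a logistic-regression trainer in plain NumPy. The AND-gate dataset and the hyperparameters are made up purely for illustration; in Tensorflow/Keras the same model would be a couple of high-level API calls, with no C++ in sight.

```python
import numpy as np

def sigmoid(z):
    """Logistic squashing function."""
    return 1.0 / (1.0 + np.exp(-z))

def train(X, y, lr=0.5, epochs=2000):
    """Batch gradient descent on the log loss -- the sort of
    inner loop that Tensorflow/Theano run for you on the GPU."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        p = sigmoid(X @ w + b)          # forward pass
        grad_w = X.T @ (p - y) / len(y) # gradient w.r.t. weights
        grad_b = np.mean(p - y)         # gradient w.r.t. bias
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

# Hypothetical example data: the AND gate, which is linearly separable.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 0, 0, 1], dtype=float)

w, b = train(X, y)
preds = (sigmoid(X @ w + b) > 0.5).astype(int)
print(preds.tolist())
```

None of this is GPU-accelerated, of course; the commenter's point is precisely that you reach for a high-level library so that this arithmetic runs on CUDA kernels someone else has already written and tuned.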