The race to develop a dedicated AI chip is ongoing, and several tech giants are taking aim at the incumbent.
INTEL'S NERVANA NEURAL NETWORK PROCESSOR. IMAGE SOURCE: INTEL.
Over the past several years, the adoption of artificial intelligence (AI) has made countless headlines, and the number of uses for the data-mining and pattern-recognizing capabilities of AI has exploded. Chipmaker NVIDIA, which pioneered the graphics processing unit (GPU), has been the biggest beneficiary of the adoption of AI. The parallel processing capability of its GPUs, which brought about a revolution in image rendering, turned out to be a surprisingly good fit for training AI systems.
That level of success breeds competition, and Intel and Facebook have joined forces in an effort to dislodge NVIDIA from its leadership position.
Chipping away at the competition
At the Consumer Electronics Show in Las Vegas this week, Intel said it was working with Facebook to develop the Nervana Neural Network Processor for Inference (NNP-I). The company said this new class of processor will serve demanding AI workloads by accelerating inference. The processor is scheduled to go into production this year. Intel is also working on a Neural Network Processor for Training, code-named Spring Crest, which it expects to be available later this year.
For the uninitiated, AI processes occur in two very distinct stages: training and inference. The training phase involves developing the algorithms and computer models necessary to complete a specified task, such as language processing or image recognition. This phase of the operation is computationally intensive, which is what initially attracted researchers to the GPU. The second phase, known as inference, occurs after the model has been trained and is actually performing the task it was built for -- such as tagging friends in photos.
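The two phases can be sketched in miniature with a toy model in plain Python. This is purely illustrative: the tiny logistic-regression "model," the training task, and the hyperparameters are all assumptions for the sake of the example, and real AI systems use vastly larger models running on GPUs or dedicated accelerators. The point is the asymmetry the article describes: training loops over the data many times, while inference is a single cheap pass.

```python
import math

def train(samples, labels, epochs=1000, lr=0.1):
    """Training phase: iteratively fit weights to labeled data.
    Looping over the whole dataset for many epochs is what makes
    this phase computationally intensive."""
    w = [0.0] * len(samples[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            z = sum(wi * xi for wi, xi in zip(w, x)) + b
            pred = 1.0 / (1.0 + math.exp(-z))  # sigmoid activation
            err = pred - y
            w = [wi - lr * err * xi for wi, xi in zip(w, x)]
            b -= lr * err
    return w, b

def infer(w, b, x):
    """Inference phase: one forward pass with fixed, already-trained
    weights -- far cheaper than training, but run on every request."""
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical toy task: output 1 only when both inputs are 1 (logical AND).
data   = [[0.0, 0.0], [0.0, 1.0], [1.0, 0.0], [1.0, 1.0]]
labels = [0, 0, 0, 1]

w, b = train(data, labels)       # expensive, done once up front
score = infer(w, b, [1.0, 1.0])  # cheap, done per prediction
print(round(score))
```

Chips like NVIDIA's GPUs target the first function's heavy loop; inference-focused parts like Intel's NNP-I and Amazon's Inferentia target the second, which runs billions of times in production.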
Intel initially announced its partnership with Facebook in late 2017. In a blog post at the time, Intel CEO Brian Krzanich said, "We are thrilled to have Facebook in close collaboration sharing its technical insights as we bring this new generation of AI hardware to market." Intel planned to deliver the initial version of the chip to Facebook and other partners for testing and feedback before embarking on the sophomore version.
Building a better mousetrap
Intel isn't the only company looking to cash in on chips developed specifically for AI. Google, a subsidiary of Alphabet, developed the tensor processing unit (TPU), which is now in its third generation. Google has not made the processor available for sale, instead using the chip internally to power its Google Cloud.
Amazon, an early pioneer and current leader in cloud computing, announced late last year at its re:Invent conference that Amazon Web Services (AWS) had developed the Inferentia chip, designed to deliver high performance on predictions at an extremely low cost. Because the inference phase accounts for as much as 90% of the cost of AI operations, Amazon focused on cost savings. The company called the resulting processor "a game changer," but it doesn't plan to bring the chips to market, instead reserving them for its own cloud-computing customers.
As AI workloads continue to grow, dedicated AI chips will only become more important, and the race to build them is one worth watching.