Intel and Facebook Working on AI Inference Chip Called Nervana

Intel and Facebook are working together on a new artificial intelligence (AI) chip. Called the Nervana Neural Network Processor (NNP-I), the chip is designed for inference workloads. Intel said on Monday at the Consumer Electronics Show in Las Vegas that it expects to finish the chip in the second half of 2019.

“Facebook is pleased to be partnering with Intel on a new generation of power-optimized, highly tuned AI inference chips that will be a leap in inference workload acceleration,” Facebook said in its own statement.

With this chip, Intel is hoping to maintain a leading position in the fast-growing AI market. Similar chips are expected from competitors such as Nvidia and Amazon’s cloud unit, Amazon Web Services, Reuters reported.

Intel and Facebook Working on Nervana AI Chip

According to Naveen Rao, corporate VP and general manager of AI at Intel, the chip is built on Intel’s 10-nanometer process and will include Ice Lake cores to handle general operations alongside neural network acceleration.

Nervana is designed to help researchers with inference. In AI, an inference engine is the component of a system that applies logical rules to a knowledge base to deduce new information. Inference is one of the two phases of machine learning, the other being training. Training takes a long time and is more computationally demanding; performing inference on new data is comparatively lightweight and is the essential technology behind computer vision, voice recognition, and language processing tasks, experts explain.
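The two phases can be illustrated with a toy model. This is a minimal sketch in plain Python/NumPy, not Intel code: a tiny logistic-regression classifier is first trained over many iterations (the slow, compute-heavy phase), then used for inference, which is a single cheap forward pass over new data.

```python
import numpy as np

# Toy model illustrating training vs. inference. All names here are
# illustrative; nothing below reflects Intel's NNP-I software stack.

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# --- Training phase: many iterative gradient steps over labeled data ---
X = rng.normal(size=(200, 3))            # training inputs
true_w = np.array([1.5, -2.0, 0.5])      # hidden "ground truth" weights
y = (X @ true_w > 0).astype(float)       # labels derived from them

w = np.zeros(3)
for _ in range(500):                     # repeated passes: the costly phase
    grad = X.T @ (sigmoid(X @ w) - y) / len(y)
    w -= 0.5 * grad                      # gradient descent update

# --- Inference phase: one forward pass on previously unseen data ---
x_new = rng.normal(size=(5, 3))
predictions = (sigmoid(x_new @ w) > 0.5).astype(int)
print(predictions)
```

The asymmetry shown here is what the article alludes to: the training loop dominates the compute cost, while the inference step is a single matrix product, which is why dedicated inference chips like the NNP-I target that phase.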

Related: Intel Deploys Threat Detection Technology, Enables GPU Scanning

Nervana is optimized for image recognition, Intel explained during the presentation. The chip’s architecture is unusual in that it lacks a standard cache hierarchy; on-chip memory is managed directly by software, VentureBeat reported. In addition, high-speed on- and off-chip interconnects allow the chip to distribute neural network parameters across multiple chips.

It is worth mentioning that Intel’s processors currently lead the market for machine learning inference. Nvidia launched its own inference processor in September 2018, and Amazon is following suit. However, Amazon’s chip is not direct competition, as the company is not selling it; instead, Amazon plans to sell cloud services that use the chip. But it does mean Amazon won’t need chips from Intel and Nvidia, costing both the loss of a major customer, Reuters noted.


Milena Dimitrova

An inspired writer and content manager who has been with SensorsTechForum for 4 years. Enjoys ‘Mr. Robot’ and fears ‘1984’. Focused on user privacy and malware development, she strongly believes in a world where cybersecurity plays a central role. If common sense makes no sense, she will be there to take notes. Those notes may later turn into articles!
