
Intel Movidius Neural Compute Stick Is A USB-based Deep Learning Kit

Artificial Intelligence and Machine Learning have moved down from the cloud to devices, and now into your hands. Movidius, a subsidiary of Intel, has launched the Neural Compute Stick, a USB-based deep learning inference kit.

The Movidius NCS is the world’s first self-contained AI accelerator in a USB form factor, delivering dedicated deep neural network processing capabilities to a wide range of host devices. An AI accelerator is a class of processor designed to speed up machine vision tasks and other machine learning algorithms.

The Movidius Neural Compute Stick is priced at $79 (roughly Rs. 5080) and is available for purchase through select distributors.

What is the Neural Compute Stick (NCS)?

The Movidius Neural Compute Stick (NCS) is a small, fanless deep learning device that you can use to learn AI programming. The NCS is powered by the same low-power, high-performance Movidius Vision Processing Unit (VPU) found in many smart security cameras, gesture-controlled drones, industrial machine vision equipment, and more.

The Neural Compute Stick was originally announced in April 2016 as a prototype device named Fathom. But soon after, Intel acquired Movidius in September 2016, and the product was put on hold.

The new Compute Stick is similar to the old one in many respects. It is powered by the Myriad 2 Vision Processing Unit (VPU), which uses twelve parallel cores to run vision algorithms such as object detection and facial recognition. According to the company, it delivers more than 100 gigaflops of performance and can natively run neural networks built with the Caffe deep learning framework.

To recall, the Myriad 2 VPU is a multicore processor that Movidius introduced in February 2016. The chip can operate on power-constrained devices, consuming as little as a single watt.

The main changes in this new version are that it is made of aluminum instead of plastic, and that the on-chip memory has increased from 1 GB on the Fathom to 4 GB on the Movidius NCS, in order to accommodate larger neural networks. The price has also been cut from $99 to $79.

Highlights of Movidius NCS

The new NCS is designed mainly for product developers and researchers. It aims to reduce the barriers to developing, tuning, and deploying AI applications by delivering high-performance deep neural network processing.

Combined with the Neural Compute SDK, the stick lets developers profile, tune, and deploy Convolutional Neural Networks (CNNs) in low-power applications that require real-time inferencing. The toolkit automatically converts a trained Caffe-based CNN into an embedded neural network optimized to run on the Myriad 2 VPU.
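
As a rough illustration, the typical workflow is: compile a trained Caffe model into a graph file on the host, then load that graph onto the stick and feed it input tensors over USB. The sketch below is based on the Neural Compute SDK's early (v1) Python bindings as I understand them; the module name (mvnc), the calls (EnumerateDevices, AllocateGraph, LoadTensor, GetResult), the mvNCCompile step, and the file paths are assumptions and may differ in your SDK version.

```python
# Hypothetical NCSDK v1-style workflow (names assumed; check your SDK docs).
# Step 1 (shell, on the host): compile a trained Caffe model into a graph file, e.g.:
#   mvNCCompile deploy.prototxt -w model.caffemodel -s 12 -o graph

import numpy as np
from mvnc import mvncapi as mvnc  # Neural Compute SDK Python bindings (assumed module name)

# Find and open the first attached Neural Compute Stick.
devices = mvnc.EnumerateDevices()
if not devices:
    raise RuntimeError("No Movidius NCS device found")
device = mvnc.Device(devices[0])
device.OpenDevice()

# Load the precompiled graph onto the stick.
with open("graph", "rb") as f:
    graph_blob = f.read()
graph = device.AllocateGraph(graph_blob)

# Run inference: the Myriad 2 VPU works on half-precision (float16) tensors.
input_tensor = np.random.rand(224, 224, 3).astype(np.float16)  # placeholder image
graph.LoadTensor(input_tensor, "user-object")
output, _ = graph.GetResult()
print("Top class index:", int(np.argmax(output)))

# Clean up.
graph.DeallocateGraph()
device.CloseDevice()
```

The key point of the design is that the host only moves data and reads back results; the CNN itself executes entirely on the stick's VPU.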

Machine intelligence development is basically composed of two stages: first, training an algorithm on large sets of data, and second, running the trained algorithm in an application that needs to interpret real-world data.

This second stage is known as inference, and performing inference at the edge, that is, inside the device itself, brings numerous benefits in terms of latency and power consumption.

So the new NCS from Movidius can act as a neural network accelerator, adding dedicated deep learning inference capabilities to existing computing platforms for improved performance and power efficiency.


Satyendra Pal Singh

Satyendra explores the latest happenings in the tech world and writes stories about those. He likes to play around with the latest gadgets and shares his views through articles. In his free time, you can find him watching movies/TV shows and/or reading books.