
The Intel Neural Compute Stick 2 is Here!

In July last year (2017), the Intel Movidius Neural Compute Stick (139-3655) was launched. This was the world’s first self-contained Artificial Intelligence accelerator available in a USB format, allowing host devices to process deep neural networks at the edge. It gave developers and researchers a low-cost, low-power way of developing and optimising computationally intensive AI vision applications.


This capable Neural Compute Stick is powered by a fully functioning SoC, the Myriad 2 Vision Processing Unit, which according to its specifications is capable of 100 GFLOPS of performance while consuming just 1W of power, amongst its other talents. The NCS certainly proved its worth, paving the way for vision-rich applications that are traditionally very data heavy. When combined with the OpenVINO Toolkit, the NCS was unbeatable for prototyping and for rapidly bringing computer vision and AI to IoT and edge devices.

You can read how Andrew Back got on with the original Intel Neural Compute Stick here.

 

Introducing The New Intel Neural Compute Stick 2 (181-1851)


Featuring the same form factor as its older brother, the Intel Neural Compute Stick 2 has been designed with even more potent power to make those data-intensive AI and vision applications even easier to perform. The Neural Compute Stick 2 is powered by the new Myriad X VPU, delivering best-in-class performance in computer vision and deep neural network inference applications with ultra-low power consumption. Capable of over 4 trillion operations per second (4 TOPS), it puts advanced vision and artificial intelligence applications that much more within reach.

The handy small form factor plugs straight into a laptop, single-board computer or any other platform with a USB port, giving the user access to deep neural network inferencing with minimal expenditure. When used with the optimised OpenVINO Toolkit, it amplifies the abilities of its older sibling for an even more powerful plug-and-play experience.
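To give a feel for that plug-and-play workflow, here is a minimal sketch of running inference on the stick from Python. It assumes the OpenVINO Inference Engine Python API as it appears in later (2020.x-era) releases, so class and method names may differ slightly between toolkit versions, and the IR file names and test image are placeholders for a model you have already converted with the Model Optimizer.

```python
# Minimal sketch: synchronous inference on the NCS2 via the OpenVINO
# Inference Engine Python API (2020.x-era names; earlier releases differ).
# "model.xml"/"model.bin" and "test.jpg" are placeholder file names.
import cv2
from openvino.inference_engine import IECore

ie = IECore()
net = ie.read_network(model="model.xml", weights="model.bin")
exec_net = ie.load_network(network=net, device_name="MYRIAD")  # the NCS2

# Reshape an input frame to the NCHW layout the network expects.
input_blob = next(iter(net.input_info))
n, c, h, w = net.input_info[input_blob].input_data.shape
frame = cv2.imread("test.jpg")
blob = cv2.resize(frame, (w, h)).transpose((2, 0, 1)).reshape(n, c, h, w)

# Run the inference on the stick and list the output blobs returned.
results = exec_net.infer(inputs={input_blob: blob})
print(list(results.keys()))
```

The same script runs unchanged on other OpenVINO targets simply by swapping the device_name, which is exactly the common-API convenience the toolkit is aiming for.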

Naveen Rao, Intel Corporate Vice President and General Manager of the AI Products Group, had this to say:

“The first-generation Intel Neural Compute Stick sparked an entire community of AI developers into action with a form factor and price that didn’t exist before. We’re excited to see what the community creates next with the strong enhancement to compute power enabled with the new Intel Neural Compute Stick 2.”

The Intel Neural Compute Stick 2 is packed with improved features:

  • Brand new Myriad X VPU delivering over 1 trillion operations per second of DNN inferencing performance
  • 16 programmable 128-bit VLIW SHAVE vector processors
  • Up to 8 times faster on DNNs than the original Neural Compute Stick
  • Enhanced vision accelerators perform tasks such as optical flow and stereo depth, utilising over 20 hardware accelerators without additional computing overhead
  • 2.5 MB of homogeneous on-chip memory with up to 450 GB/s of internal bandwidth
  • New hardware encoders provide 4K resolution support at 30 Hz and 60 Hz frame rates
  • Intel Distribution of OpenVINO Toolkit optimised for Intel Neural Compute Stick 2
  • Supported frameworks: TensorFlow, Caffe
  • Connectivity: USB 3.0 Type-A


Supported Models

Open the VINO

The OpenVINO Toolkit, which has been optimised for the Intel Neural Compute Stick 2, is a comprehensive prototyping and development software package designed to let you quickly and easily develop AI and vision applications. The toolkit includes two sets of optimised models that can expedite development and improve image processing pipelines. You can use these models for development and production deployment without needing to search for or train your own.
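As a rough illustration, and assuming the same Python Inference Engine API as in the earlier sketch, you can confirm that the toolkit can see the stick and fetch one of the bundled pre-trained models with the Open Model Zoo downloader. The model name used here (face-detection-adas-0001) is just one example, and the downloader script's location varies between OpenVINO installs.

```python
# Hedged sketch: check that OpenVINO can see the NCS2, then note how the
# bundled pre-trained models are fetched. The model name and downloader
# invocation below are examples; paths differ between OpenVINO releases.
from openvino.inference_engine import IECore

ie = IECore()
print(ie.available_devices)  # expect something like ['CPU', 'MYRIAD']

# The pre-trained models ship with a downloader script, typically run as:
#   python3 downloader.py --name face-detection-adas-0001 --precisions FP16
# FP16 is the precision the Myriad X VPU expects; the resulting .xml/.bin
# pair can then be loaded with ie.read_network() as in the earlier example.
```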

OpenVINO features:

  • Enables CNN-based deep learning inference on the edge.
  • Supports heterogeneous execution across Intel's CV accelerators, using a common API for the Intel Neural Compute Stick 2
  • Speeds time-to-market through an easy-to-use library of CV functions and pre-optimised kernels
  • Includes optimised calls for CV standards, including OpenCV, OpenCL and OpenVX

 

Click here to order the new Intel Neural Compute Stick 2

 

Further Reading:

Neural Compute Stick and Tensorflow

AI Powered Identification

 

Countless years taking things to bits to see how they tick...now Fighting the good SEO & content battle at Kempston Controls! Level 191...get in!