While Elon Musk is going around the world warning people about the implications of unregulated Artificial Intelligence development, the tech world is adopting the technology faster than ever. From the cloud to the devices in our hands, AI is advancing at an unprecedented pace. The latest example of innovation in the space comes from chipmaker Movidius.

The company, which was acquired by Intel last year, has launched a Neural Compute Stick that can add a machine vision processor to any device. Powered by the same low-power, high-performance Movidius Vision Processing Unit (VPU) used in millions of smart security cameras, gesture-controlled drones, and industrial machine vision equipment around the world, the Neural Compute Stick is a tiny, fanless deep learning device that anyone can use to learn AI programming at the edge.

To put it simply, the Movidius Neural Compute Stick is a plug-and-play USB stick that manufacturers can use to add AI capabilities to their products.

The Neural Compute Stick has been in the works for several years now. Announced by the company last year as Fathom, the stick took so long to see the light of day because of Movidius' sudden acquisition by Intel last September. The acquisition process put Fathom on the back burner for a long time before it was finally launched this summer.

The Neural Compute Stick is designed to help democratize the machine intelligence space and accelerate an age of ubiquitous intelligent devices and systems. Through its software and hardware tools, the Neural Compute Stick brings machine intelligence and AI out of the data center and into end-user devices.

People who have used the previous version of the Compute Stick will find the new stick technically almost identical to the old one. The Movidius Neural Compute Stick has the Myriad 2 Vision Processing Unit (VPU) at its heart. Myriad 2 is a low-power processor that uses twelve parallel cores to run vision algorithms such as facial recognition and object identification. According to the company, the stick is capable of delivering more than 100 gigaflops of performance, and can natively run neural networks built using the Caffe framework.
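To put that 100-gigaflops figure in perspective, a quick back-of-envelope calculation gives a theoretical ceiling on inference throughput. The per-inference cost used below (~2 GFLOPs per forward pass, roughly the scale of a mid-size image classification network) is an assumption for illustration, not a measured figure:

```python
# Rough upper bound on inferences per second for a ~100 GFLOPS device.
# Both numbers below are illustrative assumptions, not benchmarks.
device_gflops = 100.0      # claimed peak: "more than 100 gigaflops"
net_gflops_per_pass = 2.0  # assumed cost of one forward pass through the network

# Theoretical ceiling; real-world throughput will be lower due to
# memory bandwidth, data transfer over USB, and imperfect utilization.
max_inferences_per_sec = device_gflops / net_gflops_per_pass
print(max_inferences_per_sec)
```

In practice the USB link and memory traffic keep actual throughput well below this ceiling, but the arithmetic shows why a 100-gigaflops device is plausible for real-time vision workloads.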

One of the main changes in the new version of the stick is an aluminium body instead of plastic. Further, Movidius has been able to bring the price down from $99 to $79. According to Movidius, Intel's involvement in the project and its manufacturing capability helped make these changes possible.

Who Can Use the Movidius Neural Compute Stick?



The stick can be used by AI researchers as an accelerator. They can plug it into their computers to get more local compute power when designing and training new neural networks. According to Movidius, multiple Neural Compute Sticks can be chained together, boosting the computer's performance linearly with each stick added.
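Movidius's linear-scaling claim amounts to simple arithmetic, sketched below. The per-stick throughput figure is a hypothetical placeholder, not a measured benchmark:

```python
# Sketch of the claimed linear scaling when chaining multiple sticks.
# per_stick_fps is a hypothetical figure used purely for illustration.
per_stick_fps = 10.0

def chained_throughput(num_sticks: int) -> float:
    """Aggregate inference throughput if performance scales linearly per stick."""
    return num_sticks * per_stick_fps

print(chained_throughput(1))  # one stick
print(chained_throughput(4))  # four sticks: 4x the single-stick throughput
```

Linear scaling assumes each stick processes an independent slice of the workload, which is the usual model for chaining inference accelerators.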

The stick can also prove useful for companies looking to build AI capabilities into a physical product. It gives these companies an easy way to run their neural networks locally.

Of course, the device has some limitations. For example, a company manufacturing AI-powered security cameras has far more efficient ways to incorporate a specialised processor than going the Movidius Neural Compute Stick route. Likewise, a researcher training new neural networks may get results faster by buying the latest graphics cards or renting processing power in the cloud than by using the Neural Compute Stick.

But one thing is certain: a device like the Movidius Neural Compute Stick makes artificial intelligence a far more accessible resource than it has ever been.

If you wish to give AI a try and land yourself your own personal Movidius Neural Compute Stick, you can buy it here. People living in India can score themselves the stick here.
