With the new Microsoft-designed AI coprocessor, the HoloLens won't have to rely on the cloud to analyze visual data.
The second version of Microsoft's HoloLens, the firm's "mixed-reality" headset, will feature an on-board AI coprocessor designed by Microsoft itself, Microsoft's AI lead Harry Shum announced in a keynote address at CVPR 2017 on Sunday.
The coprocessor will be built into the HoloLens's multiprocessor, known as the Holographic Processing Unit (HPU), according to a Microsoft blog post. With the AI coprocessor analyzing visual data on the device itself, the headset will no longer depend on the cloud for that processing, boosting its edge computing capabilities and opening it up to new use cases.
Previous processing options were tethered to the cloud, the post said. Adding the AI coprocessor on board untethers the HoloLens completely and lowers latency for visual input tasks such as hand tracking.
According to Shum's address, the AI coprocessor will be able to implement deep neural networks natively, with some flexibility as well: the chip will support multiple layer types, all programmable by Microsoft, the post said.
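The post doesn't describe the chip's interface, but the general idea of a hardware-supported, fixed set of layer types that can be composed and reconfigured into different networks can be sketched in plain Python. Everything below is invented for illustration; it is not Microsoft's HPU API:

```python
import math

# Each "layer type" is a factory that returns a configured layer function,
# mimicking a fixed menu of supported operations whose parameters can be
# reprogrammed without changing the hardware's capabilities.

def dense(weights, biases):
    """Fully connected layer: y_i = sum_j(W[i][j] * x[j]) + b[i]."""
    def run(x):
        return [sum(w * xi for w, xi in zip(row, x)) + b
                for row, b in zip(weights, biases)]
    return run

def relu():
    """Rectified linear activation: clamps negatives to zero."""
    def run(x):
        return [max(0.0, v) for v in x]
    return run

def softmax():
    """Normalizes scores into probabilities (numerically stabilized)."""
    def run(x):
        m = max(x)
        exps = [math.exp(v - m) for v in x]
        total = sum(exps)
        return [e / total for e in exps]
    return run

def network(layers):
    """Composes configured layers into a single forward pass."""
    def run(x):
        for layer in layers:
            x = layer(x)
        return x
    return run

# A tiny two-class "classifier" assembled from the supported layer types.
net = network([
    dense([[1.0, -1.0], [0.5, 0.5]], [0.0, 0.0]),
    relu(),
    softmax(),
])
print(net([2.0, 1.0]))  # two probabilities summing to 1.0
```

The point of the sketch is the separation of concerns: the set of layer types is fixed (as it would be in silicon), while the networks built from them remain fully reprogrammable.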
The new chip will also be powered by the HoloLens's on-board battery, in keeping with Microsoft's untethered theme for the device.
By processing data on the headset itself, the new AI chip makes the HoloLens a more powerful device for edge computing. In addition to improving graphics rendering for VR and AR, leading to a better user experience, edge computing could also lead to deeper explorations of autonomous vehicles and more efficient robotics for manufacturing. As such, networking firms like AT&T are investing heavily in edge computing capabilities.
Outside of VR and AR, the Internet of Things (IoT) also stands to benefit heavily from edge computing advances. Real-time analytics are easier to provide directly to managers and on-the-ground employees, as the data doesn't have to be shipped off to be analyzed.
In addition to the new chip, rumors suggest the next HoloLens will have a new form factor as well. A prototype developed by Microsoft uses normal-sized glasses to project holographic images onto a surface, producing a view similar to that of AR and VR helmets.
The 3 big takeaways for TechRepublic readers
- Microsoft's second HoloLens will have a dedicated AI coprocessor that will handle the visual data analysis on board, untethering the device from the cloud.
- The chip will be able to implement deep neural networks and support multiple layer types, which Microsoft will be able to program.
- The new coprocessor improves the HoloLens's capabilities in edge computing, which will also stand to impact IoT, robotics, and more.