A few years ago we had the chance to work on a kiosk application that was gesture controlled via Leap Motion. The gestures were a great way to interact with the kiosk, but the technology was very early and had its limitations. This morning I saw the next evolution of this technology and felt I needed to share it with you. Project Soli is a new product from Google's Advanced Technology and Projects (ATAP) division.
Soli is an innovative new device that enables touchless interaction by using radar to detect gestures. It is capable of detecting sub-millimeter motion at high speed and with great accuracy, even through most materials.
Through intuitive gestures, you can control any application in a new and natural way. Soli's gesture tracking lets natural movements simulate real-world controls. Imagine rubbing your fingers together to turn a volume knob, sliding your finger to move a slider, or bringing your fingers together to press a button.
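To make the knob idea concrete, here is a minimal sketch of how a gesture pipeline might map tiny fingertip displacements onto a volume level. Everything here is hypothetical: the function names and the assumption that the sensor reports displacement deltas in millimeters are illustrative, and none of this comes from the actual Soli SDK.

```python
# Hypothetical sketch: turning "finger rub" displacement deltas into a
# clamped volume level, like rotating a virtual knob.
# These names and units are assumptions, not the real Soli API.

def apply_rub_deltas(volume, deltas_mm, sensitivity=2.0, lo=0, hi=100):
    """Accumulate sub-millimeter displacement deltas (in mm) into a
    volume level, clamped to the [lo, hi] range."""
    for d in deltas_mm:
        volume += d * sensitivity  # scale raw motion into volume units
    return max(lo, min(hi, volume))

# Example: starting at 50, small forward rubs raise the volume,
# and a slight backward rub lowers it.
level = apply_rub_deltas(50, [1.5, 2.0, -0.5])
print(level)  # 56.0
```

The clamp at the end matters in practice: sub-millimeter sensing is sensitive enough that unbounded accumulation would quickly overshoot the control's range.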
Imagine bringing this technology together with VR to create an engaging training or simulation tool. How useful would this be to future pilots learning to fly, as they practice sitting in a simulated cockpit and actually interact with all the switches, knobs, and dials?
How else could you integrate Soli into your existing products to improve accessibility or efficiency through more natural interaction? Let me know in the comments below.