A new technology based on ultrasound enables smartphones to recognize touchless hand gestures.
Created by Norwegian startup Elliptic Labs, the technology, which lets users control a smartphone or tablet much as they would play a video game with Microsoft's Kinect motion-sensing system, is being showcased at this year's CEATEC in Japan.
Although it is only at the proof-of-concept stage, the technology can respond to movements as small as one millimeter and can recognize hand gestures across 180° at distances of up to 50 cm from the screen. But what's most exciting is that Elliptic Labs claims it has already formed partnerships with a number of Asian handset manufacturers who are looking at building the company's ultrasound chip into phones and tablets that could come to market as early as next year.
There is little doubt that gesture control is going to be one of the key computing interfaces of the future and that, along with voice recognition, it will probably replace the keyboard, the mouse and maybe even the touchscreen. Look at the attention and headlines that companies such as Leap Motion and Thalmic Labs are generating with their motion-sensing technologies, and the way Microsoft brought a new dimension to gaming when it launched the first Xbox Kinect controller.
What makes Elliptic Labs' technology so interesting is that it uses ultrasound rather than infrared sensors (like the Kinect) or cameras (like both the Kinect and Leap Motion). It is therefore much simpler and cheaper to build into consumer electronics devices, and it avoids the other technologies' drawbacks.
For instance, infrared sensors are easily overwhelmed by sunlight when used outdoors, while cameras strain batteries and have a limited focal range, meaning that recognizing an air gesture in space would require several cameras mounted at different positions to measure both position and depth.
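The article does not describe how Elliptic Labs' chip actually works, but ultrasonic ranging in general relies on echo time-of-flight: a speaker emits an inaudible pulse, a microphone picks up the reflection from the hand, and the round-trip time gives the distance. A minimal sketch of that principle in Python, with all names and figures my own illustrative assumptions rather than anything from the company:

```python
# Toy illustration of ultrasonic echo time-of-flight ranging.
# This is NOT Elliptic Labs' algorithm, just the textbook principle:
# distance = (round-trip time x speed of sound) / 2.

SPEED_OF_SOUND_M_PER_S = 343.0  # in dry air at roughly 20 degrees C

def echo_distance_cm(round_trip_time_s: float) -> float:
    """Distance in cm to a reflecting hand, given the echo's round-trip time."""
    return (round_trip_time_s * SPEED_OF_SOUND_M_PER_S / 2.0) * 100.0

# A hand near the claimed 50 cm maximum range returns an echo in about 2.9 ms:
print(round(echo_distance_cm(0.0029), 1))  # roughly 49.7 cm
```

At these scales the timing requirements are modest: moving a hand by one millimeter changes the round-trip time by only a few microseconds, which hints at why a dedicated chip, rather than a camera, can track such fine movements cheaply.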