Google has been granted approval from the FCC for a new radar-based technology that could allow users to control their smartphone without touching the device at all. The new motion sensor tech is part of Project Soli. Are in-air gestures the future of smartphone operation?
Google needed FCC approval to continue testing its new type of motion sensor, since it uses power levels higher than those currently allowed. Google asked for permission to operate within the 57 to 64 GHz frequency band, but was granted approval only at power levels slightly above the current limit, rather than the full range it requested. Despite the decision, Google says it can still get Soli to work.
Now that approval has been granted, Mountain View can continue with the project, which aims to "enable touchless control of device functions or features, which can benefit users with mobility, speech and tactile impairments." The technology could allow users to swipe in mid-air, or click by touching a thumb and index finger together, even when their hand is some distance from the smartphone.
The radar also works through clothing and fabrics, meaning you could easily operate your device whilst wearing gloves, or even when it is in a bag or pocket, without having to take it out.
We have seen this kind of motion gesture navigation technology before, but never like what Google is doing with Project Soli. Bixi, a portable driver safety companion that senses in-air hand gestures to control your smartphone, can make calls and send messages, but it is based on optical tech, not radar, and its working range is limited to between 3 cm and 20 cm.
Google is still in the testing phase with Project Soli, so it could be some time before we actually see this technology come to market. Some users will also be concerned about the amount of metadata a radar sensor such as this would allow Google to collect.
What do you think about the possibility of navigating through your phone with in-air gestures? Let us know.