Google’s Radar-Based Gesture Control for Smartwatches
Google has come up with a unique solution to the problem of interacting with a smartwatch’s tiny display.
The company is fitting a tiny radar system into a smartwatch that can detect various hand gestures and perform corresponding functions. The idea comes from Project Soli, which focuses on improving the way we interact with wearables. Project Soli was introduced last year at Google I/O and is part of the company’s Advanced Technologies and Projects (ATAP) division. The concept is quite simple: put a radar system into a wearable so it can detect various hand gestures.
Google rolled out an alpha developer kit last year, and the company is working with around 60 developers to come up with different implementations. A Google representative recently showed off how Project Soli works on a customized LG Urbane smartwatch. When you bring your hand close, the smartwatch detects it and switches to the menu screen; when you move your hand away, it switches back to the standard watch face. With a simple flick of your fingers you can also scroll through various functions on the smartwatch.
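The demo’s logic boils down to mapping detected gestures to screen states. The following is a minimal, purely illustrative sketch of how an app might route such gesture events; the event names, thresholds, and classes here are assumptions for illustration, not the actual Soli API.

```python
# Hypothetical sketch: routing Soli-style gesture events to watch
# screen states. All names and thresholds are illustrative; the real
# Soli developer kit exposes its own interfaces.
from dataclasses import dataclass


@dataclass
class GestureEvent:
    kind: str           # e.g. "approach", "retreat", "finger_flick"
    distance_cm: float  # estimated hand distance from the sensor


class WatchController:
    def __init__(self):
        self.screen = "watch_face"

    def handle(self, event: GestureEvent) -> str:
        # Hand brought close: switch to the menu screen.
        if event.kind == "approach" and event.distance_cm < 10:
            self.screen = "menu"
        # Hand moved away: fall back to the standard watch face.
        elif event.kind == "retreat":
            self.screen = "watch_face"
        # A finger flick scrolls through menu items.
        elif event.kind == "finger_flick" and self.screen.startswith("menu"):
            self.screen = "menu_next_item"
        return self.screen


watch = WatchController()
print(watch.handle(GestureEvent("approach", 5.0)))    # menu
print(watch.handle(GestureEvent("finger_flick", 5.0)))  # menu_next_item
print(watch.handle(GestureEvent("retreat", 40.0)))    # watch_face
```

The same dispatch pattern would apply to the speaker demo described next, just with play/pause/skip actions instead of screen changes.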
The representative also showed Project Soli implemented on a JBL speaker. When you move your hand close to the speaker it lights up, and when you move away it goes into sleep mode. Using simple hand gestures close to the speaker, you can play or pause songs, or skip tracks.
Ivan Poupyrev of ATAP said:
“We’ve developed a vision where the hand is the only controller you need. One moment it’s a virtual dial, or slider, or a button.”
Project Soli certainly seems polished and almost ready for a commercial release this year. But these are only prototypes, and Google and its partners are still working through issues such as power consumption. Other potential applications include object identification, 3D imaging, and even remote in-car controls, which could be a huge boon for drivers.
As far as smartwatches are concerned, their tiny displays can be a nightmare for people with weak eyesight or large thumbs, and Project Soli looks quite close to solving that problem. Other solutions in the works include Samsung’s idea of beaming an interactive display onto users’ hands, and ‘SkinTrack’, which can turn the user’s lower arm into a touchpad.
Keep visiting here for more stuff like this 🙂
Thank you for your patience 🙂