Google Soli

Interacting with Digital Signage using Google Soli

Google has unveiled a project that is bound to make software developers stand up and take notice. The new "Soli" platform can be seen as the next stage of a more "futuristic" tomorrow: one where the user can merely move their hands in the air to interact with digital signage, mobile devices and other types of hardware.

The hardware behind Soli is hidden in a small chip that can be installed into a variety of devices. It uses a miniature radar system operating in the 60GHz ISM band, which can detect touchless gestures and movement. Imagine a concept where interacting with an image is achieved simply by rubbing a finger and thumb together. That is Soli as a concept.
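To make the finger-rub idea concrete, here is a toy sketch of how such a micro-gesture might be told apart from a plain swipe. This is not the Soli API (which is not publicly documented here); it assumes only that the radar yields a stream of radial-velocity samples, and that a rub shows up as rapid back-and-forth motion while a swipe moves steadily in one direction.

```python
def classify_gesture(velocities, min_reversals=4):
    """Label a window of radial-velocity samples (m/s) from a
    hypothetical radar front end.

    A slide moves steadily in one direction; a finger rub
    oscillates, so we count sign reversals between consecutive
    samples to tell them apart.
    """
    reversals = sum(
        1 for a, b in zip(velocities, velocities[1:])
        if a * b < 0  # velocity changed direction between samples
    )
    if all(abs(v) < 0.01 for v in velocities):
        return "idle"
    return "rub" if reversals >= min_reversals else "slide"


print(classify_gesture([0.0] * 8))                           # → idle
print(classify_gesture([0.2, 0.3, 0.25, 0.3, 0.2]))          # → slide
print(classify_gesture([0.2, -0.2, 0.3, -0.3, 0.2, -0.2]))   # → rub
```

A real pipeline would of course work on range-Doppler frames and a trained classifier rather than a hand-written threshold, but the sketch captures the basic intuition: different micro-gestures leave different motion signatures in the radar signal.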

[Image: Various Soli prototypes from 2014 and 2015]

In a video released by Google, there are examples of using Soli in object recognition, 3D imaging, predictive drawing, in-car remote, security, visualisation, and even music.

Soli is said to be so accurate because it is purpose-built for these kinds of interactions. Although it is difficult to judge the real performance of Soli from these types of demonstrations, it does look like this simple 8mm x 10mm package (which includes all the sensors and antennas) can offer some highly accurate interactive usage.


As with most experimental projects, there is no way of telling when we can expect Soli to impact the market in a big way, but for those interested in working with Soli, the URL below offers an official newsletter signup and more demonstrations of usage.

https://atap.google.com/soli/