Google Soli, the radar that wants to analyze your body language

A computer capable of perceiving your movements and behavior, and adapting to them to serve you better: this is the latest project from Google's Advanced Technology and Projects (ATAP) division, presented on March 1. For those a bit worried about their data, or who have watched too many dystopias about rebellious artificial intelligences, rest assured, a little: your computer will not watch you through a camera, but through a radar device.

Soli, the heart of the project

Google has been working since 2015 on a radar system capable of tracking user movements. The Soli sensor has already seen practical use: it was integrated into the Mountain View company's smartphone, the Pixel 4, where it let users activate snooze mode or pause music with a wave of the hand.

Judged too expensive, the feature was not kept for the Pixel 5. It is still used in the second-generation Nest Hub smart display, where the Soli sensor can monitor a person's sleep without any device touching the body, by tracking chest movements.

Google wanted to reverse Soli's operating logic. Instead of having the user perform a specific gesture to trigger a specific feature, the connected device will interpret their expectations from their body language. Leonardo Giusti, Head of Design at ATAP, summarizes: "We believe that as technology becomes more present in our lives, it's only right to start asking it to take a few more cues from us."

To achieve this, the radar must capture subtle cues, such as face orientation and body orientation, and make them relevant to the use of the device it is linked to. ATAP's researchers say they drew on the concept of proxemics, the study of how individuals relate to one another depending on the distance between them, and adapted it to a machine. An AI processes all the data so the computer can make sense of it.

[GIF: a man moving toward and away from a computer]

Demonstration of the principle of proxemics between a machine and a human. Credit: Wired/Google
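To give a rough idea of what proxemics means for a machine, the distance reported by a radar could be mapped to zones, each triggering a different device behavior. This is a minimal sketch: the zone names, thresholds, and behaviors are illustrative assumptions, not Google's actual parameters.

```python
# Hypothetical sketch: mapping a radar distance reading to proxemic zones.
# Zone names and thresholds are illustrative assumptions, not Google's values.

def proxemic_zone(distance_m: float) -> str:
    """Classify the user's distance from the device into a proxemic zone."""
    if distance_m < 0.5:
        return "personal"      # user is engaging directly with the device
    elif distance_m < 1.5:
        return "social"        # user is nearby and may want glanceable info
    elif distance_m <= 3.0:
        return "ambient"       # within Soli's reported ~3 m range, not engaged
    else:
        return "out_of_range"

# A device could then adapt its behavior per zone:
BEHAVIOR = {
    "personal": "show full interface",
    "social": "show glanceable summary",
    "ambient": "dim screen, mute notifications",
    "out_of_range": "sleep",
}

zone = proxemic_zone(0.3)
print(zone, "->", BEHAVIOR[zone])
```

The point is not the thresholds themselves but the principle: distance becomes an input the interface reacts to, without any camera.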

The Google team has been working on the project for a year. The designers explain that they took advantage of the lockdowns to train the algorithm themselves, at home, during everyday use, varying their movements to interact with Soli-equipped devices.

Among the use cases demonstrated: a video that pauses when the user is no longer in front of the screen, a message that appears when the device is looked at insistently, or, conversely, notifications that are silenced when the user leaves their workspace. For Lauren Bedal, in charge of interaction design, "The idea is that computers can fade into the background and only help us at the right time."
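The demonstrated behaviors above amount to simple rules triggered by presence and attention events. Here is a minimal sketch of that kind of rule, assuming hypothetical event names and device actions (none of these identifiers come from Google):

```python
# Hypothetical sketch of the rules described above: a device reacting to
# presence/attention events inferred from radar. All names are assumptions.

def react(event: str, state: dict) -> list[str]:
    """Return the actions a device might take for a given presence event."""
    actions = []
    if event == "user_left_screen" and state.get("video_playing"):
        actions.append("pause_video")
        state["video_playing"] = False
    elif event == "user_gazing":
        actions.append("show_pending_message")
    elif event == "user_left_workspace":
        actions.append("mute_notifications")
    return actions

state = {"video_playing": True}
print(react("user_left_screen", state))   # the video pauses
print(react("user_left_workspace", state))  # notifications go quiet
```

Note how the first rule also illustrates the ambiguity raised later in the article: the same "user left" event fires whether the viewer has walked away or simply wants the video playing in the background.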

Entrusting so much daily data to Google?

Technically, several unanswered questions remain. For those who like to leave a video playing in the background while doing something else, automatic pausing could be annoying: a concrete example of everyday behavior that could trip up this technology. Another concern is that Soli's range is three meters, which would require a mesh of sensors to cover a living room. On the other hand, compared with a camera, radar offers lower latency, works in the dark, and is not disturbed by sound or temperature.

This prospect raises the big question of privacy. As Wired reminds us, Google lives by monetizing our data. Handing over so much to Mountain View is likely to put many people off.

The ATAP team insists that radar is one of the most privacy-friendly technologies: no images means no identification. It nonetheless amounts to collecting daily data on users' habits. Google's researchers are not there yet; their work is still ongoing. It offers interesting perspectives on future technologies, provided you know all the ins and outs.
