Google sleep sensing

ML at the heart of motion sense. Latest use: Sleep sensing

Google’s Project Soli used machine learning to enable its Motion Sense technology. With the Pixel 4 discontinued, the technology now powers Sleep Sensing on Google’s Nest Hub.

When Google introduced Project Soli, the tech ecosystem was excited and eagerly awaited its outcome. Motion Sense was intriguing: you could perform a range of activities, such as snoozing alarms and changing songs, without touching your phone. It debuted on the Pixel 4, and then the buzz died down, even as one big tech company or another kept trying its hand at motion sensing. Now Google has found its latest application for the technology: Sleep Sensing on the new Nest Hub.

Soli is a miniaturized radar small enough to fit into a wristband or smartwatch, with the ability to understand and interpret human motions ranging from the tap of a finger to full body movements. Developing the product through its entire lifecycle, from conceptualization and early-stage prototypes to integration into devices, took Google five years. It was first introduced in Google's Pixel 4.

Backed by machine learning

The tech giant stated in the project description that the data collection pipeline was custom built, which in turn helped the team design a robust ML model. The model helps Soli understand and interpret possible movements and motion. Google states that on the Pixel 4 the model runs on the device, without sending sensor data back to its servers, and interprets motion as Quick Gestures.
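To make the on-device idea concrete, here is a minimal sketch, not Google's actual model: assume each radar frame has already been reduced to a small feature vector (range, velocity, energy are illustrative choices), and a lightweight classifier maps it to a Quick Gesture locally, so no sensor data leaves the device. A nearest-centroid classifier stands in for the trained ML model.

```python
import math

# Hypothetical per-gesture centroids, as if learned offline from labeled
# radar features (range_m, velocity_mps, energy). Numbers are illustrative.
CENTROIDS = {
    "swipe": (0.30, 1.0, 0.8),
    "tap": (0.25, 0.2, 0.5),
    "no_gesture": (0.0, 0.0, 0.0),
}

def classify(features):
    """Return the gesture whose centroid is closest to this frame's features."""
    return min(CENTROIDS, key=lambda g: math.dist(features, CENTROIDS[g]))

# A frame with moderate range, high velocity, and high energy looks like a swipe:
print(classify((0.28, 0.9, 0.7)))  # -> swipe
```

The real model is far more sophisticated, but the design constraint is the same: inference small enough to run on the phone itself.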

Soli's radar emits electromagnetic waves. A human hand scatters the energy and reflects a portion back to the radar antenna. The reflected signal, through its frequency shift, time delay, and energy, carries information about the object, including its size, shape, material, orientation, distance, and velocity. Soli then processes the captured signal variations and characteristics to differentiate movements and understand the object.
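The mapping from reflected-signal properties to distance and velocity follows standard radar equations. A short sketch, using textbook formulas rather than anything Soli-specific (the 60 GHz carrier matches the band Soli operates in):

```python
C = 3.0e8         # speed of light, m/s
F_CARRIER = 60e9  # Soli operates in the 60 GHz band

def range_from_delay(time_delay_s):
    """Round-trip time delay -> target distance: R = c * tau / 2."""
    return C * time_delay_s / 2

def velocity_from_doppler(doppler_shift_hz):
    """Doppler frequency shift -> radial velocity: v = f_d * c / (2 * f_c)."""
    return doppler_shift_hz * C / (2 * F_CARRIER)

# A hand 30 cm away returns the wave after a 2 ns round trip:
print(range_from_delay(2e-9))        # -> 0.3 (meters)
# A 400 Hz Doppler shift at 60 GHz corresponds to a hand moving at:
print(velocity_from_doppler(400.0))  # -> 1.0 (m/s)
```

The high carrier frequency is what makes fine gestures visible: even small hand speeds produce measurable Doppler shifts.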

The chip follows two architectural designs, FMCW (frequency-modulated continuous-wave) radar and DSSS (direct-sequence spread-spectrum) radar, both of which integrate the entire radar system into the miniaturized package, enabling 3D tracking and imaging, Google states in its description of the project.

The Federal Communications Commission granted Google a waiver, finding that Soli sensors pose minimal potential for harmful interference to other spectrum users. It also ruled that Soli would serve the public interest by providing innovative, touchless device-control features using hand gestures.

Sleep sensing with Nest Hub

Google introduced Motion Sense with the Pixel 4 range. The Pixel 4 was discontinued within a year of launch, replaced by the Pixel 4A, which shipped without the Soli chip behind Motion Sense. Despite anticipation, the Pixel 5, introduced in late 2020, did not include the Soli chip either.

But with the new Nest Hub, Google has brought back motion sensing through what it calls Quick Gestures. Much like on the Pixel 4, you can snooze an alarm or change songs by waving your hand in front of the screen or tapping the air, all without touching the device. The Sleep Sensing feature was trained with more than 100,000 hours of sleep data, analyzed with TensorFlow.

The Soli chip in the Nest Hub detects motion and analyzes the characteristics of your movement. It can detect whether you are sleeping soundly, have woken up, or are climbing into or out of bed. The microphone, light, and temperature sensors all play an important role in identifying disturbances within the room. Google states that the feature tracks the duration, consistency, and restfulness of your sleep. Because it uses no camera, the Nest Hub captures no pictures during sleep sensing, relying instead on motion sense to track sleep quality.
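As a rough illustration of how the inputs the article names could feed a nightly summary, here is a toy sketch. The scoring formula and thresholds are invented for this example; Google has not published the Nest Hub's actual logic.

```python
def summarize_night(minutes_asleep, motion_events, sound_events, light_events):
    """Combine motion-, sound-, and light-based disturbance counts into a
    simple duration/restfulness summary (illustrative scoring only)."""
    duration_hours = minutes_asleep / 60
    disturbances = motion_events + sound_events + light_events
    # Fewer disturbances per hour asleep -> higher restfulness, floored at 0.
    restfulness = max(0, 100 - 10 * disturbances / max(duration_hours, 1))
    return {"duration_h": round(duration_hours, 1),
            "restfulness": round(restfulness)}

# 7.5 hours of sleep with nine disturbances across all sensors:
print(summarize_night(minutes_asleep=450, motion_events=6,
                      sound_events=2, light_events=1))
# -> {'duration_h': 7.5, 'restfulness': 88}
```

The point of the sketch is the fusion itself: no single sensor decides; the radar supplies movement, while the microphone and light sensor explain what disturbed it.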
