Google researchers have revealed a new AI called SensorLM that learns the “language” of our smartwatch health sensors, bridging the gap between raw data and real-world context.
Ever looked at your smartwatch and wondered what all those numbers really mean? Your device tracks your every step and heartbeat, but it can’t tell you the story behind the data. A heart rate of 150 bpm could mean an energetic run or a horribly stressful work presentation; your watch simply doesn’t know the difference. That’s the gap Google’s SensorLM aims to close.
The biggest challenge was the data itself. To understand the connection between sensor signals and daily life, an AI needs to learn from millions of hours of examples that are pre-labelled with text descriptions. Asking people to manually write down what they were doing for millions of hours of sensor recordings is practically impossible.
So, the team at Google developed a system that automatically creates descriptive captions for the sensor data. This approach allowed them to build the largest known sensor-language dataset in the world, using 59.7 million hours of data from over 103,000 people.
SensorLM learns in two main ways:
- It’s trained to be a great detective through a process called contrastive learning. This teaches it to tell the difference between similar but distinct activities, like correctly identifying a “light swim” versus a “strength workout” from the sensor signals alone.
- It’s trained to be a storyteller through generative pre-training. This is where the AI learns to write its own human-readable descriptions based on what it sees in the complex sensor data (a rough sketch of both training objectives follows below).
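To make those two objectives concrete, here is a minimal, hypothetical sketch in PyTorch: a toy sensor encoder and text encoder aligned with a CLIP-style contrastive loss, plus a small decoder trained to caption the sensor window. Every module name, dimension, and hyperparameter below is an illustrative assumption, not a detail of Google’s actual SensorLM architecture.

```python
# Toy sketch of the two pre-training objectives described above.
# All module names, sizes, and hyperparameters are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ToySensorLM(nn.Module):
    def __init__(self, sensor_channels=8, vocab_size=1000, dim=128):
        super().__init__()
        # Sensor encoder: turns a window of multichannel readings
        # (e.g. heart rate, accelerometer) into a single embedding.
        self.sensor_encoder = nn.GRU(sensor_channels, dim, batch_first=True)
        # Text encoder/decoder over caption tokens.
        self.token_emb = nn.Embedding(vocab_size, dim)
        self.text_encoder = nn.GRU(dim, dim, batch_first=True)
        self.decoder = nn.GRU(dim, dim, batch_first=True)
        self.lm_head = nn.Linear(dim, vocab_size)

    def encode_sensors(self, x):                  # x: (batch, time, channels)
        _, h = self.sensor_encoder(x)
        return F.normalize(h[-1], dim=-1)         # (batch, dim)

    def encode_text(self, tokens):                # tokens: (batch, seq)
        _, h = self.text_encoder(self.token_emb(tokens))
        return F.normalize(h[-1], dim=-1)

    def contrastive_loss(self, x, tokens, temperature=0.07):
        # "Detective" objective: matched sensor/caption pairs should score
        # higher than every mismatched pair in the batch (InfoNCE, CLIP-style).
        s, t = self.encode_sensors(x), self.encode_text(tokens)
        logits = s @ t.T / temperature
        targets = torch.arange(len(s))
        return (F.cross_entropy(logits, targets) +
                F.cross_entropy(logits.T, targets)) / 2

    def generative_loss(self, x, tokens):
        # "Storyteller" objective: predict each caption token conditioned on
        # the sensor embedding, used here to initialise the decoder state.
        h0 = self.encode_sensors(x).unsqueeze(0)             # (1, batch, dim)
        out, _ = self.decoder(self.token_emb(tokens[:, :-1]), h0)
        return F.cross_entropy(self.lm_head(out).transpose(1, 2), tokens[:, 1:])

# Usage with random stand-in data:
model = ToySensorLM()
sensors = torch.randn(4, 60, 8)                  # 4 windows of 60 timesteps
captions = torch.randint(0, 1000, (4, 12))       # 4 tokenised captions
loss = model.contrastive_loss(sensors, captions) + model.generative_loss(sensors, captions)
loss.backward()
```

The contrastive term pulls matched sensor/caption pairs together and pushes mismatched pairs apart, which is what lets a model distinguish a “light swim” from a “strength workout”; the generative term is what later lets it write descriptions on its own.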
When tested on its ability to classify 20 different activities without any task-specific training (a “zero-shot” task), SensorLM performed with remarkable accuracy, while other powerful language models were essentially just guessing.
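Continuing the toy sketch above, zero-shot classification can be done by embedding the sensor window once and comparing it against the text embeddings of the candidate activity labels; the label tokenisation below is a random stand-in, purely for illustration.

```python
# Zero-shot classification with the toy model defined above: score a sensor
# window against each candidate label's text embedding and pick the closest.
# No activity-specific training is involved.
def zero_shot_classify(model, sensor_window, label_token_ids, label_names):
    with torch.no_grad():
        s = model.encode_sensors(sensor_window)           # (1, dim)
        t = model.encode_text(label_token_ids)            # (num_labels, dim)
        scores = (s @ t.T).squeeze(0)                     # cosine similarities
    return label_names[scores.argmax().item()]

labels = ["light swim", "strength workout", "outdoor run", "sleep"]
label_ids = torch.randint(0, 1000, (len(labels), 4))      # stand-in tokenised labels
print(zero_shot_classify(model, torch.randn(1, 60, 8), labels_names := labels if False else label_ids, labels) if False else
      zero_shot_classify(model, torch.randn(1, 60, 8), label_ids, labels))
```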


Beyond just classifying activities, SensorLM can generate accurate summaries. Given nothing but the raw stream of sensor data, it can produce a detailed and coherent description of events. In one example, it accurately detected an outdoor bike ride, a subsequent walk, and a period of sleep, right down to the minute.
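As a rough illustration of how such a summary could be produced from raw sensor input, here is a greedy-decoding sketch on the same toy model; the start/end token IDs are assumptions made for this example, and the real SensorLM decoder is certainly more sophisticated.

```python
# Greedy caption generation with the toy model above: feed the sensor
# embedding in as the decoder's initial state and emit one token at a time.
# Token IDs 0/1 are assumed to stand for <start>/<end> purely for illustration.
def generate_caption(model, sensor_window, max_len=20, start_id=0, end_id=1):
    with torch.no_grad():
        h = model.encode_sensors(sensor_window).unsqueeze(0)   # decoder state
        token = torch.tensor([[start_id]])
        output = []
        for _ in range(max_len):
            out, h = model.decoder(model.token_emb(token), h)
            token = model.lm_head(out[:, -1]).argmax(-1, keepdim=True)
            if token.item() == end_id:
                break
            output.append(token.item())
    return output   # token IDs, to be mapped back to words by a tokeniser

print(generate_caption(model, torch.randn(1, 60, 8)))
```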
The research showed that as the model gets bigger and is trained on more data, its performance just keeps getting better. This opens the door to a future of truly personalised digital health coaches, clinical monitoring tools, and wellness apps that can offer advice through natural conversation.
We are moving past the era of just seeing simple metrics. With innovations like Google’s SensorLM, we are getting closer to a future where wearable devices like our smartwatches can truly understand the language of our bodies and turn a flood of data into personal, actionable insights.
(Photo by Triyansh Gill)
See also: Samsung and Stanford Medicine advance sleep apnea research


Want to learn about the IoT from industry leaders? Check out IoT Tech Expo taking place in Amsterdam, California, and London. The comprehensive event is co-located with other leading events including Cyber Security & Cloud Expo, AI & Big Data Expo, Intelligent Automation Conference, Edge Computing Expo, and Digital Transformation Week.
Explore other upcoming enterprise technology events and webinars powered by TechForge here.