2025-07-14 9:07 PM
Hello ST Team,
We are currently building an AI-powered digital pen using ST MEMS sensors (accelerometer, gyroscope, magnetometer) and the STM32WB05KZ MCU. We have reviewed the st-mems-finite-state-machine repository and are exploring how the embedded FSM core in sensors such as the LSM6DSO32XTR can help us detect gestures such as:
Double tap
Swipe gestures
Repetitive handwriting-like motion (“double write”)
Tilt-based mode switching
We are aiming for a low-power design, where motion recognition is offloaded to the sensor and only specific events are passed to the MCU for further processing.
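To illustrate the event-at-the-sensor pattern we have in mind, here is a minimal host-side sketch of a double-tap state machine. This is only a conceptual illustration with made-up thresholds, not an actual LSM6DSO32X FSM program (those are expressed as register-level op-codes, typically loaded from a .ucf configuration):

```c
#include <stdint.h>
#include <stdbool.h>

/* Hypothetical values for illustration only */
#define TAP_THRESHOLD_MG  1500  /* |accel| spike above this counts as a tap */
#define MAX_GAP_SAMPLES   26    /* second tap must arrive within this window */

typedef enum { S_IDLE, S_FIRST_TAP, S_WAIT_SECOND } fsm_state_t;

typedef struct {
    fsm_state_t state;
    uint32_t    gap;  /* samples elapsed since the first tap ended */
} tap_fsm_t;

void tap_fsm_init(tap_fsm_t *f) { f->state = S_IDLE; f->gap = 0; }

/* Feed one accel-magnitude sample (mg); returns true when a double tap fires. */
bool tap_fsm_step(tap_fsm_t *f, int32_t mag_mg)
{
    bool spike = mag_mg > TAP_THRESHOLD_MG;
    switch (f->state) {
    case S_IDLE:
        if (spike) { f->state = S_FIRST_TAP; }
        break;
    case S_FIRST_TAP:
        /* wait for the first spike to end before arming the window */
        if (!spike) { f->state = S_WAIT_SECOND; f->gap = 0; }
        break;
    case S_WAIT_SECOND:
        if (spike) { f->state = S_IDLE; return true; }       /* double tap */
        if (++f->gap > MAX_GAP_SAMPLES) { f->state = S_IDLE; } /* timeout  */
        break;
    }
    return false;
}
```

In the real design, logic like this would run inside the sensor's FSM core, and the MCU would only wake on the FSM interrupt to read which event fired.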
Could you clarify:
Can the FSM core be extended to detect custom gesture patterns (e.g., two similar handwriting motions)?
Is there an example or methodology to define gestures based on gyro and magnetometer data, in addition to accelerometer signals?
Are there any tools or guidelines for blending the FSM with ML or application-level classification using the sensor together with the STM32WB05KZ?
This would greatly help us validate our approach and optimize power consumption while still supporting intelligent gesture-based features in the AI Pen.
Thank you in advance.
2025-07-14 11:59 PM
I have seen a YouTube video showing something very similar to the pen you are describing. I can't remember the title, but it's on the ST channel, and it demonstrated how to recognize writing, idle, and other gestures. That should help.