Using FSM in ST MEMS for AI Pen Gesture Recognition with STM32WB05KZ
Hello ST Team,
We are currently building an AI-powered digital pen using ST MEMS sensors (accelerometer, gyroscope, magnetometer) and the STM32WB05KZ MCU. We’ve reviewed the st-mems-finite-state-machine repository and are exploring how the embedded FSM core in sensors like LSM6DSO32XTR can help detect gestures such as:
Double tap
Swipe gestures
Repetitive handwriting-like motion (“double write”)
Tilt-based mode switching
We are aiming for a low-power design in which motion recognition is offloaded to the sensor's FSM and only specific gesture events are forwarded to the MCU (e.g., via interrupt) for further processing.
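For context, here is a minimal sketch of the MCU-side event decoding we have in mind. It assumes the FSM interrupt is routed to an INT pin and that the MCU then reads an FSM status byte (e.g., FSM_STATUS_A on the LSM6DSO32X, where bit N flags FSM program N+1); the bit-to-gesture mapping below is purely illustrative and depends on which slot each gesture program occupies:

```c
#include <stdint.h>

/* Hypothetical event IDs for our pen gestures. */
typedef enum {
    PEN_EVT_NONE = 0,
    PEN_EVT_DOUBLE_TAP,
    PEN_EVT_SWIPE,
    PEN_EVT_DOUBLE_WRITE,
    PEN_EVT_TILT_MODE
} pen_event_t;

/* Decode the FSM status byte read after an interrupt.
 * Slot assignment (FSM1..FSM4) is an assumption for illustration;
 * it must match the order the programs are loaded in. */
static pen_event_t decode_fsm_status(uint8_t status)
{
    if (status & 0x01) return PEN_EVT_DOUBLE_TAP;   /* FSM1 */
    if (status & 0x02) return PEN_EVT_SWIPE;        /* FSM2 */
    if (status & 0x04) return PEN_EVT_DOUBLE_WRITE; /* FSM3 */
    if (status & 0x08) return PEN_EVT_TILT_MODE;    /* FSM4 */
    return PEN_EVT_NONE;
}
```

The idea is that the ISR only latches a flag, and the main loop performs the I2C/SPI status read and calls the decoder, so the MCU can stay in a low-power mode between events.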
Could you clarify:
Can the FSM core be extended to detect custom gesture patterns (e.g., two similar handwriting motions)?
Is there an example or methodology to define gestures based on gyro and magnetometer data, in addition to accelerometer signals?
Are there tools or guidelines for combining the FSM with ML or application-level classification across the sensor and the STM32WB05KZ?
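To illustrate the kind of application-level layering we mean: the sensor FSM could flag a single handwriting-like stroke, and the MCU would promote two strokes arriving within a time window into a "double write" event. A minimal sketch (the window length is an assumption we would tune experimentally):

```c
#include <stdint.h>
#include <stdbool.h>

/* Time window within which two stroke events count as a
 * "double write"; illustrative value, to be tuned on hardware. */
#define DOUBLE_WRITE_WINDOW_MS 600u

typedef struct {
    uint32_t last_stroke_ms; /* timestamp of the previous stroke */
    bool     armed;          /* a first stroke is pending */
} double_write_det_t;

/* Feed one stroke event (timestamped in ms); returns true when
 * this stroke completes a double-write pair. */
static bool double_write_feed(double_write_det_t *d, uint32_t now_ms)
{
    if (d->armed && (now_ms - d->last_stroke_ms) <= DOUBLE_WRITE_WINDOW_MS) {
        d->armed = false;  /* pair consumed */
        return true;
    }
    d->armed = true;       /* treat as the first stroke of a new pair */
    d->last_stroke_ms = now_ms;
    return false;
}
```

This keeps the per-sample processing in the sensor while the MCU only handles sparse, timestamped events, which is the power split we are after.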
This would greatly help us validate our approach and optimize power consumption while still supporting intelligent gesture-based features in the AI Pen.
Thank you in advance.
