Air Gesture Recognition using IMU
I want to implement air gesture recognition using accelerometer and gyroscope data.
Different movements should trigger settings changes on the handheld device. Gestures to be recognized are, for example, tapping twice or moving the device up and down twice.
My findings so far:
- The first attempt used the LSM6DSO and its Finite State Machine (FSM), configured with the Unico tool. It basically works, but the reliability is too low.
- The second attempt used the LSM6DSOX and its Machine Learning Core (MLC), also configured with the Unico tool. Reliability is too low in this case as well.
- The third attempt used Qeexo AutoML with the SensorTile.box MCU processing the sensor data, similar to this description: https://community.st.com/t5/mems-and-sensors/how-to-detect-hand-air-gestures-with-the-sensortile-box-and/ta-p/49547
The result was relatively good. However, a single tap already triggers the double-tap event.
My biggest problem is that I don't want to detect continuous signals but specific, time-discrete events. I trained Qeexo AutoML with segmented signals; nevertheless, it triggers on a single tap instead of two. Apparently the classifier has difficulty with the time window it examines.
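To make the problem concrete, here is a minimal sketch (in Python, for offline experimentation) of the event-counting logic I have in mind: detect impulse-like peaks in the acceleration magnitude, debounce them, and only report a double tap when two peaks fall inside one observation window. The sample rate, threshold, and window lengths are assumed tuning values, not measured ones.

```python
def count_taps(samples, fs=100, threshold=2.0, min_gap_s=0.08, window_s=0.6):
    """Count tap-like impulses in an acceleration-magnitude signal.

    samples:    acceleration magnitude per sample (e.g. in g)
    fs:         sample rate in Hz (assumed value)
    threshold:  magnitude above which a sample counts as an impulse (assumed)
    min_gap_s:  debounce time so one physical tap is not counted twice
    window_s:   max spacing between first and last tap of one gesture
    """
    peaks = []
    last = -min_gap_s * fs  # index of the last accepted peak
    for i, v in enumerate(samples):
        if v >= threshold and (i - last) >= min_gap_s * fs:
            peaks.append(i)
            last = i
    # Only report multiple taps if they fit inside one gesture window
    if len(peaks) >= 2 and (peaks[-1] - peaks[0]) / fs <= window_s:
        return len(peaks)
    return min(len(peaks), 1)
```

With a synthetic signal containing two short spikes 0.3 s apart this returns 2, while a single spike returns 1, which is exactly the distinction the trained models fail to make.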
My next idea is to look for a solution with NanoEdge AI Studio or STM32Cube.AI.
Can anyone tell me the best way to reliably detect time-discrete events from 6-axis accelerometer and gyroscope data?
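One direction I am considering, regardless of which tool does the per-window classification: keep the ML model responsible only for detecting a single tap, and add a small temporal state machine on top that promotes two taps within a maximum gap into a "double" event. The class names and the 0.5 s gap below are hypothetical placeholders.

```python
class DoubleEventGate:
    """Promote per-window 'tap' detections into single/double events.

    Hypothetical post-processing stage on top of any classifier output;
    max_gap_s is an assumed tuning value.
    """

    def __init__(self, max_gap_s=0.5):
        self.max_gap_s = max_gap_s
        self.pending_t = None  # timestamp of a first tap awaiting a second one

    def update(self, label, t):
        """Feed the classifier label for the window ending at time t (seconds).

        Returns 'double', 'single', or None (no event yet).
        """
        if label == "tap":
            if self.pending_t is not None and t - self.pending_t <= self.max_gap_s:
                self.pending_t = None
                return "double"
            self.pending_t = t  # remember first tap, wait for a second
            return None
        # No tap in this window: flush a stale pending tap as a single event
        if self.pending_t is not None and t - self.pending_t > self.max_gap_s:
            self.pending_t = None
            return "single"
        return None
```

The price of this design is latency: a single tap is only reported after the gap has expired. Maybe that is acceptable here, but I would be glad to hear whether NanoEdge AI or STM32Cube.AI offers a cleaner built-in way to handle such time-discrete multi-impulse events.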
