Associate II
July 30, 2023
Solved

Air Gesture Recognition using IMU

  • July 30, 2023
  • 7 replies
  • 7741 views

I want to implement air gesture recognition using accelerometer and gyroscope data. 
Different movements should let the user change settings on the handheld device. 
Movements to be recognized are, for example, tilting twice or moving the device up and down twice.

My findings so far:

  • The first attempt used the LSM6DSO and its Finite State Machine (FSM), configured with the Unico tool. It basically works, but the reliability is too low. 
  • The second attempt used the LSM6DSOX and its Machine Learning Core (MLC), also configured with the Unico tool. Reliability is too low in this case as well.
  • The third attempt used Qeexo AutoML with the SensorTile.box MCU processing the sensor data, similar to this description: https://community.st.com/t5/mems-and-sensors/how-to-detect-hand-air-gestures-with-the-sensortile-box-and/ta-p/49547
    The result was relatively good. However, a single tilt already triggers the double-tilt event. 

My biggest problem is that I don't want to detect continuous signals but specific time-discrete events. I trained Qeexo AutoML with segmented signals; nevertheless, it triggers on a single tilt instead of two. Apparently there are difficulties with the time window being examined.
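
One non-ML baseline for exactly this single-vs-double ambiguity is to threshold the signal into discrete peaks and then gate on the inter-peak spacing, so that one peak alone can never fire the event. A minimal sketch; all names, thresholds and sample counts here are illustrative, not taken from any ST tool:

```python
# Sketch: discriminating a double event (e.g. a double tilt) from a single
# one via peak detection plus inter-peak timing. Illustrative values only.

def detect_peaks(signal, threshold, refractory):
    """Return sample indices where |signal| crosses `threshold`,
    ignoring crossings within `refractory` samples of the last peak."""
    peaks = []
    last = -refractory
    for i, v in enumerate(signal):
        if abs(v) >= threshold and i - last >= refractory:
            peaks.append(i)
            last = i
    return peaks

def is_double_event(signal, threshold=1.0, refractory=10,
                    min_gap=15, max_gap=100):
    """A 'double' event needs exactly two peaks whose spacing falls
    inside [min_gap, max_gap] samples; a single peak never triggers."""
    peaks = detect_peaks(signal, threshold, refractory)
    if len(peaks) != 2:
        return False
    gap = peaks[1] - peaks[0]
    return min_gap <= gap <= max_gap

# Example: one spike must not trigger; two well-spaced spikes must.
single = [0.0] * 50
single[10] = 2.0
double = [0.0] * 100
double[10] = 2.0
double[40] = 2.0
```

The same gating logic can sit on top of any per-window classifier output instead of a raw threshold.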

My next idea is to look for a possible solution with NanoEdge AI or STM32Cube.AI.
Can anyone tell me the best way to reliably detect time-discrete events using 6-DoF accelerometer and gyroscope data?

7 replies

MasterT
Lead II
July 31, 2023

Interesting project, "pattern recognition":

https://www.instructables.com/Secret-Knock-Detecting-Door-Lock/

It could quite possibly be extended to 2D or 3D.

PointesAuthor
Associate II
July 31, 2023

Thank you very much for your suggestion. I assume this is the code for it:
https://github.com/darkomen/Arduino/blob/master/Ejemplos/secret_knock_detector/secret_knock_detector.pde

The solution approach for this problem is very similar to the finite state machine (FSM) of the LSM6DSO. A defined event (the first knock) triggers the recording of the knocks, and the knock pattern is then evaluated (number of knocks, timeouts, ...). 
But what if the first knock was unintentional and I still knocked the right pattern afterwards? Then, unfortunately, this solution does not work. For my motion detection it has to work, especially because there is no suitable start event.

For the motion detection, a machine learning approach is necessary.
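
For reference, the interval-matching idea behind the secret-knock detector fits in a few lines; normalizing the inter-knock gaps to the longest one removes the tempo dependence. All names and values below are illustrative, not from the linked Arduino code:

```python
# Sketch of the interval-matching idea behind the secret-knock lock:
# compare recorded inter-knock gaps against a stored template after
# normalizing to the longest gap. Illustrative values only.

def normalize(gaps):
    """Scale gaps so the longest equals 100, removing tempo dependence."""
    m = max(gaps)
    return [round(100 * g / m) for g in gaps]

def matches(template_gaps, candidate_gaps, tolerance=25):
    """True if the candidate has the same number of gaps and each
    normalized gap is within `tolerance` of the template's."""
    if len(template_gaps) != len(candidate_gaps):
        return False
    t = normalize(template_gaps)
    c = normalize(candidate_gaps)
    return all(abs(a - b) <= tolerance for a, b in zip(t, c))

secret = [200, 400, 200]             # ms between knocks in the template
same_rhythm_faster = [100, 200, 100] # same rhythm, played faster
wrong_rhythm = [400, 200, 200]       # different rhythm
```

As the post notes, this still depends on a clean start event, which is exactly what is missing for the air-gesture case.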

MasterT
Lead II
July 31, 2023

Obviously, there should not be any start/stop events. Simply capture the input signal on a regular time base and analyze a sliding window of 3-5 seconds, the maximum password (pass-gesture) length. The pass-gesture can't be too long, since pattern recognition needs to run after each sample taken.

The analysis does not necessarily have to run in the time domain only; the frequency domain is another option, the same way as in voice recognition.

I have no comments on machine learning, since I have no idea how it works (nor do I want to), but I have completed a similar project using an Arduino.

Cross-correlating two patterns (one composed or recorded, the other live) works better if the data is high-pass filtered. More on the topic in the book "The Scientist and Engineer's Guide to Digital Signal Processing".
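
The suggestion above, a high-pass filter followed by a normalized cross-correlation slid over the captured window, could look roughly like this. A pure-Python illustrative sketch with no real sensor data; filter coefficient and template are made up:

```python
# Sketch: high-pass filter the stream, then slide a normalized
# cross-correlation of a recorded template over it.
import math

def high_pass(signal, alpha=0.9):
    """Simple one-pole high-pass filter: removes gravity / slow drift."""
    out, prev_x, prev_y = [], 0.0, 0.0
    for x in signal:
        y = alpha * (prev_y + x - prev_x)
        out.append(y)
        prev_x, prev_y = x, y
    return out

def ncc(a, b):
    """Normalized cross-correlation of two equal-length windows (-1..1)."""
    ma = sum(a) / len(a)
    mb = sum(b) / len(b)
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    den = math.sqrt(sum((x - ma) ** 2 for x in a) *
                    sum((y - mb) ** 2 for y in b))
    return num / den if den else 0.0

def best_match(stream, template):
    """Slide the template over the stream; return (best score, offset)."""
    n = len(template)
    scores = [ncc(stream[i:i + n], template)
              for i in range(len(stream) - n + 1)]
    best = max(range(len(scores)), key=lambda i: scores[i])
    return scores[best], best

template = [0.0, 1.0, 0.0, -1.0, 0.0]       # recorded gesture template
stream = [0.0] * 10 + template + [0.0] * 10  # template embedded at offset 10
```

Triggering on the score crossing a fixed threshold (e.g. 0.8) is then a time-window-free detector, which is what the original question asks for.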

 

niccolò (Best answer)
ST Employee
August 1, 2023

Hi @Pointes ,

when dealing with specific time-discrete events, the FSM is the way to go, so your first attempt was right =)
If you would like to increase the reliability, I would suggest using the LSM6DSOX, so that you can filter the data with the MLC before using the FSM to detect the event.
You can refer to this Application Note and to this code example to understand the process.

If this answers your question, please mark it as "best answer" by clicking "accept as solution", to help the other users of the community.

Niccolò

 

PointesAuthor
Associate II
November 1, 2023

Hi @niccolò,
thank you very much for your advice.

I still have some challenges with it.

I try to detect every back-and-forth movement in the xy-plane, no matter in which direction. To do this with the FSM, I set a threshold on the acceleration amplitude in the x and y directions. But if the movement runs between the x and y axes, it is harder to reach either the x or the y threshold.
Can the LSM6DSOX calculate sqrt(x²+y²)?
Or can it only do sqrt(x²+y²+z²) via V?

My other question is whether it is possible to measure the distance traveled using the built-in FSM, given that the rotated angle can already be measured.

niccolò
ST Employee
November 2, 2023

Hi @Pointes ,
I'm sorry, but it is not possible to use only x and y in the sqrt(), nor to compute the distance traveled by the board.
For the latter, you can try to implement an algorithm in your own fw/sw, but going from acceleration to position is more difficult than going from angular velocity to angle.

Niccolò
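
A host-side sketch of what that answer implies: the planar magnitude sqrt(x²+y²) is trivial on the MCU, angle needs a single integration of the gyro rate, and position needs a double integration of acceleration, which is why any bias blows up. All values below are illustrative:

```python
# Sketch: host-side math the sensor's FSM cannot do. Illustrative only.
import math

def planar_magnitude(ax, ay):
    """sqrt(x^2 + y^2): direction-independent amplitude in the xy-plane."""
    return math.hypot(ax, ay)

def integrate_gyro(omega, dt):
    """Angle from angular rate: a single integration, relatively robust."""
    angle = 0.0
    for w in omega:
        angle += w * dt
    return angle

def integrate_accel(acc, dt):
    """Position from acceleration: a double integration, so a constant
    bias produces an error that grows quadratically with time."""
    v = p = 0.0
    for a in acc:
        v += a * dt
        p += v * dt
    return p
```

For example, a constant accelerometer bias of only 0.01 m/s² accumulates to roughly 0.5 m of spurious displacement after 10 s, while the same relative bias on a gyro only grows linearly, which matches the "angle is easier than distance" point above.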

PointesAuthor
Associate II
November 2, 2023

Hi @niccolò ,
thank you for your message.

The user cannot distinguish between x, y and the directions in between, so the current behavior does not feel right. That is why I need this functionality.

Can I use the Finite State Machine (FSM) and the Machine Learning Core (MLC) and send the raw data to the microcontroller at the same time?
Or should I do all the evaluation on a microcontroller in this case?

niccolò
ST Employee
November 2, 2023

Hi @Pointes ,

sure, you can read raw data while the FSM/MLC are running, no problem; just read the output registers =)

 

Niccolò
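
As a hypothetical illustration of that answer, the sketch below reads the raw accelerometer output registers from the host while the FSM/MLC keep running. The bus is faked with a stub: `read_regs`/`fake_read` stand in for a real I2C/SPI driver, and the register address and sensitivity should be verified against the LSM6DSOX datasheet before use:

```python
# Hypothetical host-side sketch of reading raw accelerometer output.
# Addresses/sensitivity are from the LSM6DSOX datasheet; verify them.

OUTX_L_A = 0x28      # first accel output register (X-axis low byte)
SENS_2G = 0.061e-3   # g per LSB at the +/-2 g full scale

def to_int16(lo, hi):
    """Combine two little-endian bytes into a signed 16-bit value."""
    v = (hi << 8) | lo
    return v - 65536 if v & 0x8000 else v

def read_accel_g(read_regs):
    """Read 6 bytes starting at OUTX_L_A and convert to g per axis."""
    raw = read_regs(OUTX_L_A, 6)
    return tuple(to_int16(raw[i], raw[i + 1]) * SENS_2G
                 for i in (0, 2, 4))

# Fake bus for illustration: X = 0x4000 LSB (about 1 g), Y = Z = 0.
fake = {OUTX_L_A: [0x00, 0x40, 0x00, 0x00, 0x00, 0x00]}
def fake_read(addr, n):
    return fake[addr][:n]
```

The FSM and MLC results live in their own source/status registers (in the embedded-functions register bank), so polling them alongside the raw output is just a few additional reads; the exact addresses are in the LSM6DSOX datasheet and application notes.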

PointesAuthor
Associate II
November 2, 2023

Hi @niccolò ,

this would be a workaround for some gestures.

Is there another sensor that could recognize these gestures and would be better suited for my application?
Maybe the LSM6DSO16IS with its ISPU (intelligent sensor processing unit)?

[screenshot: Pointes_0-1698946773575.png]

It is mentioned that it can even do digital gesture recognition.

[screenshot: Pointes_1-1698946898792.png]

niccolò
ST Employee
November 3, 2023

Hi @Pointes ,

you can try to use the ISPU to get better results, but the digit gesture recognition is linked to the Qvar channel, so you would need a sensor embedding Qvar.

Anyway, with the ISPU you have a lot of flexibility in developing your own neural network, but you have to be an expert in C programming to be able to implement it; automated tools to develop networks are not available at the moment.

Niccolò

p.s.: if you'd like more info on this matter, feel free to open a new thread, so that the other users can follow =)

pratham
Visitor II
September 8, 2024

Can you give me some more details about your project? I am actually planning a similar project, but with different hardware and software.

PointesAuthor
Associate II
September 12, 2024

Hi @pratham,
Since I posted the question here in the forum, I have not continued with the project, but I will pick it up again at the end of this year.
Unfortunately, the LSM6DSOX can't do some basic functions that are essential for me. Therefore, I will probably do the calculations on the microcontroller and only request raw data from the sensor. I don't yet know exactly what this will look like.