2025-06-25 4:06 AM
Hello everyone,
I am currently working on a gesture recognition project using ST's VL53L5CX multi-zone laser distance sensor, and I would like to ask if anyone in the community has experience with related gesture datasets or has worked on similar applications.
So far, I have been able to obtain multi-zone distance data from the sensor using NUCLEO + VL53L5CX in an Arduino environment. However, to train and validate the gesture recognition algorithm, I need to collect a large number of gesture samples.
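For context, on the host side I have the firmware print each frame as 64 comma-separated distances in millimetres over serial (this framing is my own convention, not part of the ST driver), and parse it into an 8x8 array roughly like this:

```python
import numpy as np

def parse_frame(line: str, rows: int = 8, cols: int = 8) -> np.ndarray:
    """Parse one serial line of comma-separated distances (mm) into a 2-D frame.

    Assumes the firmware prints zones in row-major order, e.g.
    "1203,1198,...,640". Raises if the zone count does not match.
    """
    values = [int(v) for v in line.strip().split(",")]
    if len(values) != rows * cols:
        raise ValueError(f"expected {rows * cols} zones, got {len(values)}")
    return np.array(values, dtype=np.int16).reshape(rows, cols)

# Synthetic 4x4 example just to illustrate the format
frame = parse_frame(",".join(str(100 + i) for i in range(16)), rows=4, cols=4)
```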
Therefore, I would like to ask:
1. Is there any publicly available VL53L5CX gesture dataset?
2. Has anyone worked on similar projects and would be willing to share experience or methods for data collection?
3. Are there any recommended approaches for feature extraction, preprocessing, or classification algorithms for this kind of gesture data?
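To make question 3 more concrete, this is the kind of preprocessing I currently have in mind (a sketch only, not a validated pipeline): clip each distance frame to the expected gesture range, invert it so closer objects get higher weight, and track a closeness-weighted centroid per frame as a simple feature:

```python
import numpy as np

def preprocess(frame: np.ndarray, near_mm: int = 50, far_mm: int = 400) -> np.ndarray:
    """Clip a distance frame to [near_mm, far_mm] and map it to [0, 1],
    where 1.0 means an object at near_mm (closest) and 0.0 means far_mm or beyond."""
    clipped = np.clip(frame.astype(np.float32), near_mm, far_mm)
    return (far_mm - clipped) / (far_mm - near_mm)

def centroid(weights: np.ndarray) -> tuple:
    """Closeness-weighted (row, col) centroid of one preprocessed frame."""
    total = weights.sum()
    if total == 0:
        # No target in range: fall back to the array centre
        return (weights.shape[0] / 2, weights.shape[1] / 2)
    rows, cols = np.indices(weights.shape)
    return (float((rows * weights).sum() / total),
            float((cols * weights).sum() / total))
```

The idea is that the centroid trajectory over consecutive frames could then feed a classifier, but I would be glad to hear what features others have found to work.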
If anyone in the community is interested in sharing raw data or collaborating on building a dataset, I would be more than happy to work together on this!
Thank you all for your support!
Best regards,
2025-06-26 1:13 AM
Hi @xu_xupt ,
Good to hear you are working with our ToF sensors to detect gestures.
My first question is: which kind of gestures are you targeting? Hand postures (e.g., a flat hand) or motions?
Currently, we have two solutions covering these two kinds of gestures:
- Hand Posture: based on AI, you will find everything (training scripts, models, datasets) here: stm32ai-modelzoo-services/hand_posture at main · STMicroelectronics/stm32ai-modelzoo-services · GitHub
- Hand Gesture (motion): the complete solution is available on ST.com: STSW-IMG035 - Turnkey gesture recognition solution based on VL53L7CX and VL53L8CX multizone Time-of-Flight ranging sensors - STMicroelectronics
One last note: the dataset was captured with the VL53L8CX, so you won't get good results with the VL53L5CX; you will have to collect your own dataset.
Thanks,
2025-06-26 1:26 AM
Thank you very much for your response!
First of all, I'm using the VL53L5CX sensor mainly because of its lower cost, which helps with budget control in my project.
Secondly, my goal is to recognize both static gestures (such as rock, paper, scissors) and dynamic gestures (like swipe up, down, left, right, etc.), aiming to implement around 15 different gesture types in total.
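For the dynamic gestures, my current (untested) idea is to classify the swipe direction from the dominant displacement of the per-frame depth centroid across the capture, roughly like this:

```python
def swipe_direction(centroids: list, min_disp: float = 1.5) -> str:
    """Classify a centroid trajectory (one (row, col) pair per frame) as a swipe.

    Returns 'up', 'down', 'left', or 'right', or 'none' if the total
    displacement is below min_disp zones. Row 0 is taken as the top
    of the 8x8 array, column 0 as the left.
    """
    if len(centroids) < 2:
        return "none"
    dr = centroids[-1][0] - centroids[0][0]  # net vertical motion
    dc = centroids[-1][1] - centroids[0][1]  # net horizontal motion
    if max(abs(dr), abs(dc)) < min_disp:
        return "none"
    if abs(dc) >= abs(dr):
        return "right" if dc > 0 else "left"
    return "down" if dr > 0 else "up"
```

The static gestures (rock, paper, scissors) would need a different approach, probably a small model on the raw 8x8 frames, which is why I'm asking about datasets and training procedures.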
Therefore, I would like to ask if you or others in the community have any experience or suggestions regarding similar applications, especially in the following areas:
1. Is there any publicly available gesture dataset suitable for the VL53L5CX sensor?
2. If I need to collect data myself, are there any recommended procedures or methods for data collection?
3. I only have a NUCLEO-F429ZI board and a VL53L5CX sensor board (not the X-NUCLEO-53L5A1). How can I quickly get started with STSW-IMG035? Specifically, how can I substitute the NUCLEO-F429ZI and VL53L5CX for the NUCLEO-F401RE and X-NUCLEO-53L5A1 setup?
In addition, as you mentioned, the gesture recognition solution was trained on a dataset captured with the VL53L8CX sensor. If I want to implement similar functionality with the VL53L5CX, is it feasible to adapt the solution by collecting my own data and retraining the model? Are there any suggestions or considerations for this?
I'm really looking forward to your further guidance, and I'm also more than happy to exchange experiences with other developers in the community, or even collaborate on building an open-source gesture dataset together!
Once again, thank you for your help!
Best regards,
xu_xupt