How to use STSW-IMG035 Hand Posture?

emilywengster
Associate II

Hello, I'm currently trying to use "STSW-IMG035 Turnkey gesture recognition solution based on VL53L5CX, VL53L7CX and VL53L8CX multizone Time-of-Flight ranging sensors" on my board. I have a NUCLEO-F401RE and an X-NUCLEO-53L8A1 expansion board. I'm very new at this, and I want the Hand Posture application for the VL53L8 sensor to display some output (thumbs up or thumbs down), but I have no idea how to do that. Do I need to load any dataset before I can use it? Any help is appreciated. Thank you!

1 ACCEPTED SOLUTION

Hi all,

It's good to see people wanting to play with AI on our ST ToF sensors.

I hope this answer will give you more than just hints.

 

STSW-IMG035 is an all-in-one package for our gesture solutions. Inside, you will find three kinds of "gestures":

- hand tracking (analytic solution)

- motion gestures (analytic solution)

- hand postures (AI-based solution)

 

Let's focus on our Hand posture ToF AI solution.

In STSW-IMG035, we show an example AI model supporting 7 postures, trained with VL53L8CX data. To be honest, the accuracy is very good, and you can feel the power of AI on ST's low-resolution ToF data.

 

How did we create this model?

Everything is available in the STM32 AI Model Zoo on GitHub: stm32ai-modelzoo/hand_posture at main · STMicroelectronics/stm32ai-modelzoo

When I say everything, it really means everything:

- Python scripts (TensorFlow / Keras / ...)

- ST AI model topology (2D CNN; a rough sketch follows below)

- ST training dataset (used to generate the AI model shipped in STSW-IMG035)
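
To make the "2D CNN on low-resolution ToF frames" idea concrete, here is a minimal Keras sketch. It is an illustration only, not the exact topology shipped in the Model Zoo; the input shape (8, 8, 2) and the layer sizes are assumptions (one 8x8 distance channel plus one 8x8 signal channel, 7 posture classes).

```python
# Minimal illustration: a small 2D CNN for 8x8 multizone ToF frames.
# The real ST topology and hyperparameters live in the stm32ai-modelzoo
# hand_posture scripts; treat this as a sketch, not the shipped model.
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_POSTURES = 7          # e.g. thumbs up, thumbs down, ... (assumption)
INPUT_SHAPE = (8, 8, 2)   # 8x8 zones, distance + signal channels (assumption)

def build_posture_cnn():
    model = models.Sequential([
        layers.Input(shape=INPUT_SHAPE),
        layers.Conv2D(16, kernel_size=3, padding="same", activation="relu"),
        layers.Conv2D(32, kernel_size=3, padding="same", activation="relu"),
        layers.MaxPooling2D(pool_size=2),
        layers.Flatten(),
        layers.Dense(32, activation="relu"),
        layers.Dropout(0.2),
        layers.Dense(NUM_POSTURES, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

model = build_posture_cnn()
model.summary()
```

A network of this size stays tiny once converted for the STM32, but the Model Zoo scripts take care of the exact architecture and the conversion for you.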

The documentation is in the GitHub README, and if you are motivated to read it, I have no doubt you will be able to customize your own AI model.

The ST Model Zoo shares a common Python environment across all use cases; just follow the installation procedure described on the welcome page under "Before you start".

 

Hand posture ToF AI - web interface

We also released a graphical AI tool called Hand Posture ToF AI, which is part of the ST Edge AI Suite.

This web interface directly uses the STM32 AI Model Zoo scripts. It's easy to use, and you can test your model straight from the web page if your board is connected to your computer; just follow the steps.

You can't modify everything as you can with the STM32 AI Model Zoo, but it does let you adjust the basic AI parameters.

The documentation is embedded in the web page.

 

One last piece of information: STSW-IMG035 is also the official data logger for capturing your own data, in case you would like to build your own dataset or add new postures (this is explained in the documentation).
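
As a rough illustration of turning logged captures into a training set, here is a Python sketch. The folder layout, file names, and CSV column order below are assumptions made for the example; the real logger output format and the dataset format expected by the Model Zoo scripts are described in their respective documentation.

```python
# Sketch only: collect logged 8x8 distance/signal frames into arrays
# suitable for training. Folder layout and CSV format are hypothetical.
import glob
import os
import numpy as np

LABELS = ["None", "ThumbsUp", "ThumbsDown"]   # example posture names

def load_recording(csv_path):
    """Assume each row holds 64 distance values followed by 64 signal values."""
    raw = np.atleast_2d(np.loadtxt(csv_path, delimiter=","))
    dist = raw[:, :64].reshape(-1, 8, 8)
    sig = raw[:, 64:128].reshape(-1, 8, 8)
    return np.stack([dist, sig], axis=-1)      # shape (n_frames, 8, 8, 2)

frames, labels = [], []
for label_idx, name in enumerate(LABELS):
    for path in glob.glob(os.path.join("dataset", name, "*.csv")):
        rec = load_recording(path)
        frames.append(rec)
        labels.append(np.full(len(rec), label_idx))

X = np.concatenate(frames)            # (N, 8, 8, 2)
y = np.concatenate(labels)            # (N,)
np.savez("hand_posture_dataset.npz", X=X, y=y)
```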

 

I hope this will help you start your journey into the wonderful world of AI. Thanks to AI, our ToF sensors can do much more than measure distances.



7 REPLIES
John E KVAM
ST Employee

STSW-IMG035 has code for the STM32F401RE which can do gestures.

We define a gesture as a swipe (left, right, up, or down) plus tap and double tap.

This is an algorithmic function based on finding the center of the hand and tracking it.

It is NOT AI.
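
For illustration only (this is not the actual STSW-IMG035 implementation), that centroid idea could be sketched in Python like this; the distance threshold, the travel threshold, and the direction convention are assumptions:

```python
# Illustration only, not the STSW-IMG035 algorithm: estimate the hand
# center on each 8x8 distance frame and derive a swipe from its motion.
import numpy as np

MAX_HAND_DIST_MM = 400          # assumed: zones closer than this are "hand"
MIN_TRAVEL_ZONES = 2.0          # assumed: minimum centroid travel for a swipe

def hand_center(distance_frame):
    """Return (row, col) centroid of zones likely covered by the hand, or None."""
    mask = (distance_frame > 0) & (distance_frame < MAX_HAND_DIST_MM)
    if not mask.any():
        return None
    rows, cols = np.nonzero(mask)
    return rows.mean(), cols.mean()

def classify_swipe(centroids):
    """Given the centroid track of one pass of the hand, name the swipe."""
    track = [c for c in centroids if c is not None]
    if len(track) < 2:
        return "none"
    d_row = track[-1][0] - track[0][0]
    d_col = track[-1][1] - track[0][1]
    if max(abs(d_row), abs(d_col)) < MIN_TRAVEL_ZONES:
        return "tap?"            # little lateral motion; tap logic would go here
    if abs(d_col) > abs(d_row):
        return "right" if d_col > 0 else "left"
    return "down" if d_row > 0 else "up"

# Example with synthetic frames: a hand patch moving across the zones.
frames = []
for col in range(8):
    f = np.full((8, 8), 1500.0)                 # background at 1.5 m
    f[3:5, max(col - 1, 0):col + 1] = 250.0     # "hand" patch at 25 cm
    frames.append(f)
print(classify_swipe([hand_center(f) for f in frames]))   # -> "right"
```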

We have built demos where we tried postures such as thumbs-up, thumbs-down, halt, OK, and a few others.

And one can do it with AI.

Create a bunch of datasets using the distance data and the signal strength, then fire up your AI engine.
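
As a minimal sketch of that last step, assuming the logged frames have already been collected into an array of shape (N, 8, 8, 2) with distance and signal channels plus integer posture labels (the file name and normalization constants are assumptions):

```python
# Sketch: train a classifier on distance + signal frames ("fire up the AI engine").
# X is assumed to be shaped (N, 8, 8, 2); y holds integer posture labels.
import numpy as np
import tensorflow as tf

data = np.load("hand_posture_dataset.npz")        # hypothetical logged dataset
X = data["X"].astype("float32")
y = data["y"]
X[..., 0] /= 4000.0                               # crude distance scaling (mm)
X[..., 1] /= np.maximum(X[..., 1].max(), 1.0)     # crude signal scaling

# Any small classifier works for a first experiment; a 2D CNN like the one
# sketched earlier in this thread is a natural choice.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=X.shape[1:]),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(int(y.max()) + 1, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(X, y, validation_split=0.2, epochs=30, batch_size=32)
```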
- john

 


If this or any post solves your issue, please mark it as 'Accept as Solution'. It really helps. And if you notice anything wrong, do not hesitate to 'Report Inappropriate Content'. Someone will review it.

Hello John, 

Thank you for your reply. Could you give us any hints on what AI approaches we should try to get hand posture running? May I also ask whether what is achieved is static gesture recognition? I am currently encountering challenges with dynamic gesture recognition. Thank you!

 

-Emily

We are exploring AI technology to address dynamic gesture recognition using your 3D sensor. Could you provide some insights into which AI model you utilized for your AI-based solution?

 

 


Like you, we are also exploring AI technology.

Please review ST employee Labussiy's response, which provides valuable information.

Thank you so much, I appreciate the response. It was very informative and I look forward to trying out this method.

 

-Emily

Thanks for the excellent information.