Getting started with head gestures on ST AIoT Craft

Denise SANFILIPPO
ST Employee

Introduction

Head gestures, such as nodding, shaking, and other general head movements, are essential for various applications, including safety helmets, VR headsets, and other head-worn gear. Recognizing these gestures accurately can significantly enhance user experience and functionality in these applications.
STMicroelectronics offers a solution for head gesture recognition using the machine learning core (MLC) integrated into the LSM6DSV16X smart MEMS sensor, available on the SensorTile.box PRO board kit. If you prefer a wireless setup, you can use the ST AIoT Craft mobile app, available on both the Google Play Store and the Apple App Store.

1. Getting started

To begin, click on the "Head Gestures" project available inside the Project examples. A pop-up window appears with a brief description, as shown in Figure 1.
Next, click on the three vertical dots next to the [Try out] button and select [View details].

Figure 1: Head Gestures project

This reveals that the project uses a single AI model named head_gestures, targeting the SensorTile.box PRO and the LSM6DSV16X IMU with its machine learning core. The model classifies data into three categories: nod, shake, and other (a category covering all other possible gestures).
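Under the hood, the machine learning core reports its classification as a small integer code in one of its output registers. The exact codes for the head_gestures decision tree are fixed when the model is generated and ship with the project's sensor configuration, so the values in the short C sketch below are placeholders for illustration only.

/* Hypothetical mapping between the MLC output code and the three
 * head_gestures categories. The real codes come from the project's
 * sensor configuration; the values below are placeholders only. */
#define HG_CLASS_OTHER  0U  /* any movement other than nod or shake */
#define HG_CLASS_NOD    4U  /* placeholder code for "nod" */
#define HG_CLASS_SHAKE  8U  /* placeholder code for "shake" */

static const char *head_gesture_label(unsigned int mlc_code)
{
  switch (mlc_code) {
  case HG_CLASS_NOD:   return "nod";
  case HG_CLASS_SHAKE: return "shake";
  default:             return "other";  /* HG_CLASS_OTHER and anything else */
  }
}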

Figure 2: Head Gestures: a ready-to-run use case

 

You can evaluate the AI model either through the web application or via the mobile app. If you choose the mobile app, QR codes for installation are provided.

Figure 3: Choose your environment: either Web browser or ST AIoT Craft mobile app

 

For now, let us focus on evaluating the AI model directly on the web application using the SensorTile.box PRO.

1.1 Firmware update and connection

  • Update the firmware: Connect the SensorTile.box PRO to your PC using a USB Type-C® cable. Turn on the board in DFU mode by holding down user button 2 (BT2) and powering on the board via the power switch (SW) (see Figure 4).
Figure 4: Turn on the board in DFU mode

 

  • Flash the firmware: Click the [Flash firmware] button to download and stream the firmware binary to the board. Ensure that your browser detects the connected SensorTile.box PRO. Once the firmware update is complete, establish a connection with the board by clicking the yellow [Connect device] button.
  • Enable sensors: The AI model is then downloaded and programmed onto the SensorTile.box PRO, and the accelerometer is configured as follows (see the sketch after this list):
    • ODR = 30 Hz
    • FS = 4 g
    • Machine learning core enabled
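The web application applies these settings automatically when it programs the sensor. If you later want to reproduce the same configuration in your own firmware, a minimal sketch using ST's open-source lsm6dsv16x C driver could look like the following; the platform_write/platform_read callbacks are assumed to come from your board support code, and the function and enum names follow the driver's usual conventions, so verify them against lsm6dsv16x_reg.h.

/* Minimal sketch (not the ST AIoT Craft firmware): configure the
 * LSM6DSV16X accelerometer for the head-gestures model using ST's
 * platform-independent lsm6dsv16x C driver. */
#include "lsm6dsv16x_reg.h"

/* Bus read/write callbacks (I2C or SPI) provided by your BSP. */
extern int32_t platform_write(void *handle, uint8_t reg,
                              const uint8_t *data, uint16_t len);
extern int32_t platform_read(void *handle, uint8_t reg,
                             uint8_t *data, uint16_t len);

void head_gestures_sensor_config(void *bus_handle)
{
  stmdev_ctx_t dev_ctx = {
    .write_reg = platform_write,
    .read_reg  = platform_read,
    .handle    = bus_handle,
  };

  /* Accelerometer output data rate: 30 Hz */
  lsm6dsv16x_xl_data_rate_set(&dev_ctx, LSM6DSV16X_ODR_AT_30Hz);

  /* Accelerometer full scale: +/-4 g */
  lsm6dsv16x_xl_full_scale_set(&dev_ctx, LSM6DSV16X_4g);

  /* Enable the machine learning core; the decision tree itself is
   * programmed separately by the project's sensor configuration. */
  lsm6dsv16x_mlc_set(&dev_ctx, LSM6DSV16X_MLC_ON);
}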

1.2 Evaluation phase

Click the [Start] button to begin evaluating the AI model. The SensorTile.box PRO recognizes the following gestures based on its movement:

  • Nod
  • Shake
  • Other (all other possible head gestures)

To end the evaluation, click the [Stop] button.
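During the evaluation the board reports these classifications to the web application. If you would rather read them directly from the sensor in your own firmware, a minimal polling sketch is shown below; the lsm6dsv16x_mlc_out_get() call and the layout of its output structure are assumptions based on ST's MLC-enabled driver conventions, and the class codes are the same placeholders used in the earlier mapping sketch, so check both against lsm6dsv16x_reg.h and the project's configuration.

/* Minimal polling sketch: read the current head-gestures classification
 * from the machine learning core. Function and field names follow ST's
 * driver conventions but are not verified here. */
#include <stdio.h>
#include "lsm6dsv16x_reg.h"

#define HG_CLASS_NOD    4U  /* placeholder, see the earlier mapping sketch */
#define HG_CLASS_SHAKE  8U  /* placeholder */

void head_gestures_poll(stmdev_ctx_t *dev_ctx)
{
  lsm6dsv16x_mlc_out_t mlc_out;

  for (;;) {
    /* MLC1 is assumed to hold the head_gestures decision-tree result. */
    lsm6dsv16x_mlc_out_get(dev_ctx, &mlc_out);

    switch (mlc_out.mlc1_src) {
    case HG_CLASS_NOD:   printf("Nod\n");   break;
    case HG_CLASS_SHAKE: printf("Shake\n"); break;
    default:             printf("Other\n"); break;
    }

    /* Add a platform-specific delay here (e.g., ~100 ms) so the loop
     * does not poll faster than the 30 Hz accelerometer data rate. */
  }
}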

Conclusion

You can effectively classify head gestures using the "Head Gestures" project example on the ST AIoT Craft platform: share your experience!

Create your own solution by exploring the "My datasets" and "My Projects" sections.

Figure 5: Explore My datasets and My Projects sections

 

 Stay tuned for more insights on how to leverage ST AIoT Craft for your projects!

