
How to quickly design animated avatars for the Metaverse based on ST’s smart sensors - Part 1

Luca SEGHIZZI
Easily create your proof of concept with ST’s intelligent sensors and explore key building blocks for reliable, low-latency avatars. In this knowledge article you will learn how to successfully transfer real movements onto your avatar for a seamless experience in the Metaverse. You will be able to create your own PoC with AI algorithms using AlgoBuilder, an easy-to-use GUI that lets you program edge functionalities and advanced algorithms without writing any code.

What will you learn by reading this knowledge article?

This knowledge article, the first of two, explains how to create an MLC algorithm with AlgoBuilder and the sensortile.box to detect two classes: a hat in a steady position, and a person wearing the hat.
The next knowledge article explains how to use this MLC algorithm to activate head-tracking movements, based on sensor fusion algorithms, only when the user has the hat on.

 

Before starting, what do you need?

Software

The software tools used in this knowledge article are:
  • AlgoBuilder – a graphical user interface for Linux, macOS, and Windows that lets you build complex AI algorithms and use the edge functionalities of our MEMS sensors without writing a single line of code
  • Unicleo-GUI – a graphical user interface for Linux, macOS, and Windows that displays the output of the AlgoBuilder firmware in 2D and 3D
  • Unico-GUI – a graphical user interface for Linux, macOS, and Windows that lets you quickly create a Machine Learning Core algorithm (.UCF file)

Hardware

An ST hat and the sensortile.box are used in this tutorial to configure and develop an MLC algorithm on the LSM6DSOX 6-axis MEMS sensor, while the sensor fusion algorithm runs on the MCU of the sensortile.box.
In this knowledge article, the board is used in DFU mode. To enable this mode, first disconnect the board from the power supply.


 
1123.png
 

Then, press the “Boot” button while connecting the USB cable to the PC:
 

1124.png
 

Setup AlgoBuilder

  • Configuration

As a first step, launch the AlgoBuilder program, go to the Application Settings, and link the .exe files for:

  • Unicleo-GUI

  • STM32CubeIDE

  • STM32CubeProgrammer

  • Unico-GUI

 
1125.png
 
  • Project Settings

Then, open the firmware settings and select the project location; the toolchain is selected automatically. Finally, select the target board to flash the firmware.
The target for this demo is the sensortile.box.

 
 
1126.png

How to Create the .UCF file

The goal of this chapter is to show how AlgoBuilder can be used to acquire data logs from both the accelerometer (AXL) and the gyroscope, and to use them to create the .UCF file that detects two classes:

  • The Steady position, meaning the hat is lying on a table or another surface

  • The Hat_On position, meaning someone is wearing the hat

The first building block to use is the Sensor Hub block.
Set the properties of the AXL and the gyroscope as follows (the sketch after this list shows what these settings mean for the raw data):

  • 104 Hz data rate for both AXL and gyroscope

  • Accelerometer full scale: 16 g

  • Gyroscope full scale: 500 dps
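
With these full scales, the LSM6DSOX sensitivities are 0.488 mg/LSB (±16 g) and 17.5 mdps/LSB (±500 dps). A minimal Python sketch of what that means for the raw samples (the helper names are illustrative, not part of any ST tool):

```python
# Convert raw LSM6DSOX samples to physical units for the full scales above.
# Sensitivities from the LSM6DSOX datasheet: +/-16 g -> 0.488 mg/LSB,
# +/-500 dps -> 17.5 mdps/LSB.

ACC_SENS_16G_MG = 0.488      # mg per LSB at +/-16 g full scale
GYR_SENS_500DPS_MDPS = 17.5  # mdps per LSB at +/-500 dps full scale

def raw_to_g(raw_lsb: int) -> float:
    """Accelerometer: raw 16-bit two's-complement sample -> acceleration in g."""
    return raw_lsb * ACC_SENS_16G_MG / 1000.0

def raw_to_dps(raw_lsb: int) -> float:
    """Gyroscope: raw 16-bit two's-complement sample -> angular rate in dps."""
    return raw_lsb * GYR_SENS_500DPS_MDPS / 1000.0

# Example: a raw accelerometer reading of 2048 LSB is ~1 g at +/-16 g full scale.
print(raw_to_g(2048))    # ~0.999 g
print(raw_to_dps(1000))  # 17.5 dps
```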

 
1127.png
 

Since the target is to acquire the data logs of both AXL and gyroscope, the respective blocks need to be added.
 

1128.png
 

Once this is done, it is possible to build the firmware and upload it to the board by clicking on the icon below.

1130.png

Once the firmware has been flashed to the sensortile.box, open Unicleo-GUI by clicking on its icon:

1132.png

The SW opens; click on Start and select the two graph icons and the data log icon, as in the image below:

1134.png

It is possible to:

  1. Check the angular rate and the acceleration data displayed with their respective axes

  2. Generate data logs of the angular rate, the acceleration, or both; the data is saved as .CSV files in the selected folder (see the loading sketch below)
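
The Unicleo-GUI logs are plain .CSV files, so they can be inspected before being fed to Unico-GUI. A minimal sketch, assuming pandas is available; the file name and column labels are assumptions for illustration, so check the header of your own log:

```python
import pandas as pd

# Load a Unicleo-GUI data log (plain CSV). The file name and the column
# labels below are assumptions for illustration; check your own log header.
log = pd.read_csv("steady_log.csv")
print(log.columns.tolist())   # inspect the actual column names

# Assuming accelerometer columns "A_X [mg]", "A_Y [mg]", "A_Z [mg]":
acc = log[["A_X [mg]", "A_Y [mg]", "A_Z [mg]"]]
print(acc.describe())         # quick sanity check of the recording
```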


Two classes have been recorded for identification:
The Steady position, acquiring AXL and gyroscope data while the hat is lying on the table.


1136.png

And the Hat_On position, acquiring AXL and gyroscope data while a person is wearing the hat.


1138.png
 

To create the .UCF file, one more block needs to be added in AlgoBuilder: the MLC/FSM block.


1140.png

The next step is to import the acquired data logs into the MLC tool of Unico-GUI. In the "Data Patterns" tab of the MLC tool, click on the "Browse" button to select the corresponding data log files (multiple files can be selected simultaneously for the same class).
 

1141.png

Assign a label to every imported data log, which is the class associated with that data.
Once all the logs have been correctly imported, click on the “Configuration” tab and proceed with the creation of the decision tree and the configuration file.
The configuration for the LSM6DSOX is:

  • 104 Hz data rate for the Machine Learning Core (the same data rate used to acquire the data logs)

  • Accelerometer and gyroscope full scales matching the parameters used to acquire the data logs:

    • 16 g full scale

    • 500 dps full scale


 

1143.png
 

Then proceed with the configuration by selecting 1 decision tree and a window length of 104 samples. The window parameter specifies the number of samples the Machine Learning Core processes each time it identifies a class. It can range from 1 to 255, and the latency of the MLC results depends on this number. In this case, with a 104 Hz data rate, a window of 104 samples gives an equivalent time window of 1 second.
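
The relation between window length, data rate, and decision latency is simple arithmetic; a quick sanity check in Python:

```python
# One MLC decision is produced every window_samples / data_rate seconds.
DATA_RATE_HZ = 104

for window_samples in (26, 52, 104, 255):
    latency_s = window_samples / DATA_RATE_HZ
    print(f"{window_samples:3d} samples -> {latency_s:.2f} s per decision")
# 104 samples at 104 Hz -> 1.00 s, the value used in this example.
```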
 

1145.png
 

The filter configuration part is skipped for this example, which means no filters are applied to the input data.
In the features configuration section, you can select which features to use when building the decision tree.
Since the data logs showed clear variations on the accelerometer and gyroscope axes, features such as mean, variance, energy, and peak2peak can be selected on these axes, and also on the accelerometer norm; the sketch below shows how such features can be computed.
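
To build an intuition for these features, the following Python sketch computes them over one 104-sample window. The file name and column labels are assumptions, and the exact feature definitions used by the Machine Learning Core are given in the LSM6DSOX application note:

```python
import numpy as np
import pandas as pd

WINDOW = 104  # samples, matching the MLC window length chosen above

def window_features(w: np.ndarray) -> dict:
    """Features of one window of a single MLC input (an axis or the norm)."""
    return {
        "mean": w.mean(),
        "variance": w.var(),
        "energy": np.sum(w ** 2),
        "peak2peak": w.max() - w.min(),
    }

log = pd.read_csv("hat_on_log.csv")         # assumed file name
ax = log["A_X [mg]"].to_numpy(dtype=float)  # assumed column label
norm = np.sqrt(
    log["A_X [mg]"] ** 2 + log["A_Y [mg]"] ** 2 + log["A_Z [mg]"] ** 2
).to_numpy(dtype=float)

# Features of the first full window, on one axis and on the accelerometer norm
print(window_features(ax[:WINDOW]))
print(window_features(norm[:WINDOW]))
```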

 

1147.png

1149.png
 

Step 3 of the MLC configuration procedure builds the decision tree, which uses some of the features defined previously. Click on the Generate button, or alternatively import a decision tree generated by an external tool. The meta-classifier, an optional filter on the decision tree results, can be skipped for this example. Finally, the UCF file with the Machine Learning Core configuration for the LSM6DSOX can be generated, embedding the decision tree in the sensor configuration.
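
Unico-GUI generates the decision tree for you; conceptually, this is ordinary decision tree learning on the per-window features. A rough scikit-learn illustration of the idea (placeholder feature values, not the tree Unico actually embeds):

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

FEATURES = ["mean", "variance", "energy", "peak2peak"]

# One row of features per window, one label per window. The numbers here are
# placeholders; in practice, build X from the data log windows as sketched
# earlier, and use the labels assigned in Unico-GUI (0 = Steady, 4 = Hat_On).
X = np.array([
    [1000.0,   2.0, 1.04e8,  15.0],  # a steady-like window
    [ 980.0, 900.0, 1.10e8, 400.0],  # a hat-on-like window
])
y = np.array([0, 4])

tree = DecisionTreeClassifier(max_depth=4).fit(X, y)
print(export_text(tree, feature_names=FEATURES))
```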
Finally, by clicking on Next, save the .UCF file to be loaded into the sensortile.box.
After closing Unico-GUI, AlgoBuilder can be used to upload the configuration and verify that the ML algorithm works properly.
To do so, one more block needs to be added in AlgoBuilder: the Value block.

 

1150.png

Since only one value needs to be displayed, set the following parameters on this block:

  • Data type: Custom

  • Number of values: 1

  • Window name: Hat_Config

  • Value 1 name: Hat_On

It is now possible to build the firmware again and flash it to the sensortile.box. Opening Unicleo-GUI and clicking on "Start", the result of the implementation can be checked immediately:

  • If the hat is on the table, the MLC output is 0


1152.png
 

  • If the hat is on the head, then the MLC output is 4


1154.png
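
As a side note, the 0 and 4 above are the values of the LSM6DSOX MLC0_SRC register (0x70), which holds the result of the first decision tree. On a Linux host wired directly to an LSM6DSOX over I2C, it could be polled as sketched below; this is only an assumption-laden illustration, since the sensortile.box itself is read through Unicleo-GUI:

```python
import time
from smbus2 import SMBus

LSM6DSOX_ADDR = 0x6A    # 7-bit I2C address (0x6B if SDO/SA0 is tied high)
FUNC_CFG_ACCESS = 0x01  # register bank switch
MLC0_SRC = 0x70         # result of decision tree 1 (embedded functions bank)

def read_mlc0(bus: SMBus) -> int:
    """Read the MLC0_SRC result, switching register banks around the access."""
    bus.write_byte_data(LSM6DSOX_ADDR, FUNC_CFG_ACCESS, 0x80)  # embedded bank
    result = bus.read_byte_data(LSM6DSOX_ADDR, MLC0_SRC)
    bus.write_byte_data(LSM6DSOX_ADDR, FUNC_CFG_ACCESS, 0x00)  # back to user bank
    return result

# Assumes the .UCF configuration has already been written to the sensor.
with SMBus(1) as bus:
    while True:
        value = read_mlc0(bus)
        print({0: "Steady", 4: "Hat_On"}.get(value, f"unknown ({value})"))
        time.sleep(1.0)  # one MLC decision per second with a 104-sample window
```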
 

Conclusions

We have seen how to detect two different classes using the MLC in the LSM6DSOX, by visually programming the sensortile.box with AlgoBuilder to acquire the data logs and then to create and implement the MLC configuration.

If you are interested in the same topic but detecting motion intensity instead, check this article on GitHub, which explains how to use AlgoBuilder to detect three different classes of motion intensity.
Moreover, please check the following link for a complete webinar on the topic.
