How to quickly design animated avatars for the Metaverse based on ST’s smart sensors, Part 2

Luca SEGHIZZI
Easily create your proof of concept with ST’s intelligent sensors and explore key building blocks to enable reliable, low-latency avatars. In this knowledge article you will understand how to successfully transfer real movements onto your avatar for a seamless experience in the Metaverse. You will be able to create your own PoC with AI algorithms using AlgoBuilder, which includes an easy-to-use GUI that lets you program edge functionalities and advanced algorithms without writing any code.

What are we going to build?

In this knowledge article, the second of two, we explain how to embed the MLC algorithm created in the previous article into a more complex algorithm built with AlgoBuilder’s visual programming, so that sensor fusion, and hence head tracking, is activated only when a given condition is verified.

Before starting

Software

The software tools that we will use in this knowledge article are:
  • AlgoBuilder – a graphical user interface for Linux, macOS, and Windows that allows you to build complex AI algorithms and use the edge functionalities of our MEMS sensors without writing any line of code
  • Unicleo-GUI – a graphical user interface for Linux, macOS, and Windows that allows you to display the output of the AlgoBuilder firmware in 2D and 3D
  • Unico-GUI – a graphical user interface for Linux, macOS, and Windows that allows you to quickly create a Machine Learning Core algorithm (.UCF file)

Hardware

An ST hat and the SensorTile.box will be used in this tutorial to configure and develop an MLC algorithm with the LSM6DSOX 6-axis MEMS sensor, while the sensor fusion algorithm will run on the MCU of the SensorTile.box, the STM32L4R9ZI.
In this knowledge article, the board will be used in DFU mode. To enable this mode, first disconnect the board from the power supply:
1129.png

Then press the “Boot” button while connecting the USB cable to the PC:
 
1131.png
 

Set up AlgoBuilder

1. Configuration

As a first step, launch the AlgoBuilder program, go to Application Settings, and link the .exe files for:
  • Unicleo-GUI
  • STM32CubeIDE
  • STM32CubeProgrammer
  • Unico-GUI
1133.png
 

2. Project settings

Then, open the firmware settings and select the project location; the toolchain will be selected automatically. Finally, select the target board on which to flash the firmware.
The target for this demo is the SensorTile.box.
 
1135.png
 

How to build a sensor fusion algorithm that is triggered only when an MLC condition is verified

The target of this chapter is to show how AlgoBuilder can be used to build a complex algorithm that enables the Quaternion 9X only when the MLC recognizes that the hat position is Hat_On:
  • When the hat is steady, the quaternion does not stream any data
  • When the hat is on someone’s head, the quaternion data start streaming and sensor fusion is performed to track the head
The first building block to use is the Sensor Hub block.
1137.png

It is possible to change the properties of the accelerometer (AXL) and the gyroscope (a firmware-level sketch of these settings follows the list):
  • 104Hz for both AXL and Gyro
  • Accelerometer full scale: 16g
  • Gyroscope full scale: 500dps
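
For reference, the same output data rate and full-scale settings can also be expressed in firmware. The sketch below is only illustrative and assumes ST’s platform-independent LSM6DSOX driver (lsm6dsox_reg.h); AlgoBuilder generates this kind of configuration for you, and the initialization of the dev_ctx handle with the bus callbacks is omitted here.

#include "lsm6dsox_reg.h"   /* ST platform-independent LSM6DSOX driver */

extern stmdev_ctx_t dev_ctx;   /* initialized elsewhere with the bus read/write callbacks */

static void sensor_hub_config(void)
{
  /* 104 Hz output data rate for both accelerometer and gyroscope */
  lsm6dsox_xl_data_rate_set(&dev_ctx, LSM6DSOX_XL_ODR_104Hz);
  lsm6dsox_gy_data_rate_set(&dev_ctx, LSM6DSOX_GY_ODR_104Hz);

  /* Accelerometer full scale: 16 g; gyroscope full scale: 500 dps */
  lsm6dsox_xl_full_scale_set(&dev_ctx, LSM6DSOX_16g);
  lsm6dsox_gy_full_scale_set(&dev_ctx, LSM6DSOX_500dps);
}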

Since our target is to get Quaternion 9X data and display them with a 3D face, two blocks need to be brought into AlgoBuilder:
1139.png
 

The sensor fusion block has different models to display 3D elements: a teapot, a car, a Nucleo board, and a head. For this knowledge article, dedicated to avatars in the Metaverse, the head is the model selected.
The next step is to take the .UCF file and embed it in the algorithm. To do so, select the MLC/FSM block and click on Load .UCF file.
 
1142.png

The data from the Sensor Hub will be analyzed by the MLC configuration (.UCF file) we generated.
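
As a side note, a .UCF file is essentially a list of register address/value pairs that program the Machine Learning Core. Outside AlgoBuilder, the same configuration could be applied by replaying those pairs one by one, as in the sketch below; the ucf_line_t layout mirrors the C header that Unico can export from a .UCF, and the hat_on_conf array name is a hypothetical placeholder.

#include "lsm6dsox_reg.h"   /* ST platform-independent LSM6DSOX driver */

/* Address/data pair layout mirroring the C header that Unico can export
 * from a .UCF file; hat_on_conf and hat_on_conf_len are hypothetical names. */
typedef struct {
  uint8_t address;
  uint8_t data;
} ucf_line_t;

extern const ucf_line_t hat_on_conf[];
extern const uint32_t   hat_on_conf_len;

extern stmdev_ctx_t dev_ctx;   /* initialized elsewhere with the bus callbacks */

/* Apply the MLC configuration by replaying every register write of the UCF. */
static void mlc_load_ucf(void)
{
  for (uint32_t i = 0; i < hat_on_conf_len; i++) {
    uint8_t value = hat_on_conf[i].data;
    lsm6dsox_write_reg(&dev_ctx, hat_on_conf[i].address, &value, 1);
  }
}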
Since the goal is to activate the 9X quaternion when the MLC detects the Hat_On position, some logic blocks need to be inserted here.
First, add an integer block and set its value to 4. In the previous knowledge article we defined two classes: class 0 for when the hat is steady and class 4 for when the hat is on someone’s head.
An “Equal” block then compares the MLC output with this integer value: the logic returns 0 if the MLC output does not match the value 4, and returns 1 if, on the contrary, the MLC output and the integer value match, i.e. they are both 4.
 
1144.png

To check that the logic works correctly, two different integer outputs need to be displayed: the one from the MLC and the one from the Equal block.
The next block to use is the Mux [2] block, which takes two values as input.
 
1146.png

Finally, the display is done with the Value block, where a few parameters need to be set:
  • Data Type: Custom
  • Number of Values: 2
  • Window name: Status
  • Value Name 1: Fusion On-Off
  • Value Name 2: Hat On_Off
In this way, a status display will appear highlighting whether the hat is on or off and whether the sensor fusion is running.
Finally, it is needed to connect the output of the Equal block to the Quaternion 9X block: if the hat-on condition is verified and the Equal block output is 1, then the quaternion is activated and the fusion will display the avatar moving along with the head movements.
 
1148.png
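
In firmware terms, the Integer + Equal + Quaternion 9X wiring boils down to a few lines of logic. The sketch below is purely illustrative: run_sensor_fusion_step() is a hypothetical placeholder for the 9X fusion step (e.g., the quaternion computation) that the generated firmware actually performs.

#include <stdint.h>

#define HAT_ON_CLASS  4u   /* class 4 = hat on someone's head, class 0 = hat steady */

/* Hypothetical placeholder for the 9X sensor fusion step performed by the
 * generated firmware. */
extern void run_sensor_fusion_step(void);

/* Equivalent of the Integer (4) + Equal blocks: returns 1 when the MLC
 * output matches the Hat_On class, 0 otherwise. */
static uint8_t hat_on_detected(uint8_t mlc_output)
{
  return (mlc_output == HAT_ON_CLASS) ? 1u : 0u;
}

/* The Equal output drives the Quaternion 9X block: fusion, and hence
 * head tracking, runs only while the hat is on. */
void algo_step(uint8_t mlc_output)
{
  if (hat_on_detected(mlc_output) == 1u) {
    run_sensor_fusion_step();
  }
}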

The very last step, which is not mandatory but can be useful, is to add one more block to the algorithm to switch on and off the LED present on the SensorTile.box.
For that, it is needed to write one line of code in .xml format and add it to the AlgoBuilder libraries.
 
1151.png

The code says that if the Equal output is equal to 1, then LED 2 turns on; otherwise LED 2 is off. In this case, LED 2 is the green LED.
 
1153.png
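
For illustration, the same rule expressed in C rather than in the AlgoBuilder .xml extension would look like the sketch below; BSP_LED_On/BSP_LED_Off and LED_GREEN are assumed, BSP-style names and not necessarily the exact symbols used by the SensorTile.box package.

#include <stdint.h>

/* Assumed BSP-style LED helpers; the exact names for the SensorTile.box
 * green LED may differ in the actual board support package. */
typedef enum { LED_GREEN = 0 } Led_TypeDef;
extern void BSP_LED_On(Led_TypeDef led);
extern void BSP_LED_Off(Led_TypeDef led);

/* Same rule as the .xml one-liner: Equal output == 1 -> LED 2 (green) on,
 * otherwise off. */
void update_status_led(uint8_t equal_output)
{
  if (equal_output == 1u) {
    BSP_LED_On(LED_GREEN);
  } else {
    BSP_LED_Off(LED_GREEN);
  }
}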

Once this is implemented, it is possible to build the firmware and then flash it onto the SensorTile.box as shown in the previous part (link).
Finally, opening Unicleo-GUI, it is possible to check the output of our algorithm:
  1. If the hat is on the table:
    1. The hat status is 0; the fusion is not running and its value is 0 as well.
    2. The green LED is off and the avatar head is not moving.
1155.png
 
  2. If the hat is on someone’s head:
    1. The hat status is 4 and the fusion is 1: sensor fusion is computed and the head now moves following the wearer’s head movements.
    2. The green LED is now on.
1156.png

The fusion window has a reset model button, which is very useful since it allows self-calibration of the model with respect to the sensor position.
For instance, it is possible to have the face parallel to the screen:
 
1157.png

Or the avatar will move with the back of the head visible:
 
1158.png

Or the avatar with the face in front of the screen:
 
1159.png
 

Conclusions

In this knowledge article we have shown how to use the UCF configuration generated in the previous knowledge article to trigger the Quaternion 9X data and allow the movement of the head with sensor fusion algorithms only when a specific class is detected, in this case the Hat_On class.
If you are interested in the same topic but applied to motion intensity, you should check this article on GitHub, where it is explained how AlgoBuilder can be used to detect three different classes of motion intensity.
Moreover, please check the following link for a complete webinar on the topic.
 
 
 