2024-12-05 08:00 AM
This article shows how to get started with the ST AIoT Craft tool to develop your IoT solution based on the advanced edge AI features of ST smart sensors.
This new tool was created to make programming ST smart sensors with AI smoother, easier, and more user-friendly.
The ST AIoT Craft tool lets you experiment with the advanced edge AI features integrated in ST smart sensors.
Get started by learning how to program the machine learning core (MLC) engine that is embedded in many MEMS inertial sensors and IMUs: acceleration (or vibration) and rotation patterns can be classified while the microcontroller is in sleep mode, saving energy.
ST AIoT Craft collects labeled sensor data and uses it to train the MLC engine. The result can be validated using ST evaluation boards, either from the web portal (we recommend using Chrome) or from the companion mobile app. You do not need to download ST AIoT Craft locally on your PC; you can run it directly in your browser.
Let us see how the ST AIoT Craft tool works.
To do that, start by evaluating one of the available project examples.
Access the tool by clicking on the following link: https://staiotcraft.st.com/index.html.
The [Readme] button in the top-right corner gives access to the ST AIoT Craft documentation pages, which include a list of examples.
Coming back to the homepage, click on the [Try it out] button to enter the tool, as shown in Figure 1.
Note: at this point, you are not yet logged in to the web application, as login is not required yet.
You can find practical project examples prepared by ST, covering industrial use cases and entertainment use cases.
Click on the Smart Asset Tracking project example. A pop-up window appears, providing you with a short description.
Click on the three vertical dots on the right side of the [Try out] button. Then click on [View details].
You can see that in this project example, only one AI model, called smart_asset_tracking, is developed. The target of this AI model is the SensorTile.box PRO kit and the LSM6DSV16X IMU with machine learning core.
The model is a decision tree, an AI model that must be trained on data. It is a classifier trained on four classes: stationary_upright, stationary_not_upright, motion, and shaken.
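To give an intuition of what such a decision tree does, it can be pictured as a small set of threshold comparisons on features computed from the accelerometer signal. The sketch below is purely illustrative: the feature names and thresholds are invented for this example, not taken from the actual trained model.

```python
# Illustrative sketch of how an MLC-style decision tree might separate
# the four asset-tracking states. Features and thresholds are invented;
# the real tree is generated automatically by the training process.

def classify(mean_z_g: float, variance_g2: float) -> str:
    """Classify a window of accelerometer data from two example features:
    the mean of the Z axis (in g) and the signal variance (in g^2)."""
    if variance_g2 > 0.5:          # strong, irregular motion
        return "shaken"
    if variance_g2 > 0.05:         # moderate motion
        return "motion"
    # quasi-static: use the gravity direction to tell the orientation
    if mean_z_g > 0.8:             # Z axis roughly aligned with gravity
        return "stationary_upright"
    return "stationary_not_upright"

# Example: a still, upright board reads about 1 g on Z with low variance
print(classify(0.98, 0.001))  # stationary_upright
```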
You can evaluate this AI model either on the web with the web application, or continue the journey with the mobile app.
If you select the mobile app journey experience, some QR codes are displayed. In particular, on the right, you can find the installation QR code of the mobile app.
If you are interested in exploring ST AIoT Craft mobile app, a dedicated article is coming soon.
Let us focus on evaluating this AI model directly on the web with the web application.
Consider the SensorTile.box PRO as the board used throughout this hands-on session.
You need to flash the firmware compatible with this web application to evaluate the AI model. The same firmware is used later on, for the data logging phase to create a dataset.
As a first step, update the firmware of your SensorTile.box PRO by clicking on [Update the firmware].
Here are listed three steps that you need to follow:
1. Connect the SensorTile.box PRO to the PC through a USB Type-C® cable.
2. Turn on the board in DFU mode to flash the firmware: hold down user button 2 and power on the board through the power switch of the SensorTile.box PRO, as shown in the figure below:
3. Once the board is recognized as “connected” in DFU mode, flash the FP-SNS-DATALOG2_Datalog2 firmware on the device by clicking the [Flash firmware] button in the bottom-right corner.
The web application now downloads the firmware binary and streams its content to the board.
Your browser should already detect your connected SensorTile.box PRO; if it does not, select the board manually. Once the firmware update is finished, you can establish the connection with the board.
Click on the [Connect device] button located in the bottom-right part of the screen.
When the connection has been established, the AI model is downloaded and sent to the SensorTile.box PRO, where the firmware application programs the sensors accordingly.
In particular, as you can see, the accelerometer has been enabled with these particular settings:
The data coming from the accelerometer is fed to the machine learning core, which classifies the data into the four classes: shaken, motion, stationary_not_upright, stationary_upright.
Click on the [Start] button.
The states are recognized based on the placement and movement of your SensorTile.box PRO.
By clicking the [Stop] button, you terminate the evaluation of this AI model. The following view appears:
Now, you can decide to continue with exploring some other compatible projects in the section [Explore other projects], such as "Fan Coil Monitoring" and "Drilling Machine".
Otherwise, you can clone the project in your workspace by clicking on the yellow button [Clone project]. Note: this action requires user authentication through the login process.
You can also choose to click on [View project details], as done previously.
Now, let us see how the evaluated AI model is trained.
To do that, click on the yellow button [Train] to enter the training view.
The training process needs data. The left-most block, “Training data”, shows 64 data files for a total acquisition time of 1353.99 seconds, as shown in the figure below:
We have collected the data for these four classes and the dataset is well balanced.
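One way to sanity-check balance is to total the acquisition time per class and compare the shortest class against the longest. A minimal sketch, where the per-log durations are invented example values, not the actual dataset:

```python
# Check that a labeled dataset is reasonably balanced across classes.
# The (label, duration) pairs below are invented for illustration.
from collections import defaultdict

logs = [  # (class label, duration in seconds) per data log
    ("stationary_upright", 340.0), ("stationary_not_upright", 330.5),
    ("motion", 345.2), ("shaken", 338.3),
]

totals = defaultdict(float)
for label, seconds in logs:
    totals[label] += seconds

ratio = min(totals.values()) / max(totals.values())
print(f"total: {sum(totals.values()):.2f} s, balance ratio: {ratio:.2f}")
# A ratio close to 1.0 means the classes are well balanced.
```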
Since the dataset contains only accelerometer data, only the accelerometer can be used to feed the machine learning core.
The training process automatically extracts the filters and features from the data. You can choose how they are extracted among three options: [Default], [Best], and [Fastest]. Choose [Default] for this specific example.
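To give an idea of what “features” means here, MLC-style features are simple statistics computed over a window of samples, such as mean, variance, peak-to-peak, and energy. A minimal sketch of such a computation, with a window length and feature set chosen arbitrarily for illustration:

```python
# Compute a few MLC-style window features from a stream of samples.
# Window length and feature choice are illustrative only.

def window_features(samples):
    """Return mean, variance, peak-to-peak, and energy of one window."""
    n = len(samples)
    mean = sum(samples) / n
    variance = sum((s - mean) ** 2 for s in samples) / n
    peak_to_peak = max(samples) - min(samples)
    energy = sum(s * s for s in samples)
    return {"mean": mean, "variance": variance,
            "peak_to_peak": peak_to_peak, "energy": energy}

# One short example window of accelerometer readings (in g)
print(window_features([0.9, 1.1, 1.0, 1.0]))
```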
After training the AI model, it has a classification accuracy of 97.96%.
You can download the results of the training process by clicking on the [Download configuration files] button. This starts the download of the .ucf file used to program the board, together with a .h file that is useful for developing your own firmware application.
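The .ucf file is essentially a list of register writes that configure the sensor. A rough sketch of how such a file could be parsed is shown below; the sample content is invented, and the exact file syntax should be checked against your downloaded file (UCF files typically contain “--” comment lines and “Ac <addr> <val>” write lines in hexadecimal):

```python
# Parse a UCF-style configuration into (register, value) write pairs.
# The sample content is invented for illustration only.

SAMPLE_UCF = """\
-- MLC configuration (example only)
Ac 10 00
Ac 17 40
Ac 02 11
"""

def parse_ucf(text):
    writes = []
    for line in text.splitlines():
        parts = line.split()
        if len(parts) == 3 and parts[0] == "Ac":
            writes.append((int(parts[1], 16), int(parts[2], 16)))
    return writes

for reg, val in parse_ucf(SAMPLE_UCF):
    print(f"write 0x{val:02X} to register 0x{reg:02X}")
```

In a custom firmware, the equivalent information comes from the .h file, which embeds the same write sequence as a C array.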
You can expand the results by clicking the white and blue button in the rightmost box [Results].
This allows you to check statistics like the confusion matrix and other metrics. Other metrics include instances, correctly classified instances, incorrectly classified instances, and the kappa statistic.
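For reference, both the accuracy and the kappa statistic can be derived from the confusion matrix itself. A small sketch with an invented two-class matrix (rows are actual classes, columns are predicted classes):

```python
# Derive accuracy and Cohen's kappa from a confusion matrix.
# The matrix values are invented for illustration.

def accuracy_and_kappa(cm):
    n = sum(sum(row) for row in cm)
    correct = sum(cm[i][i] for i in range(len(cm)))
    po = correct / n                      # observed agreement (accuracy)
    pe = sum(sum(cm[i]) * sum(row[i] for row in cm)
             for i in range(len(cm))) / n ** 2   # agreement by chance
    return po, (po - pe) / (1 - pe)

acc, kappa = accuracy_and_kappa([[45, 5], [10, 40]])
print(f"accuracy = {acc:.2f}, kappa = {kappa:.2f}")  # accuracy = 0.85, kappa = 0.70
```

A kappa close to 1 indicates that the classifier performs far better than chance agreement.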
Close the Results MLC window by clicking the [x] button inside the window's top-right corner. Click the gray [Back] button near the main title MLC configuration: smart_asset_tracking, to return to the previous view.
You can also clone the project in your workspace by clicking on the [Clone] button in the upper right corner.
Cloning requires user authentication, so you are redirected to the myST login webpage.
After entering your credentials, conclude the cloning by naming the demo “smart_demo” and clicking OK. The project is now cloned into your workspace: if you access your project area, the project “smart_demo” is present, as shown in the following figure.
Create a new project by clicking [New Project]. The objective of this new project is detecting different movements of the board.
Call the new project “movements_detector” and click OK.
The new project is empty. So, we need to create a model, but before that we need to collect a dataset. Move to the section “My datasets”, as shown in the following figure. Here you can find the smart_demo dataset, which was already cloned when you cloned the smart_tracking project.
You can look at this dataset by clicking the [Open details] button to understand how the application handles the dataset.
The next view shows the dataset, which can be documented with a readme.
By default, the readme file is empty. We can customize it by clicking the [Edit] button and writing an example such as:
# Dataset
my dataset
Then, click [Save].
On the right side, you have generic metadata. This includes dataset ID, dataset type, datasets (which contains 107 datalogs), dataset size, classes acquired in the dataset, and the board used to collect the dataset.
You can also view the files by selecting the tab called [Data logs].
You can view a set of .csv files, one for each class. You can look at the data by clicking on the [Open data visualizer] on the right side, as shown below.
Click on [Start with default charts].
The file only contains accelerometer data and you can customize the view. For example, you can zoom in on the data and move the timescale.
You can add a new chart by clicking on the [Add new chart] button. A pop-up window appears. Here you can select [lsm6dsv16x - Accelerometer] as sensor and [x] as the dimension. For colors and styles, select [Blue color] as [Line Color] and [Dashed] as [Line Style]. Click [Save] to save these settings. The resulting charts are synchronized:
You can create a new dataset by clicking on the [New Dataset] button in the upper right corner of the section “My datasets”.
Name it "movement_dataset". Inside "Classes", insert the movements that you detect, for example vertical, horizontal, and idle. Click OK.
Import data by clicking on the [Import data] button in the upper part of the section “My datasets”. You can choose to import data from an existing .csv file, or choose to log data.
Choose to proceed with logging data. Click on the [Log data from board] button. Check that the board is connected and paired to the PC; otherwise, connect and pair it. Once you have done that, click on the [Connect device] button in the top-right corner.
Configure the settings of your sensor by expanding the [Sensor Settings] menu. Select the accelerometer of the LSM6DSV16X IMU (FS = 8 g and ODR = 60 Hz).
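As a side note on what FS = 8 g means for the data: the raw 16-bit samples are scaled by a sensitivity that depends on the selected full scale. A sketch of the conversion follows; the 0.244 mg/LSB figure is the typical value for this sensor family at ±8 g and should be verified against the LSM6DSV16X datasheet:

```python
# Convert raw 16-bit accelerometer samples to g at FS = +/-8 g.
# Sensitivity assumed from the typical datasheet figure for this
# family (0.244 mg/LSB at +/-8 g); verify on the LSM6DSV16X datasheet.

SENSITIVITY_MG_PER_LSB = 0.244  # at FS = +/-8 g

def raw_to_g(raw: int) -> float:
    return raw * SENSITIVITY_MG_PER_LSB / 1000.0

# A board lying flat should read close to 1 g on one axis
print(f"{raw_to_g(4096):.3f} g")
```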
Now, you can start acquiring data which is stored temporarily in the local file system on your PC. Then, you can upload data to the cloud.
To proceed, click on the [Start] button. You need to select the folder where the data should be stored: click on [View files, Save Changes] in the pop-up window. You can collect 20 seconds for each class, that is, the idle, horizontal, and vertical classes.
Start with the idle class: do not move the SensorTile.box PRO while it acquires data for 20 seconds. After 20 seconds, click on the idle tag to stop the acquisition for this class.
Continue with the acquisition of the horizontal movements by clicking on the horizontal blue tag. While data is acquired for 20 seconds, move the board horizontally with your hand. Then stop the acquisition by clicking on the horizontal tag.
Continue with the acquisition of the vertical movements by clicking on the vertical blue tag. While data is acquired for 20 seconds, move the board vertically with your hand. Then stop the acquisition by clicking on the vertical tag and then on the [Stop] button.
The summary of your acquisition pops up. As you can see in the lower section of the pop-up, you collected a balanced dataset among the three different classes idle, horizontal, and vertical.
Now, you can import your data. To do that, click on the blue button [Confirm and import].
The data, which was saved locally in binary format, has been uploaded. The cloud is now decompressing the data and generating a set of .csv files, one for each class.
When the conversion ends, visit the “Data Logs” section to see a set of .csv files, one for each class. To see the datalog related to the horizontal class, click on the symbol with the bars [Open data visualizer] on the right, in the corresponding row of the table. Click on [Start with default charts] to visualize the acquired data for the horizontal class.
If you want to crop the beginning and/or the end of a datalog, you can click on the scissors button. To finalize the action, click on [Crop selection], then click on [Crop data] in the pop-up window.
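Cropping simply discards the samples outside a chosen time interval. An equivalent offline operation is sketched below, assuming, hypothetically, that the .csv file has a Time column in seconds; the actual column names in your exported files may differ:

```python
# Crop a datalog CSV to a [t0, t1] time interval.
# Column names and sample content are assumptions for illustration.
import csv
import io

SAMPLE_CSV = """Time,A_x,A_y,A_z
0.00,0.01,0.02,0.98
0.10,0.50,0.02,0.97
0.20,0.02,0.01,0.99
0.30,0.01,0.03,0.98
"""

def crop(text, t0, t1):
    """Keep only the rows whose timestamp falls inside [t0, t1]."""
    rows = list(csv.DictReader(io.StringIO(text)))
    return [r for r in rows if t0 <= float(r["Time"]) <= t1]

kept = crop(SAMPLE_CSV, 0.10, 0.20)
print(len(kept))  # 2
```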
Move to the section “My projects” and consider the project called movements_detector, which is the one we created before. Click on it and create a model by clicking on the [New model] button. Since you have a SensorTile.box PRO connected to the PC as the target, select its name in the [Evaluation board] section. Select “MLC v2” as the smart sensor name. Call the model “movements_model” and select movements_dataset as the dataset. The classes are inherited from the dataset. To finalize the creation of the new model, click [OK].
Now, click on the [Train] button to train the model.
The dataset is quite balanced among the three classes. In the section “Sensors”, select the accelerometer. The filters and features can be set as Default for the extraction method. To start the training, click on the [Run MLC Configuration] button.
The training phase extracts the filters and features from the data. It then generates the decision tree and converts it into a .ucf file, which is used for the evaluation and moved into the user workspace.
If you look at the results, the model is overfitting the data (classification accuracy is at 100%). If you click on the white icon inside the "Results" box, you can check the confusion matrix. Now, you can evaluate the model on the SensorTile.box PRO, by clicking on the [Try MLC] button in the [Results] box.
Select the [On the web] option, since the SensorTile.box PRO is already connected and paired to the PC. Click on the [Connect device] button located in the bottom-right corner. The web application automatically flashes the AI model on the SensorTile.box PRO.
When the process is finished, click on the [Let's start!] button to run the AI model.
The accelerometer is automatically enabled with the same settings you used during the datalog phase, as is the machine learning core inside the LSM6DSV16X.
Here is an example of the evaluation: the board is detected in a horizontal state.
If you change the board position from horizontal to vertical, the vertical class is detected.
If the board is kept still, the idle state is detected.
Click on the [Stop] button to stop the evaluation phase.
In this article, we introduced the ST AIoT Craft tool. ST AIoT Craft teaches you how to program the machine learning core (MLC) engine embedded in many ST smart sensors, and in particular how to train an AI model. The result can be validated using ST evaluation boards, such as the SensorTile.box PRO, from the web portal, without downloading the tool locally on your PC. More articles about further features of the ST AIoT Craft tool are coming, so stay tuned!