
Creating a smart asset tracking application with ISM6HG256X and STM32U3

Denise SANFILIPPO
ST Employee

Introduction

In this article, we showcase how ST motion MEMS sensors turn raw data into useful insights for an energy-efficient, real-time asset tracking application.
The proposed solution is based on the STM32U3 microcontroller and our latest industrial MEMS sensor, the ISM6HG256X.

1. Overview of the smart asset tracking application

In this asset tracking application, the MEMS sensor, ISM6HG256X, continuously tracks motion with high accuracy and uses embedded AI to detect events instantly. When an important event happens, it wakes the MCU, ensuring nothing is missed while saving power.

The MCU handles logging, runs advanced AI for deeper analysis, manages alerts and connectivity, oversees battery life, and secures the data.

By sharing AI tasks between the sensor and MCU, we achieve a perfect balance of fast response, power savings, and smart processing. This makes our asset tracker both quick and energy-friendly!

2. STM32U3, the ultralow power microcontroller

STM32U3 is an ultralow power microcontroller with advanced power-saving features, designed for a range of applications including wearables, personal medical devices, home automation, and industrial sensors.

The STM32U3 uses the same Cortex®-M33 core and 40 nm process technology as the STM32U5 series. In terms of performance, it is at the same level as the STM32L4+ with 144 DMIPS. But where it truly stands out is in efficiency, positioning itself as our flagship product in ultralow power operation. For more info, check out our website st.com/STM32U3.

This smart asset tracking application needs a smart, reliable sensor to gather accurate data, and that is where our latest industrial sensing solution comes in.

3. ISM6HG256X: a highly accurate inertial measurement unit (IMU)

The ISM6HG256X is a highly accurate IMU that detects both low and high acceleration simultaneously. Designed for tough industrial environments, it ensures reliable performance. With built-in edge processing and self-configuration, it optimizes performance while saving power at the system level.

We have chosen to use ISM6HG256X because it reliably detects everything, from small vibrations to strong shocks, all with one sensor. Its built-in edge processing and self-configuration make that possible. Additionally, it is built for harsh industrial environments, capable of operating up to 105°C.

The sensor includes two fully independent accelerometers: one optimized for low-g measurements up to 16 g, and another for high-g measurements up to 256 g. This setup makes sure that no event gets missed.

It also features a high-performance gyroscope with a full scale of 4000 degrees per second, providing precise angular velocity data.

Noise levels are impressively low: 60 micro-g per root hertz for the low-g accelerometer and 1 milli-g per root hertz for the high-g accelerometer, so you get clean, reliable signals.

Power consumption is optimized for long-term use, with just 0.2 milliamps for the accelerometers and 0.77 milliamps when combined with the gyroscope, making it ideal for battery-powered applications.

The sensor is packed with smart features like a machine learning core and free-fall and shock detection, plus 4.5 KB of embedded FIFO memory for efficient data handling.

It supports multiple communication interfaces, SPI, I2C, and I3C, making integration flexible and straightforward.

Finally, the ISM6HG256X operates reliably in harsh environments, with a temperature range of -40 to +105°C, and comes in a compact 2.5 x 3 x 0.71 mm LGA package.

These features make it ideal for demanding applications like condition monitoring, asset tracking, robotics, factory automation, safety helmets, and black box systems.

4. Smart asset tracking demo deep dive

Let us consider the brand-new development platform that combines two powerful boards: the NUCLEO-U385RG-Q and the X-NUCLEO-IKS5A1.

Figure 1: The boards NUCLEO-U385RG-Q and X-NUCLEO-IKS5A1

 


This pairing brings together the STM32U3 MCU with the highly accurate ISM6HG256X inertial measurement unit.

With this platform, you can easily dive into advanced applications like asset tracking, condition monitoring, robotics, and more, all while benefiting from optimized power consumption and strong sensor performance.

Let us have a look at the two boards:

The NUCLEO-U385RG-Q is a development board featuring the STM32U385RG microcontroller, providing an affordable and flexible platform to explore new concepts and build prototypes. Key features of the board include:

  • A USB Type-C® connector
  • An Arduino® Uno V3 expansion connector
  • An on-board STLINK-V3EC debugger/programmer

X-NUCLEO-IKS5A1 is the new MEMS expansion board for industrial applications.

It offers a wide range of industrial motion and environmental sensors like:

  • ISM6HG256X: an intelligent IMU with simultaneous low-g and high-g acceleration detection (which is the protagonist of today’s tech dive)
  • Another IMU, ISM330IS, with an intelligent sensor processing unit (ISPU)
  • An intelligent ultralow power accelerometer
  • A pressure sensor, ILPS22QS
  • A magnetometer, IIS2MDC
  • A DIL24 socket and some additional connectors to manage external sensors

Now, let us plug the expansion board onto the Nucleo board and start the hands-on session to implement an asset tracking application.

The hands-on is divided into four steps: sensor configuration, firmware generation for the STM32, evaluation of the solution, and power consumption optimization.

Figure 2: The four steps of the presented hands-on

 


5. Step one: sensor configuration (ISM6HG256X)

Let us start with the sensor configuration for the asset tracking use case.

The industrial IMU ISM6HG256X can classify events directly at the edge, saving power.

No events are missed thanks to the dual accelerometer full scale, and you get high accuracy and precision for tracking.

Moreover, you get a high level of integration thanks to the small size, since the package is just 2.5 x 3 mm.

One of the key features of our sensor is its edge processing capability, made possible by the machine learning core (MLC).

The MLC is a hard-coded engine inside the sensor that uses decision tree logic.

The accelerometer and gyroscope data feed computation blocks, where features are extracted and used by the decision tree logic to generate results.

This local processing enables the MLC to detect different classes and wake up the MCU only when necessary.

Setting up the MLC on the sensor is quick and easy because you do not have to design the algorithm yourself. Instead, you just:

  • Collect data logs.
  • Label the data.
  • Let the tool compute features and automatically build a decision tree for you.
  • Embed that decision tree right into the sensor to work with real-time data.
Figure 3: The machine learning core (MLC) embedded inside ISM6HG256X

 

Now, we are going to show you how to do all these steps with MEMS Studio.

MEMS Studio is a complete desktop software solution designed to develop embedded AI features, evaluate embedded libraries, analyze data, and design no-code algorithms for the entire portfolio of MEMS sensors.

One of the tools inside MEMS Studio is AlgoBuilder, which allows you to generate firmware for various evaluation boards with STM32 microcontrollers.

Figure 4: Firmware settings - Target

In this case, a Nucleo board with the STM32U3 inside is used, together with the sensor expansion board X-NUCLEO-IKS5A1, as shown in the following picture.

A simple project must be created for data logging.

For the data logging, the following is needed:

  • Set the data rate, in this case 30 Hz.
  • Set the accelerometer full scale to 16 g; in this example, only the accelerometer is used.

 

Figure 5: MEMS Studio - AlgoBuilder - Sensor Hub (properties)

 


After adding the "Sensor Hub" block, also add the "Acceleration" and "Graph" blocks, as shown in the following picture.

Figure 6: MEMS Studio - AlgoBuilder - Blocks

Save the project and build it, invoking the compiler that was previously selected. When the compilation is done, flash the firmware to the MCU and then connect the board. Click the pink start button in the top left corner of MEMS Studio to see the sensor data.

Figure 7: MEMS Studio - AlgoBuilder Evaluation

In the "Save to File" section, you can log data for the different classes that you want the machine learning core to recognize. In our case: stationary, stationary not upright (the package is rotated by 180 degrees), moving, and shaking.

Figure 8: MEMS Studio - AlgoBuilder Evaluation - Save to File

 

Once data logging has been completed, go to the MLC tool in the advanced features of MEMS Studio to create the MLC configuration, which will automatically detect different states.

In the MLC tool, you can import the previously acquired data logs and assign a label corresponding to the class associated with each log.

Four classes are defined in this case:

  • "Stationary"
  • "Stationary not upright" (the package is rotated by 180 degrees)
  • "Moving", corresponding to slow movements
  • "Shaking", corresponding to higher intensity movements
Figure 9: MEMS Studio - Advanced Features - MLC - Data patterns

To proceed with the MLC configuration, in the “ARFF generation” tab you can manually select the sensor settings and the features to be used for the classification.

Features are manually selected here, based on previous experience with this particular use case.

Figure 10: MEMS Studio - Advanced Features - MLC - ARFF generation

Alternatively, this step can be performed automatically using the AFS tool.

In the “Decision tree generation” tab, the decision tree model is generated, and the classification results are shown.

Figure 11: MEMS Studio - Advanced Features - MLC - Decision tree generation

You can assign values to the classes and generate the configuration files for the sensor.

Figure 12: MEMS Studio - Advanced features - MLC - Config generation

Before moving on to the MCU configuration, it is worth mentioning the ability to store MLC data in the first-in first-out (FIFO) buffer. Thanks to this buffer, the sensor can store MLC parameters like decision tree results, filters, and features.

 

Figure 13: MLC results, filters, and features can be stored in the FIFO

 

This allows the STM32U3 microcontroller to execute more complex algorithms at any time by retrieving this data.

You can find more details about the machine learning core in the application note document AN6355, available on our website.

With the ISM6HG256X sensor, the machine learning core can handle data from both the low-g and high-g accelerometers.

6. Step two: board firmware generation (STM32U385RGT6Q)

In step one, we have seen how to create a configuration for the sensor.

Step two is the generation of firmware for the STM32U3, so that the Nucleo board can receive data from the sensor.

To do this second step, the AlgoBuilder feature in MEMS Studio is used again.

Going back to the AlgoBuilder project previously used for data logging, an additional branch can be added, called FSM and MLC, for setting the machine learning core configuration previously created.

Figure 14: AlgoBuilder - Adding FSM/MLC blocks

 

A .json file containing the MLC configuration should be specified, and a “Value” block should be added to visualize the different states or classes detected by the machine learning core.

Figure 15: AlgoBuilder - Adding Value block

 

The project is built again, and then new firmware is flashed on the STM32U3 of the Nucleo board.

7. Step three: evaluation (asset tracking classification)

We are now ready to test the machine learning core configuration in the ISM6HG256X sensor by visualizing the data read by the STM32U3.

We can see the acceleration data in the plot and the corresponding class detected by the machine learning core in the sensor, with the values:

  • 0 when the package is still.
Figure 16: MEMS Studio - AlgoBuilder Evaluation - The package is still
  • 4 when it is not in an upright position.
Figure 17: MEMS Studio - AlgoBuilder Evaluation - The package is not in an upright position
  • 8 when it is moving.
Figure 18: MEMS Studio - AlgoBuilder Evaluation - The package is moving
  • 12 when it is shaken.
Figure 19: MEMS Studio - AlgoBuilder Evaluation - The package is shaken

 

At this point, step three has been completed: the asset tracking application created earlier has been evaluated.

The STM32 firmware used so far keeps the microcontroller active to continuously read data from the sensor.

8. Step four: Power consumption optimization

The next step, Step four, is to optimize power consumption.

This can be done in two ways:

  • Using the AlgoBuilder feature in MEMS Studio to generate optimized firmware.
  • Manually modifying the code generated by AlgoBuilder by opening the project with STM32CubeIDE.

Before doing so, let's see what optimizing power consumption means here.

The STM32U3 offers multiple power modes to optimize power consumption for your applications.

For example, if register retention is not required, you can use standby mode, which allows the device to reach power consumption in the nanoampere (nA) range.

Stop mode, on the other hand, achieves the lowest power consumption while retaining the contents of SRAM and registers.

The SRAM can be fully or partially switched off. In our case, stop mode is used to optimize power consumption.

For more details about stop mode, you can refer to the STM32U3 datasheet, specifically Table 8 in document DS14830.

Back in the AlgoBuilder project, stop mode is now introduced for the STM32U3.

In AlgoBuilder, you can create "Custom libraries" and "Custom blocks" to extend the functionality with your own code.

You can create a “Custom library” called “MCU”, and inside it a custom block called “MCU Stop”.

Figure 20: Create a "Custom library" called "MCU", and inside it a custom block called "MCU Stop"

Then you need to set the required info, inputs, outputs, and the code to enable MCU stop, as shown in the following picture.

Figure 21: Set the required info, inputs, outputs, and the code to enable MCU stop block

In this case, the function block has only one input to enable the stop mode, no output, and no property.

Simple C code can be written in the function block, to enable stop mode when the input of the function block is equal to 1 and recover clock settings after waking up.

Figure 22: Simple C code can be put in the function block, to enable stop mode

Once the block is created, it will appear in the MCU library as MCU Stop.

Figure 23: The MCU stop block appears inside the MCU library

Before adding the MCU stop block, let's add a condition to enable it.

We want to set the STM32U3 in stop mode when there is a stationary condition, which means that the package is still in an upright position.

The stationary condition is verified when the output of the sensor's machine learning core has a constant value of 0.

When this condition is verified, the MCU stop can be enabled.

In this project, stop mode is enabled when the stationary condition is verified for 10 seconds (corresponding to 300 counts at a 30 Hz data rate).

In the project, you can also see a "Logic Analyzer" block, added to show the different states and to distinguish the MCU stop mode.

After rebuilding the project and flashing the new firmware to the STM32U3 on the Nucleo board, you are ready to test it.

Figure 24: The block tree is completed

Let’s reconnect the board.

Then, going to "AlgoBuilder Evaluation" in the "Custom layout" of MEMS Studio, plots can be added for:

  • Acceleration data.
  • MLC state to check the output of the machine learning core.
  • A logic analyzer to understand when the STM32U3 is in stop mode. When the MCU is in run mode, the communication is active.

When the MLC state becomes 0, the MCU enters stop mode after 10 seconds and thus stops sending data to the application.

The machine learning core generates an interrupt every time the output value changes.

Moving the package causes the machine learning core in the sensor to detect a new state, which triggers an interrupt on the GPIO pin to wake up the microcontroller.

Figure 25: AlgoBuilder Evaluation - Custom layout

 

Step four of the hands-on has been completed, and so far, only MEMS Studio has been used.

If you want to do further optimizations or modifications, you can still open the project with STM32CubeIDE.

To do this you need to:

  • Set the workspace to your firmware location.
Figure 26: Select the workspace to your firmware location
  • Click on “Import project” to add an existing project into the workspace.
Figure 27: Import the project inside STM32CubeIDE
  • Select the root directory of your STM32U3 firmware location.

Figure 28: Select root directory to your STM32U3 firmware location

  • Select the build configuration you want to use.

Figure 29: Select the build configuration you want to use

You can now modify the code in the IDE and customize your application. For instance, you could run other algorithms on the STM32U3.

Conclusion

The STM32U3 paired with the ISM6HG256X sensor, combined with ST’s ecosystem, gives you a powerful platform for smart asset tracking and IoT, offering accuracy, low power, and fast innovation. With the shown tools, you are well equipped to prototype quickly and build smarter, energy-efficient connected devices for the future.

Version history
Last update:
‎2025-11-18 8:16 AM