I am trying to run a MobileNetV2 (.onnx) model on an STM32H723ZG using STM32CubeMX. After enabling external flash and external memory, I was able to analyze and validate the model on the desktop, but while validating on the target I get a memory overflow error.

Ghofrane GSOURI
ST Employee

Hello @jeman.1​ 

First let me thank you for posting.

It sounds like you are trying to run a MobileNetV2 model on the STM32H723ZG microcontroller using STM32CubeMX, and you are encountering a memory overflow error during validation on the target device.

There are a few potential reasons why this error might be occurring:

  1. Insufficient memory: the STM32H723ZG may simply not have enough memory to hold the MobileNetV2 weights plus the activation buffers needed at run time. In that case you may need to shrink the model (for example through quantization or pruning) or move to a microcontroller with more memory.
  2. Incorrect memory configuration: the external flash and external RAM you enabled in STM32CubeMX may not be configured correctly, or the tool may not actually be placing the weights and activations in them. Double-check the memory configuration, the linker placement, and that the validation firmware really uses the external memories.
  3. Model not optimized for microcontrollers: MobileNetV2 was sized for mobile-class hardware, not for a microcontroller with a few hundred kilobytes of RAM. A smaller or quantized variant is usually a better fit for this class of device.
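The "insufficient memory" point above can be made concrete with a back-of-the-envelope check. The figures below are assumptions, not measured values: roughly 3.5 million parameters for MobileNetV2, and 1 MB of internal flash and 564 KB of internal RAM for the STM32H723ZG (verify against the datasheet):

```python
# Rough check: do MobileNetV2's weights fit in the STM32H723ZG's
# internal memories? All figures are assumptions -- verify them
# against the model file and the MCU datasheet.

PARAMS = 3_500_000            # approximate MobileNetV2 parameter count
INTERNAL_FLASH = 1 * 1024**2  # assumed 1 MB internal flash
INTERNAL_RAM = 564 * 1024     # assumed 564 KB internal RAM

def weight_bytes(params: int, bytes_per_weight: int) -> int:
    """Size of the weight tensors alone, ignoring activations and code."""
    return params * bytes_per_weight

float32_size = weight_bytes(PARAMS, 4)  # 32-bit floating point weights
int8_size = weight_bytes(PARAMS, 1)     # after 8-bit quantization

for name, size in [("float32", float32_size), ("int8", int8_size)]:
    verdict = "fits" if size <= INTERNAL_FLASH else "overflows"
    print(f"{name}: {size / 1024**2:.1f} MB -> {verdict} the internal flash")
```

Under these assumptions, even the int8 weights alone exceed the internal flash, which is why the weights must land in external flash and why a correct external-memory configuration matters so much here.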

Running large models like MobileNetV2 on a microcontroller can be challenging, and you may need to experiment with different approaches and configurations to find one that works for your application. I recommend consulting the documentation for the STM32H723ZG and STM32CubeMX, as well as other resources on running machine learning models on microcontrollers, for more ideas on how to troubleshoot and resolve this issue.



Ghofrane GSOURI
ST Employee

Hello again @jeman.1​ 

MobileNetV2 is a convolutional network for image classification (and, as a backbone, object detection) that was designed for mobile-class hardware. Even so, it is large by microcontroller standards: roughly 3.5 million parameters, on the order of 14 MB of weights in 32-bit floating point, plus a significant amount of RAM for activations. Microcontrollers typically have far less memory and processing power than the phones, desktops, or servers such models usually run on.

Running large models like Mobilenet V2 on microcontrollers can be challenging due to the limited resources available. In general, it is best to use models that are optimized for microcontroller environments, which may be smaller in size and require less memory and processing power to run.

If you are interested in using machine learning on a microcontroller, you may want to consider using a different model that is more suitable for the limited resources available.

There are several approaches that work well on microcontrollers, and the best choice for your application will depend on your specific requirements and the resources available on your part. Some options include:

  1. TinyML: TinyML is not a single model but the general practice of running machine learning on microcontrollers and other resource-constrained devices. TinyML-style models are typically very small and require minimal resources to run, making them well suited to microcontrollers.
  2. Compact CNNs: small convolutional networks designed specifically for image classification on microcontrollers. These lightweight models require minimal resources to run and are suitable for devices with limited memory and processing power.
  3. Hardware offload with FPGAs: FPGA (Field-Programmable Gate Array) devices are not models themselves, but specialized chips that can be programmed to perform specific tasks. Placed alongside a microcontroller, an FPGA can offload inference from the microcontroller's CPU, allowing for faster and more efficient execution.
  4. TensorFlow Lite Micro: a version of the TensorFlow Lite machine learning library that is optimized for microcontrollers. It provides an interpreter and operator kernels for running converted (typically int8-quantized) models on bare-metal devices, along with example applications for building and deploying them.
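As a concrete illustration of the TensorFlow Lite route, here is a sketch of full-integer post-training quantization, which shrinks the weights roughly 4x before deployment. This assumes TensorFlow is installed; the random calibration data and the output filename are placeholders (use real sample images in practice, and note `weights=None` skips the pretrained-weight download for brevity):

```python
# Sketch: convert a Keras MobileNetV2 to a fully int8-quantized
# TensorFlow Lite flatbuffer, the usual input format for
# TensorFlow Lite Micro deployments.
import numpy as np
import tensorflow as tf

# weights=None gives random weights (no download); use pretrained
# or fine-tuned weights for a real deployment.
model = tf.keras.applications.MobileNetV2(weights=None)

def representative_data():
    # Placeholder calibration set: random tensors in the model's
    # input shape. Real calibration should use representative images.
    for _ in range(10):
        yield [np.random.rand(1, 224, 224, 3).astype(np.float32)]

converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_data
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.int8
converter.inference_output_type = tf.int8

tflite_model = converter.convert()
with open("mobilenet_v2_int8.tflite", "wb") as f:
    f.write(tflite_model)
print(f"int8 model size: {len(tflite_model) / 1024**2:.1f} MB")
```

Even after int8 quantization, the MobileNetV2 weights are still several megabytes, so on the STM32H723ZG they would still need to live in external flash; the quantization mainly reduces how much external bandwidth and RAM the model consumes.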