
STM32N6570-DK how to generate .c .h files from AI model

msingh08
Associate III

I have downloaded x-cube-n6-ai-multi-pose-estimation-v1.0.0 and managed to open it in STM32CubeIDE; however, when I try to build the project it displays the following error:

"Please generate this file with STM32Cube.AI tools. See instruction in README.md." 

I have installed STM32Cube.AI (I think); however, I am thoroughly confused by the README.md file, as it seems to direct to "Doc/Download-and-deploy-model.md". From that file I downloaded "yolov8n_256_quant_pc_uf_pose_coco-st (1)", but I am not sure what to do next.

 

If I understand correctly, I only need to generate the .c and .h files from the AI model in the "...\x-cube-n6-ai-multi-pose-estimation-v1.0.0\Model" folder, but how do I do that?

Julian E.
ST Employee

Hello @msingh08,

 

First, I would suggest using the latest version of this package (v2.0.0): GitHub - STMicroelectronics/x-cube-n6-ai-multi-pose-estimation: An AI software application package demonstrating a multi-pose estimation use case on the STM32N6 product.

 

Next, it is true that this can be a bit confusing.

I would suggest looking at the other markdown file: x-cube-n6-ai-multi-pose-estimation/Doc/Program-Hex-Files-STM32CubeProgrammer.md at main · STMicroelectronics/x-cube-n6-ai-multi-pose-estimation · GitHub

 

There you can see that the application requires flashing 3 binary files:

  1. The weights of the model, stored in external flash
  2. The application
  3. The FSBL, which launches first and loads the application into internal RAM
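
For reference, flashing those three binaries is typically done with the STM32CubeProgrammer CLI. The sketch below is only illustrative: the file names, the external-loader path (`$DKEL`), and the write targets are placeholders, not the exact values from the package; the real ones are listed in Doc/Program-Hex-Files-STM32CubeProgrammer.md.

```shell
# Sketch only: replace $DKEL with the STM32N6570-DK external flash loader
# (.stldr) shipped with STM32CubeProgrammer, and the file names with those
# given in Doc/Program-Hex-Files-STM32CubeProgrammer.md.

# 1. Model weights into external flash
STM32_Programmer_CLI -c port=SWD mode=HOTPLUG -el "$DKEL" -hardRst \
    -w Model/network_data.hex

# 2. Application binary
STM32_Programmer_CLI -c port=SWD mode=HOTPLUG -el "$DKEL" -hardRst \
    -w Binary/application.hex

# 3. FSBL: runs first at boot and loads the application into internal RAM
STM32_Programmer_CLI -c port=SWD mode=HOTPLUG -el "$DKEL" -hardRst \
    -w Binary/fsbl.hex
```

The `.hex` files already embed their target addresses, which is why no explicit flash offset is passed to `-w` here.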

 

The other readme explains how to generate new weights. If you follow it, you will obtain new weights; to run the application, you then need to flash these new weights (plus the application and FSBL, if you have not done so already).
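
As a concrete sketch of that generation step, the .c/.h files the build complains about are produced by the ST Edge AI Core CLI (`stedgeai`). The model file name and option values below are placeholders taken from the thread, not a verified command line; the exact invocation is in the package's Download-and-deploy-model.md.

```shell
# Sketch: generate the network .c/.h files and the weight blob from the
# quantized .tflite model with the ST Edge AI Core CLI.
# File names and options are placeholders; use the ones from the README.
cd Model
stedgeai generate \
    --model yolov8n_256_quant_pc_uf_pose_coco-st.tflite \
    --target stm32n6 \
    --st-neural-art default@user_neuralart.json

# Copy the generated network sources to where the STM32CubeIDE project
# expects them, as described in the README.
```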

 

Let me know if you need further explanations.

 

Have a good day,

Julian


In order to give better visibility on the answered topics, please click on 'Accept as Solution' on the reply which solved your issue or answered your question.