2025-04-10 7:44 AM - edited 2025-04-15 9:33 AM
Hi everyone,
I'm trying to run inference with a TFLite model on a Nucleo-N657X0-Q board.
The issue I'm facing is that the network returns a constant output for constant inputs, but the value of that constant changes each time I restart the application. When the input varies, the output varies as well, yet the returned values do not match those obtained during the validation process.
The model was quantized using a post-training quantization approach.
I'm using AXISRAM4 and AXISRAM5, which have been correctly initialized and enabled in main.c.
I attach the .ioc file, the app_x-cube-ai.c file and the quantized tflite model for reference.
Thanks in advance for any suggestions!
2025-04-14 8:15 AM
Hello Lisa,
Thanks for your feedback.
For the external flash, maybe you can take a look at examples provided with the Neural-Art component.
To read the external flash easily, you need to "memory-map" it; on development boards, the simplest way is to use the BSP functions such as BSP_XSPI_NOR_Init...
Take a look at the project examples to see how external memories are memory-mapped!
Storing weights in flash should make your life easier and should fix the issues you had above with unstable outputs (because of unstable weights :) ).
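For reference, a minimal sketch of the memory-mapping sequence described above, based on the XSPI NOR BSP driver shipped with the STM32N6 board packages. The header name, init parameters, and exact function signatures are assumptions here and may differ between board revisions, so check them against the Neural-Art project examples for your board:

```c
/* Sketch: map the external NOR flash into the address space so the
 * CPU/NPU can read network weights in place.
 * Assumes the STM32N6 Discovery BSP XSPI NOR driver; verify the
 * header name and signatures in your own BSP before using. */
#include "stm32n6570_discovery_xspi.h"   /* assumed BSP header */

int map_weights_flash(void)
{
  BSP_XSPI_NOR_Init_t init;
  init.InterfaceMode = BSP_XSPI_NOR_OPI_MODE;      /* octal SPI (assumed) */
  init.TransferRate  = BSP_XSPI_NOR_DTR_TRANSFER;  /* DTR rate (assumed) */

  /* Initialize the external NOR flash on XSPI instance 0 */
  if (BSP_XSPI_NOR_Init(0, &init) != BSP_ERROR_NONE)
    return -1;

  /* After this call the flash contents appear at the XSPI
   * memory-mapped base address; the weights can then be read
   * directly, without explicit read commands. */
  if (BSP_XSPI_NOR_EnableMemoryMappedMode(0) != BSP_ERROR_NONE)
    return -1;

  return 0;
}
```

Such an init function would need to run early in main(), before the network is initialized, so that the first weight accesses already hit valid, memory-mapped data.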
2025-04-15 1:46 AM - edited 2025-04-15 8:10 AM
2025-04-28 2:27 AM
Hi,
I am having a similar problem: I am trying to store the weights of my AI model in the external flash. I do not know whether what I have done so far is correct, but I am able to build the project. The problem is that the FSBL and AppN projects run successfully, but when I run or debug the AppNS project, where the AI-related code is located, I get debugger connection errors and the debugger never seems to attach to the board.
So my question is: how did you save the weights to the external flash, @LisaB? Could you possibly help me?
My ioc file is attached below.
Thanks!