2022-03-16 08:50 AM
I have a convolutional model with convolutional layers, batch normalization layers, and Dense layers at the end. The model is converted to a tflite model. Inference works perfectly on a computer using tflite, but when I try to deploy it on the Nucleo-H743ZI2 I get this error.
The network layers and their shapes are shown in the picture. Has anyone come across this problem?
As far as my understanding goes, I did not create the model incorrectly. It looks like a misinterpretation by the STM Cube AI library.
Additional Info: I am using STM Cube AI version 7.1.0
Thanks in advance
Rick
2022-03-16 09:08 AM
Can you share the model so I can reproduce the problem and have the development team fix it?
Thanks in advance
Regards
Daniel
2022-03-16 09:20 AM
Hello @fauvarque.daniel,
Thanks for the quick reply. Should I share the Keras .h5 file with you?
2022-03-16 09:22 AM
yes please
2022-03-16 09:33 AM
(Keras .h5 model file attached)
2022-03-16 09:37 AM
If I may, could you also provide the quantized tflite, so I have exactly the file you are using?
Daniel
2022-03-16 09:51 AM
I've reproduced the problem with the h5. I'll let you know if there is a workaround.
2022-03-16 02:42 PM
OK, thank you @fauvarque.daniel
2022-03-17 10:46 AM
The problem comes from the optimization that folds the batch normalization layers.
With the undocumented option "--optimize.fold_batchnorm False", the model is analyzed correctly.
You can pass the option directly on the stm32ai command line, or, if you are using X-Cube-AI inside STM32CubeMX, you can add it in the first screen of the advanced parameters window.
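For example, from a shell, the invocation would look something like this (assuming a model file named model.tflite; adjust the name and paths to your setup):

stm32ai analyze -m model.tflite --optimize.fold_batchnorm False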
Regards
Daniel
2022-03-18 02:30 AM
Thanks a lot @fauvarque.daniel. The solution works. :)
Well, I am curious. Can you give a bit more insight into it? What do you mean by folding the batchnorm?
Thanks
Rick
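For anyone landing here with the same question: "folding" (or fusing) batch normalization is a standard inference-time optimization. A BatchNorm layer applies a per-channel affine transform y = gamma * (x - mean) / sqrt(var + eps) + beta, and since the preceding convolution is also linear, the two can be merged into a single convolution with rescaled weights and an adjusted bias, saving memory and compute at inference time. Below is a minimal sketch of the generic transformation for Keras-style conv weights; it illustrates the idea only and is not necessarily exactly what X-Cube-AI does internally.

import numpy as np

def fold_batchnorm(W, b, gamma, beta, mean, var, eps=1e-3):
    # Per-output-channel scale derived from the BatchNorm statistics.
    scale = gamma / np.sqrt(var + eps)
    # Keras conv kernels have shape (kh, kw, in_ch, out_ch), so 'scale'
    # broadcasts over the last axis, rescaling each output channel.
    W_folded = W * scale
    # The mean and beta terms fold into the bias.
    b_folded = scale * (b - mean) + beta
    return W_folded, b_folded

# Example with a hypothetical 3x3 conv that has 16 inputs and 8 outputs:
W = np.random.randn(3, 3, 16, 8)
b = np.zeros(8)
gamma, beta = np.ones(8), np.zeros(8)
mean, var = np.zeros(8), np.ones(8)
W_f, b_f = fold_batchnorm(W, b, gamma, beta, mean, var)

Disabling the optimization with "--optimize.fold_batchnorm False" keeps the BatchNorm layers as separate operations in the generated network, which is why it works around the analysis problem here.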