
How to fix "INTERNAL ERROR: list index out of range" when uploading an ONNX model to STM32Cube.AI

APerd.1
Associate II

Hello!

I'm getting the error INTERNAL ERROR: list index out of range when uploading an ONNX model to STM32Cube.AI and running the analysis. I have validated my model with other tools such as netron.app and it looks correct. I would appreciate it if anyone could help me figure out what the problem could be. I attach my model below in case someone can take a look. Thanks in advance. Regards!
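
For reference, a structural check along these lines can also be run with the onnx Python package (a minimal sketch; model.onnx is a placeholder for the attached file):

import onnx
from onnx import checker

# Load the exported network and run the structural checker; this raises a
# ValidationError if the graph itself is malformed.
model = onnx.load("model.onnx")
checker.check_model(model)

# Basic metadata that can help narrow the problem down.
print("IR version:", model.ir_version)
print("Producer:", model.producer_name, model.producer_version)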

8 REPLIES
APerd.1
Associate II

I forgot to mention that this model was generated in MATLAB with the exportONNXNetwork function, in case that helps.
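
If it is useful for the diagnosis, the opset versions declared by the exported file can be printed with the onnx Python package (a minimal sketch; model.onnx is a placeholder for the exported file):

import onnx

# Print the opset versions the exporter wrote into the file; the importer's
# behaviour can depend on these.
model = onnx.load("model.onnx")
for opset in model.opset_import:
    print("domain:", opset.domain or "ai.onnx", "version:", opset.version)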

fauvarque.daniel
ST Employee

The error occurs when importing the LSTM layer, probably due to the way MATLAB exports it.
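
For anyone who wants to compare the exported layer with what the importer expects, the LSTM node's inputs and attributes can be dumped with the onnx Python package (a minimal sketch; model.onnx is a placeholder for the attached network):

import onnx

# List every LSTM node in the graph together with its inputs, outputs and
# attributes, so the MATLAB export can be inspected directly.
model = onnx.load("model.onnx")
for node in model.graph.node:
    if node.op_type == "LSTM":
        print("LSTM node:", node.name)
        print("  inputs:", list(node.input))    # optional inputs appear as empty strings
        print("  outputs:", list(node.output))
        for attr in node.attribute:
            print("  attribute:", attr.name)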

I forwarded the network to the development team.

Regards

Daniel


In order to give better visibility on the answered topics, please click on 'Accept as Solution' on the reply which solved your issue or answered your question.
APerd.1
Associate II

Thanks! I would appreciate any feedback on the matter, because I need to use LSTM layers for my project.

Regards.

Alejandro

APerd.1
Associate II

Hi @fauvarque.daniel. I hope you're doing well.

Is there any feedback from the development team on the error when importing the LSTM layer? I would appreciate any proposed solution to the problem.

Regards

Alejandro

fauvarque.daniel
ST Employee

The issue has been reproduced and a fix is in progress; it should be part of the next 8.1 release, planned for June.

Regards

Daniel


APerd.1
Associate II

Thanks for your reply.

Regards

Alejandro

wg
Associate II

I have run into the same problem. Have you solved it? Thank you very much.

APerd.1
Associate II

Hello @武 钢

No, I could not solve the problem. I expect it to be fixed in the next 8.1 release, planned for June. @fauvarque.daniel reported the problem to the development team and a fix is in progress.