Error in STM32CubeIDE while analyzing a TensorFlow Lite model [X-CUBE-AI]
2024-05-20 8:42 AM
Good morning, I'm trying to import a neural network for NILM (non-intrusive load monitoring) into the STM32 environment, but I can't complete the analysis because of the following error:
INTERNAL ERROR: unpack_from requires a buffer of at least 122836864 bytes for unpacking 4 bytes at offset 122836860 (actual buffer size is 1654784)
The network is a CNN, and I'm working with X-CUBE-AI v9.0.0 and STM32CubeIDE 1.15.1.
Does anyone know how I can solve this buffer-size problem?
Below I share the link to the TFLite model (seq2point_CNN_converted_model.tflite), thanks in advance.
https://drive.google.com/file/d/1FhDdYCNd0y86SptZ5FHCSjpUQdkdxXZ9/view?usp=drive_link
Solved! Go to Solution.
Labels: STM32CubeAI
Accepted Solutions
2024-07-17 8:50 AM
It seems that your model is malformed; I can't even open it with Netron.
At this point the tool still uses TensorFlow 2.15.1, so the error may come from a model converted with a more recent version of TensorFlow.
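As a quick cross-check on your side, you could try parsing the file with the TensorFlow Lite interpreter in Python; if the FlatBuffer is truncated or corrupted, it should fail to load there as well. A minimal sketch (not from the original reply, assuming TensorFlow is installed locally and the file name matches the one you shared):

```python
# Hypothetical local check: compare the file size on disk with what the TFLite
# FlatBuffer expects by letting the interpreter parse it. A truncated or
# corrupted download typically fails at this step.
import os
import tensorflow as tf

model_path = "seq2point_CNN_converted_model.tflite"  # file name from the question
print("File size on disk:", os.path.getsize(model_path), "bytes")

try:
    interpreter = tf.lite.Interpreter(model_path=model_path)
    interpreter.allocate_tensors()
    print("Model parsed OK")
    print("Inputs:", interpreter.get_input_details())
    print("Outputs:", interpreter.get_output_details())
except Exception as exc:
    print("Model failed to load:", exc)
```

If the model loads fine locally but was exported with a newer TensorFlow, re-converting it with TensorFlow 2.15.x may be worth trying, given the version the tool currently supports.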
In order to give better visibility on the answered topics, please click on 'Accept as Solution' on the reply which solved your issue or answered your question.
