2025-02-04 10:14 AM - edited 2025-02-04 10:15 AM
I'm trying to convert the BirdNET-Analyzer (https://github.com/kahst/BirdNET-Analyzer/tree/main) model to run on an STM32U5A5. However, both the TFLite runtime and the STM32Cube.AI MCU runtime fail to handle the model.
The TFLite runtime fails with:
Operator REDUCE_MIN not supported
The STM32Cube.AI MCU Runtime fails with:
INTERNAL ERROR: Invalid size_splits configuration: [1 1 0]
As I'm by no means an ML expert, I was wondering how to fix these conversion issues. I've tried to convert the TF SavedModel (.pb) to a .tflite model with the TFLiteConverter in Python, with the following enabled:
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS]
However, this gives no warnings and still produces a model containing a REDUCE_MIN operation. I also haven't been able to find any information on how to work around the STM32Cube.AI MCU error.
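For completeness, this is roughly the conversion script I'm using (the SavedModel path is just a placeholder for my local BirdNET-Analyzer checkout, so adjust it as needed):

import tensorflow as tf

# Path to the BirdNET SavedModel directory (placeholder for my local checkout)
saved_model_dir = "BirdNET-Analyzer/checkpoints/V2.4/saved_model"

converter = tf.lite.TFLiteConverter.from_saved_model(saved_model_dir)

# Restrict the converter to built-in TFLite ops only; I expected this to
# warn or fail if an op (like REDUCE_MIN) isn't supported by the target runtime
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS]

tflite_model = converter.convert()

with open("birdnet.tflite", "wb") as f:
    f.write(tflite_model)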
I'm using the latest STM32Cube.AI package in CubeIDE (v10.0.0) and TensorFlow v2.18.0.
Any help is much appreciated!