2019-07-31 07:05 AM
Hello,
I used the stm32ai software to quantize my MNIST model.
I can load the quantized model on my STM32H743-Nucleo Board and with the Validation Software I can validate my quantized model on target.
But I would like to create my own Application, so I used the Application Template.
I can already run my "normal" (non-quantized) model with floating-point input vectors.
But when I feed my quantized net with a fixed-point input vector, I get the same probability for every digit (see picture).
I think the problem is my input data type. The Application Template expects ai_i8, but this data type is too small for grayscale MNIST digits (0-255). So I changed it to ai_u8 and got the output shown in the picture.
But if I feed the exact same input data into the Validation software as a .csv file, I get the correctly classified digit.
Does anyone know the answer to my question?
2019-08-01 05:42 AM
I solved the problem myself. The stm32ai quantize step creates a file with inputs and outputs. Any input value above 127 is saturated to 127 (no overflow/wrap-around); every other value remains the same.
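For anyone hitting the same issue, here is a minimal sketch of that saturation step in plain C, assuming a 28x28 grayscale MNIST image stored as uint8_t and the signed ai_i8 input buffer used by the Application Template (fill_input_saturated is just an illustrative helper name, not part of the generated API):

#include <stdint.h>

#define MNIST_PIXELS (28 * 28)

/* ai_i8 is normally provided by ai_platform.h in the generated code;
 * redefined here only to keep the sketch self-contained. */
typedef int8_t ai_i8;

/* Copy a grayscale image (0-255) into the network's signed 8-bit input
 * buffer, saturating values above 127 instead of letting them wrap. */
static void fill_input_saturated(const uint8_t *image, ai_i8 *in_data)
{
    for (int i = 0; i < MNIST_PIXELS; i++) {
        in_data[i] = (image[i] > 127) ? 127 : (ai_i8)image[i];
    }
}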