2025-09-21 6:57 AM
Hey,
Looking at the image classification application example, I couldn't find any code that normalizes the pixel values from [0,255] to [-1,1] or [0,1], which I would expect for a CV model of this kind.
Could you explain how this implementation, based on mobilenet_v2_0, skips the need for normalization?
Thanks.
2025-09-22 1:17 AM
Hello @pmiracle,
In the example you linked, the deployed model is a model from model zoo:
mobilenet_v2_0.35_128_fft_int8.tflite
This model takes a float32 input in [-1,1] and contains the scale and offset:
The yaml files from the model zoo only tell the scripts how to edit the model zoo getting-started application to deploy it on the STM32, i.e.:
If you run the deployment yaml through stm32ai_main.py, you can find the edited and deployed application here: /application_code/image_classification/STM32N6/
Have a good day,
Julian
2025-09-22 9:04 AM
Thanks for the reply,
This doesn't make sense to me: the camera on the DevKit produces RGB values in the range 0-255, and I haven't seen any code in the example project that maps these values to [-1,1].
Any explanation?
2025-09-23 12:10 AM
Hello @pmiracle,
You're right, I misread.
The float32 [-1,1] tensor is the output (look at the image).
The input is uint8 [128,128,3], most likely RGB values in the range 0-255 as you pointed out.
Have a good day,
Julian
2025-09-23 5:50 AM
But MobileNet expects values [0,1], so what am I missing?
2025-09-23 6:18 AM
Hello @pmiracle,
There are two different cases depending on whether you’re talking about the float model or the quantized model.
Case 1: Float (FP32 / FP16) MobileNetV2
If you are using the non-quantized version, the comment "MobileNet expects values [0,1]" is correct: the preprocessing code has to normalize the raw pixels before inference.
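To illustrate, here is a rough sketch of the normalization a float model would need (NumPy, with a hypothetical random frame standing in for the camera buffer; the exact target range depends on the model variant):

```python
import numpy as np

# Hypothetical 128x128 RGB frame straight from the camera (uint8, 0-255).
frame = np.random.randint(0, 256, size=(128, 128, 3), dtype=np.uint8)

# Float MobileNetV2 expects normalized inputs; depending on the variant,
# either [0, 1] or [-1, 1]:
x_01 = frame.astype(np.float32) / 255.0          # maps 0-255 to [0, 1]
x_pm1 = frame.astype(np.float32) / 127.5 - 1.0   # maps 0-255 to [-1, 1]

print(x_01.min(), x_01.max())
print(x_pm1.min(), x_pm1.max())
```

This is the step you were looking for in the example code, and it is exactly the step the quantized model makes unnecessary.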
Case 2: Quantized (INT8 / UINT8) MobileNetV2
The quantized model stores a scale and zero-point for its input tensor, and the runtime applies float = scale * (q - zero_point) internally. With a scale of roughly 2/255 and a zero-point of 127, the raw pixels map as:
q = 0 → float ≈ -1
q = 127 → float = 0
q = 255 → float ≈ +1
So for quantized models you do not normalize to [0,1] yourself: you feed raw 8-bit pixels, and the model's quantization parameters handle the mapping.
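A minimal sketch of that mapping (the scale and zero-point values below are assumptions for illustration; the real ones are read from the .tflite input tensor's quantization parameters):

```python
import numpy as np

# Assumed quantization parameters for a uint8 input tensor covering [-1, 1];
# in practice, read them from the model's input tensor metadata.
scale = 2.0 / 255.0   # ~0.00784
zero_point = 127

# Three representative raw pixel values.
q = np.array([0, 127, 255], dtype=np.uint8)

# Dequantization: float = scale * (q - zero_point)
f = scale * (q.astype(np.int32) - zero_point)

print(f)  # approximately -1.0, 0.0, +1.0
```

The runtime does this arithmetic for you, which is why the example application can pass camera bytes straight into the quantized network.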
Have a good day,
Julian