Transformer models inference on STM32MP257F using its NPU

ramkumarkoppu
Senior

Is the NPU of the STM32MP257F capable of running inference for Transformer models, such as small VLMs, on this device? Which frameworks are supported, e.g. ExecuTorch or ONNX Runtime?

1 REPLY
Pwxn
ST Employee

Hello @ramkumarkoppu,

 

I recommend you have a look at the X-LINUX-AI expansion package.

On MPU, ST provides the STAI_mpu API, which unifies several frameworks: ONNX, TFLite™, and Network Binary Graph (NBG).
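
For reference, here is a minimal sketch of what inference through that unified API can look like in Python. It assumes a model already converted to NBG (the file name model.nb and the 1x224x224x3 input shape are just placeholders for this example); please check the exact class and method names against the X-LINUX-AI wiki examples.

```python
import numpy as np
from stai_mpu import stai_mpu_network  # X-LINUX-AI unified inference API

# Load the model; hardware acceleration targets the NPU where possible.
# The same call also accepts .tflite or .onnx files.
model = stai_mpu_network(model_path="model.nb", use_hw_acceleration=True)

# Dummy input tensor (shape and dtype are assumptions for this sketch).
input_data = np.zeros((1, 224, 224, 3), dtype=np.uint8)

model.set_input(0, input_data)
model.run()
output = model.get_output(0)
print(output.shape)
```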

In the wiki you will find various references, from implementation examples to benchmarks:

https://wiki.st.com/stm32mpu/wiki/Category:X-LINUX-AI_expansion_package

 

Regards,