2019-08-06 07:22 AM
Hello,
I have an STM32F746G-DISCO, which has a microcontroller with 1 Mbyte of Flash memory and 340 Kbytes of RAM. However, I need more space for my larger Keras/TensorFlow models.
I have noticed that this Discovery board also comes with 128 Mbits of Quad-SPI Flash memory as well as 64 Mbits of SDRAM. Would it be possible to use the QSPI Flash memory to store the weights, and the SDRAM as an extension of the microcontroller's internal RAM? I have seen code examples in the STM32Cube F7 firmware package that make use of the QSPI Flash memory and the SDRAM, but I am not sure whether those examples could be integrated with the X-CUBE-AI generated code in the way I described, and what the implications of that would be.
I know it is already possible to quantize the Keras weights, but I would still be interested in using that extended memory if possible.
Thank you and kind regards
2019-08-06 08:11 AM
Hello,
The new function pack FP-AI-VISION1 provides a computer vision AI example of food classification. It also provides different memory configurations: internal vs. external Flash, and internal RAM vs. external SDRAM. The function pack is based on the STM32H747 Discovery board, but it can be used as a generic guideline on how to use external memories.
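As a rough sketch of how this can look on your F746G-DISCO (this is not code from the function pack): the idea is to give the weights and the activation buffer their own linker sections mapped to the QSPI Flash (0x90000000) and the SDRAM (0xC0000000), and to initialize both memories with the Discovery BSP before the network is created. The section names, array names and sizes below are assumptions to adapt to your own linker script and X-CUBE-AI generated files; the BSP calls are the ones provided by the STM32Cube F7 Discovery BSP.

#include <stdint.h>
#include "stm32746g_discovery_qspi.h"   /* BSP_QSPI_Init, BSP_QSPI_EnableMemoryMappedMode */
#include "stm32746g_discovery_sdram.h"  /* BSP_SDRAM_Init */

/* Hypothetical stand-in for the weights table that X-CUBE-AI generates in
 * network_data.c; in a real project that table would be re-linked into this
 * section and programmed with an external QSPI flash loader. */
__attribute__((section(".qspi_weights")))
const uint8_t network_weights[2u * 1024u * 1024u] = { 0u };

/* Hypothetical activation buffer moved to external SDRAM; the .sdram_data
 * section should be marked NOLOAD in the linker script, since the SDRAM is
 * only usable after BSP_SDRAM_Init() has run. */
__attribute__((section(".sdram_data")))
static uint8_t network_activations[1u * 1024u * 1024u];

void ExternalMemories_Init(void)
{
  /* Configure the FMC so the external SDRAM is accessible at 0xC0000000. */
  BSP_SDRAM_Init();

  /* Put the QSPI Flash in memory-mapped mode so the weights can be read
   * directly at 0x90000000, just like internal Flash. */
  BSP_QSPI_Init();
  BSP_QSPI_EnableMemoryMappedMode();
}

Note that the linker script needs matching QSPI and SDRAM memory regions, the .qspi_weights content has to be flashed with an external QSPI loader, and the activation buffer is then handed to the X-CUBE-AI runtime in the usual way.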
You can find the function pack here:
Best Regards,
Matthieu
2019-08-06 08:47 AM
Thank you very much!