
Using a CLUT with LTDC to reduce framebuffer size

ikrosoft
Associate II

Hello,

I'm in the process of choosing an STM32 and determining whether an external SRAM/SDRAM is really needed for my application or not.

The microcontroller will receive frames of pixels from a camera over SPI in L8 format (i.e. 8-bit grayscale images), and its main task will be to stream them out in RGB888 via the LTDC peripheral to another chip. A full RGB888 frame (24 bpp) won't fit in the RAM of any STM32 because I'm working at 800x600 resolution. But a note at the very end of section 4.2.1 of AN4861 got me thinking about the following idea: can I store the incoming 8 bpp frame in internal memory (800 x 600 x 1 B = 480 kB, so it fits if I have 512 kB or even 1 MB of RAM) and then have the CLUT generate the 24 bpp frame without ever storing it, i.e. with the LTDC outputting it directly as it is created?
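
Just to lay the size arithmetic out explicitly (a plain C sketch, nothing STM32-specific):

#include <stdio.h>

/* Frame sizes at 800x600: L8 is 1 byte/pixel, RGB888 is 3 bytes/pixel. */
int main(void)
{
    const unsigned long w = 800, h = 600;
    printf("L8 frame:     %lu bytes\n", w * h);        /* 480 000 B, ~469 KiB  */
    printf("RGB888 frame: %lu bytes\n", w * h * 3);    /* 1 440 000 B, ~1.4 MB */
    return 0;
}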

The STM32F7xx reference manual (for example) seems to say this is not possible: the CLUT I was looking at is part of the DMA2D and is applied during a memory-to-memory transfer (if I get it right), so I would need two framebuffers, one for the 8 bpp frame and one for the 24 bpp frame, and that won't fit in internal RAM.
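
For reference, this is the DMA2D path I mean, as I understand the Cube HAL (a hedged sketch only; clock enable, the CLUT load via HAL_DMA2D_CLUTLoad() and error handling are omitted, and the buffer names are placeholders):

#include "stm32f7xx_hal.h"   /* adjust the header to your device family */

extern uint8_t frame_l8[800 * 600];         /* 480 kB L8 source (placeholder name)    */
extern uint8_t frame_rgb[800 * 600 * 3];    /* 1.44 MB RGB888 destination - the issue */

static DMA2D_HandleTypeDef hdma2d;

void convert_l8_to_rgb888(void)
{
    /* DMA2D memory-to-memory transfer with pixel format conversion: the L8
     * source is expanded through the DMA2D foreground CLUT into a second,
     * full-size RGB888 buffer - hence the two framebuffers.                 */
    hdma2d.Instance          = DMA2D;
    hdma2d.Init.Mode         = DMA2D_M2M_PFC;        /* memory-to-memory + PFC */
    hdma2d.Init.ColorMode    = DMA2D_OUTPUT_RGB888;  /* output pixel format    */
    hdma2d.Init.OutputOffset = 0;
    hdma2d.LayerCfg[1].InputColorMode = DMA2D_INPUT_L8;
    hdma2d.LayerCfg[1].InputOffset    = 0;
    HAL_DMA2D_Init(&hdma2d);
    HAL_DMA2D_ConfigLayer(&hdma2d, 1);

    HAL_DMA2D_Start(&hdma2d, (uint32_t)frame_l8, (uint32_t)frame_rgb, 800, 600);
    HAL_DMA2D_PollForTransfer(&hdma2d, 100);
}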

But then I don't understand the note in AN4861 mentioned above: how could using a CLUT decrease the required framebuffer size?

Thank you for your help

3 REPLIES

The LTDC itself contains a CLUT: you can set a layer to take L8 data as input and output RGB.
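
Roughly, with the Cube HAL, that could look like the following (a minimal sketch only, assuming a simple grayscale ramp CLUT; the hltdc handle, LTDC timings and clocks are assumed to be set up elsewhere, and the buffer name is a placeholder):

#include "stm32f7xx_hal.h"   /* adjust the header to your device family */

extern LTDC_HandleTypeDef hltdc;        /* assumed initialized elsewhere           */
extern uint8_t frame_l8[800 * 600];     /* the single L8 framebuffer (placeholder) */

static uint32_t clut[256];

void ltdc_l8_clut_setup(void)
{
    /* Grayscale identity CLUT: index i maps to RGB(i, i, i). */
    for (uint32_t i = 0; i < 256; i++)
        clut[i] = (i << 16) | (i << 8) | i;

    LTDC_LayerCfgTypeDef layer = {0};
    layer.WindowX0        = 0;
    layer.WindowX1        = 800;
    layer.WindowY0        = 0;
    layer.WindowY1        = 600;
    layer.PixelFormat     = LTDC_PIXEL_FORMAT_L8;   /* 1 byte per pixel in RAM */
    layer.FBStartAdress   = (uint32_t)frame_l8;     /* HAL spells it "Adress"  */
    layer.ImageWidth      = 800;
    layer.ImageHeight     = 600;
    layer.Alpha           = 255;
    layer.BlendingFactor1 = LTDC_BLENDING_FACTOR1_CA;
    layer.BlendingFactor2 = LTDC_BLENDING_FACTOR2_CA;
    HAL_LTDC_ConfigLayer(&hltdc, &layer, LTDC_LAYER_1);

    /* Load the per-layer CLUT and enable it: the LTDC expands each L8 index
     * to RGB on the fly as it scans the buffer out.                          */
    HAL_LTDC_ConfigCLUT(&hltdc, clut, 256, LTDC_LAYER_1);
    HAL_LTDC_EnableCLUT(&hltdc, LTDC_LAYER_1);
}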

If you synced the LTDC exactly (or closely enough) to the camera's input, you wouldn't need a full frame's worth of buffer memory; a fraction of it would suffice. If this is the only task the MCU has to do, if that sync is doable (especially if the camera's data clock is driven from the MCU), and if the LTDC clock is not excessive, I'd say this could be pulled off with an 'F42x.

JW

ikrosoft
Associate II

Thank you @Community member for your answer. I hadn't seen that the LTDC itself contains a CLUT, but that's exactly what I need.

Sadly I can't sync the LTDC to the camera input, because I'll receive frames at a much lower fps than what I'll output. I do need to store a frame so the LTDC can send it out multiple times to satisfy the output frame rate.

Now this makes me think of another issue: I would like to avoid tearing on the output video, which calls for double buffering the frames. This can be done using the two layers of the LTDC (writing to one while the other is streamed out, and vice versa), but it doubles the amount of RAM I need, i.e. 2 x 480 kB.
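
For what it's worth, the way I would expect the flip to be done with the HAL is on a single layer, by only latching the new framebuffer address during vertical blanking, as in this hedged sketch (hltdc assumed initialized, buffer names are placeholders; the memory cost is the same as with two layers, still 2 x 480 kB):

#include "stm32h7xx_hal.h"   /* adjust the header to your device family */

extern LTDC_HandleTypeDef hltdc;
extern uint8_t frame_a[800 * 600];      /* L8 buffer currently scanned out   */
extern uint8_t frame_b[800 * 600];      /* L8 buffer being filled over SPI   */

static uint8_t *draw_buf = frame_b;     /* buffer the next camera frame goes into */

void present_frame(void)
{
    /* Show the buffer that was just filled and start filling the other one.
     * The address is written to the LTDC shadow registers only; the reload
     * happens during vertical blanking, so the scan-out never switches
     * buffers mid-frame (no tearing).                                       */
    uint8_t *show = draw_buf;
    draw_buf = (show == frame_a) ? frame_b : frame_a;

    HAL_LTDC_SetAddress_NoReload(&hltdc, (uint32_t)show, LTDC_LAYER_1);
    HAL_LTDC_Reload(&hltdc, LTDC_RELOAD_VERTICAL_BLANKING);
}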

The STM32s with the most RAM have 1 MB, like the STM32H743 for instance. So in theory it would fit, but each buffer has to sit in contiguous memory I think, and if I get it right this STM32 only has one RAM block that is big enough, the 512 kB AXI SRAM.
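
To make the numbers concrete (RAM sizes taken from the STM32H743 documentation; the linker section name below is only an assumption about a custom linker script), a fit-check sketch:

#include <stdint.h>

#define FRAME_BYTES (800u * 600u)    /* one L8 frame: 480 000 B (~469 KiB) */

/* D1-domain AXI SRAM: 512 KiB at 0x24000000 - one frame fits.             */
_Static_assert(FRAME_BYTES <= 512u * 1024u, "L8 frame must fit in AXI SRAM");

/* D2-domain SRAM1+SRAM2+SRAM3: 128+128+32 = 288 KiB combined, so there is
 * no second on-chip region large enough for another contiguous 480 kB buffer. */

/* ".axisram" is a placeholder section name that a custom linker script
 * would have to map to 0x24000000.                                        */
__attribute__((section(".axisram"), aligned(32)))
uint8_t frame_a[FRAME_BYTES];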

Is this correct? Am I stuck with adding an external SRAM or else giving up on double buffering?

Thanks

Pretty sure that in the cost trade-off between an external memory and an H7 device, the H753 loses. The H750 is cheaper, but has a smaller amount of tested flash, so you'd need a QSPI memory.
